May 27 03:19:00.863108 kernel: Linux version 6.12.30-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue May 27 01:09:43 -00 2025
May 27 03:19:00.863136 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=f6c186658a19d5a08471ef76df75f82494b37b46908f9237b2c3cf497da860c6
May 27 03:19:00.863148 kernel: BIOS-provided physical RAM map:
May 27 03:19:00.863155 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000002ffff] usable
May 27 03:19:00.863161 kernel: BIOS-e820: [mem 0x0000000000030000-0x000000000004ffff] reserved
May 27 03:19:00.863168 kernel: BIOS-e820: [mem 0x0000000000050000-0x000000000009efff] usable
May 27 03:19:00.863175 kernel: BIOS-e820: [mem 0x000000000009f000-0x000000000009ffff] reserved
May 27 03:19:00.863182 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000009b8ecfff] usable
May 27 03:19:00.863192 kernel: BIOS-e820: [mem 0x000000009b8ed000-0x000000009bb6cfff] reserved
May 27 03:19:00.863198 kernel: BIOS-e820: [mem 0x000000009bb6d000-0x000000009bb7efff] ACPI data
May 27 03:19:00.863205 kernel: BIOS-e820: [mem 0x000000009bb7f000-0x000000009bbfefff] ACPI NVS
May 27 03:19:00.863214 kernel: BIOS-e820: [mem 0x000000009bbff000-0x000000009bfb0fff] usable
May 27 03:19:00.863220 kernel: BIOS-e820: [mem 0x000000009bfb1000-0x000000009bfb4fff] reserved
May 27 03:19:00.863227 kernel: BIOS-e820: [mem 0x000000009bfb5000-0x000000009bfb6fff] ACPI NVS
May 27 03:19:00.863235 kernel: BIOS-e820: [mem 0x000000009bfb7000-0x000000009bffffff] usable
May 27 03:19:00.863242 kernel: BIOS-e820: [mem 0x000000009c000000-0x000000009cffffff] reserved
May 27 03:19:00.863254 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
May 27 03:19:00.863261 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
May 27 03:19:00.863268 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
May 27 03:19:00.863275 kernel: NX (Execute Disable) protection: active
May 27 03:19:00.863282 kernel: APIC: Static calls initialized
May 27 03:19:00.863289 kernel: e820: update [mem 0x9a13f018-0x9a148c57] usable ==> usable
May 27 03:19:00.863297 kernel: e820: update [mem 0x9a102018-0x9a13ee57] usable ==> usable
May 27 03:19:00.863304 kernel: extended physical RAM map:
May 27 03:19:00.863311 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000002ffff] usable
May 27 03:19:00.863318 kernel: reserve setup_data: [mem 0x0000000000030000-0x000000000004ffff] reserved
May 27 03:19:00.863325 kernel: reserve setup_data: [mem 0x0000000000050000-0x000000000009efff] usable
May 27 03:19:00.863334 kernel: reserve setup_data: [mem 0x000000000009f000-0x000000000009ffff] reserved
May 27 03:19:00.863342 kernel: reserve setup_data: [mem 0x0000000000100000-0x000000009a102017] usable
May 27 03:19:00.863349 kernel: reserve setup_data: [mem 0x000000009a102018-0x000000009a13ee57] usable
May 27 03:19:00.863356 kernel: reserve setup_data: [mem 0x000000009a13ee58-0x000000009a13f017] usable
May 27 03:19:00.863363 kernel: reserve setup_data: [mem 0x000000009a13f018-0x000000009a148c57] usable
May 27 03:19:00.863370 kernel: reserve setup_data: [mem 0x000000009a148c58-0x000000009b8ecfff] usable
May 27 03:19:00.863377 kernel: reserve setup_data: [mem 0x000000009b8ed000-0x000000009bb6cfff] reserved
May 27 03:19:00.863384 kernel: reserve setup_data: [mem 0x000000009bb6d000-0x000000009bb7efff] ACPI data
May 27 03:19:00.863391 kernel: reserve setup_data: [mem 0x000000009bb7f000-0x000000009bbfefff] ACPI NVS
May 27 03:19:00.863398 kernel: reserve setup_data: [mem 0x000000009bbff000-0x000000009bfb0fff] usable
May 27 03:19:00.863405 kernel: reserve setup_data: [mem 0x000000009bfb1000-0x000000009bfb4fff] reserved
May 27 03:19:00.863414 kernel: reserve setup_data: [mem 0x000000009bfb5000-0x000000009bfb6fff] ACPI NVS
May 27 03:19:00.863421 kernel: reserve setup_data: [mem 0x000000009bfb7000-0x000000009bffffff] usable
May 27 03:19:00.863432 kernel: reserve setup_data: [mem 0x000000009c000000-0x000000009cffffff] reserved
May 27 03:19:00.863439 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
May 27 03:19:00.863446 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
May 27 03:19:00.863454 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
May 27 03:19:00.863463 kernel: efi: EFI v2.7 by EDK II
May 27 03:19:00.863470 kernel: efi: SMBIOS=0x9b9d5000 ACPI=0x9bb7e000 ACPI 2.0=0x9bb7e014 MEMATTR=0x9a1af018 RNG=0x9bb73018
May 27 03:19:00.863478 kernel: random: crng init done
May 27 03:19:00.863485 kernel: Kernel is locked down from EFI Secure Boot; see man kernel_lockdown.7
May 27 03:19:00.863492 kernel: secureboot: Secure boot enabled
May 27 03:19:00.863499 kernel: SMBIOS 2.8 present.
May 27 03:19:00.863507 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
May 27 03:19:00.863514 kernel: DMI: Memory slots populated: 1/1
May 27 03:19:00.863521 kernel: Hypervisor detected: KVM
May 27 03:19:00.863529 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
May 27 03:19:00.863536 kernel: kvm-clock: using sched offset of 6314672832 cycles
May 27 03:19:00.863546 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
May 27 03:19:00.863554 kernel: tsc: Detected 2794.748 MHz processor
May 27 03:19:00.863561 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
May 27 03:19:00.863569 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
May 27 03:19:00.863576 kernel: last_pfn = 0x9c000 max_arch_pfn = 0x400000000
May 27 03:19:00.863584 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
May 27 03:19:00.863596 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
May 27 03:19:00.863603 kernel: Using GB pages for direct mapping
May 27 03:19:00.863613 kernel: ACPI: Early table checksum verification disabled
May 27 03:19:00.863622 kernel: ACPI: RSDP 0x000000009BB7E014 000024 (v02 BOCHS )
May 27 03:19:00.863630 kernel: ACPI: XSDT 0x000000009BB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
May 27 03:19:00.863638 kernel: ACPI: FACP 0x000000009BB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
May 27 03:19:00.863645 kernel: ACPI: DSDT 0x000000009BB7A000 002237 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 03:19:00.863652 kernel: ACPI: FACS 0x000000009BBDD000 000040
May 27 03:19:00.863660 kernel: ACPI: APIC 0x000000009BB78000 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 03:19:00.863667 kernel: ACPI: HPET 0x000000009BB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 03:19:00.863675 kernel: ACPI: MCFG 0x000000009BB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 03:19:00.863682 kernel: ACPI: WAET 0x000000009BB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 03:19:00.863692 kernel: ACPI: BGRT 0x000000009BB74000 000038 (v01 INTEL EDK2 00000002 01000013)
May 27 03:19:00.863700 kernel: ACPI: Reserving FACP table memory at [mem 0x9bb79000-0x9bb790f3]
May 27 03:19:00.863707 kernel: ACPI: Reserving DSDT table memory at [mem 0x9bb7a000-0x9bb7c236]
May 27 03:19:00.863714 kernel: ACPI: Reserving FACS table memory at [mem 0x9bbdd000-0x9bbdd03f]
May 27 03:19:00.863722 kernel: ACPI: Reserving APIC table memory at [mem 0x9bb78000-0x9bb7808f]
May 27 03:19:00.863729 kernel: ACPI: Reserving HPET table memory at [mem 0x9bb77000-0x9bb77037]
May 27 03:19:00.863736 kernel: ACPI: Reserving MCFG table memory at [mem 0x9bb76000-0x9bb7603b]
May 27 03:19:00.863744 kernel: ACPI: Reserving WAET table memory at [mem 0x9bb75000-0x9bb75027]
May 27 03:19:00.863751 kernel: ACPI: Reserving BGRT table memory at [mem 0x9bb74000-0x9bb74037]
May 27 03:19:00.863761 kernel: No NUMA configuration found
May 27 03:19:00.863768 kernel: Faking a node at [mem 0x0000000000000000-0x000000009bffffff]
May 27 03:19:00.863776 kernel: NODE_DATA(0) allocated [mem 0x9bf57dc0-0x9bf5efff]
May 27 03:19:00.863783 kernel: Zone ranges:
May 27 03:19:00.863791 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
May 27 03:19:00.863799 kernel: DMA32 [mem 0x0000000001000000-0x000000009bffffff]
May 27 03:19:00.863806 kernel: Normal empty
May 27 03:19:00.863813 kernel: Device empty
May 27 03:19:00.863821 kernel: Movable zone start for each node
May 27 03:19:00.863830 kernel: Early memory node ranges
May 27 03:19:00.863838 kernel: node 0: [mem 0x0000000000001000-0x000000000002ffff]
May 27 03:19:00.863845 kernel: node 0: [mem 0x0000000000050000-0x000000000009efff]
May 27 03:19:00.863853 kernel: node 0: [mem 0x0000000000100000-0x000000009b8ecfff]
May 27 03:19:00.863860 kernel: node 0: [mem 0x000000009bbff000-0x000000009bfb0fff]
May 27 03:19:00.863867 kernel: node 0: [mem 0x000000009bfb7000-0x000000009bffffff]
May 27 03:19:00.863875 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009bffffff]
May 27 03:19:00.863882 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
May 27 03:19:00.863890 kernel: On node 0, zone DMA: 32 pages in unavailable ranges
May 27 03:19:00.863897 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
May 27 03:19:00.863907 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
May 27 03:19:00.863921 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
May 27 03:19:00.863929 kernel: On node 0, zone DMA32: 16384 pages in unavailable ranges
May 27 03:19:00.863936 kernel: ACPI: PM-Timer IO Port: 0x608
May 27 03:19:00.863943 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
May 27 03:19:00.863964 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
May 27 03:19:00.863972 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
May 27 03:19:00.863979 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
May 27 03:19:00.863989 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
May 27 03:19:00.864000 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
May 27 03:19:00.864008 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
May 27 03:19:00.864015 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
May 27 03:19:00.864023 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
May 27 03:19:00.864030 kernel: TSC deadline timer available
May 27 03:19:00.864037 kernel: CPU topo: Max. logical packages: 1
May 27 03:19:00.864045 kernel: CPU topo: Max. logical dies: 1
May 27 03:19:00.864052 kernel: CPU topo: Max. dies per package: 1
May 27 03:19:00.864068 kernel: CPU topo: Max. threads per core: 1
May 27 03:19:00.864076 kernel: CPU topo: Num. cores per package: 4
May 27 03:19:00.864084 kernel: CPU topo: Num. threads per package: 4
May 27 03:19:00.864091 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
May 27 03:19:00.864103 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
May 27 03:19:00.864111 kernel: kvm-guest: KVM setup pv remote TLB flush
May 27 03:19:00.864119 kernel: kvm-guest: setup PV sched yield
May 27 03:19:00.864127 kernel: [mem 0x9d000000-0xdfffffff] available for PCI devices
May 27 03:19:00.864134 kernel: Booting paravirtualized kernel on KVM
May 27 03:19:00.864144 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
May 27 03:19:00.864152 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
May 27 03:19:00.864160 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288
May 27 03:19:00.864168 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152
May 27 03:19:00.864175 kernel: pcpu-alloc: [0] 0 1 2 3
May 27 03:19:00.864183 kernel: kvm-guest: PV spinlocks enabled
May 27 03:19:00.864191 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
May 27 03:19:00.864200 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=f6c186658a19d5a08471ef76df75f82494b37b46908f9237b2c3cf497da860c6
May 27 03:19:00.864210 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 27 03:19:00.864218 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
May 27 03:19:00.864226 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 27 03:19:00.864233 kernel: Fallback order for Node 0: 0
May 27 03:19:00.864241 kernel: Built 1 zonelists, mobility grouping on. Total pages: 638054
May 27 03:19:00.864249 kernel: Policy zone: DMA32
May 27 03:19:00.864257 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 27 03:19:00.864264 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
May 27 03:19:00.864272 kernel: ftrace: allocating 40081 entries in 157 pages
May 27 03:19:00.864282 kernel: ftrace: allocated 157 pages with 5 groups
May 27 03:19:00.864290 kernel: Dynamic Preempt: voluntary
May 27 03:19:00.864297 kernel: rcu: Preemptible hierarchical RCU implementation.
May 27 03:19:00.864306 kernel: rcu: RCU event tracing is enabled.
May 27 03:19:00.864314 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
May 27 03:19:00.864322 kernel: Trampoline variant of Tasks RCU enabled.
May 27 03:19:00.864329 kernel: Rude variant of Tasks RCU enabled.
May 27 03:19:00.864337 kernel: Tracing variant of Tasks RCU enabled.
May 27 03:19:00.864345 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 27 03:19:00.864355 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
May 27 03:19:00.864363 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 27 03:19:00.864371 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 27 03:19:00.864381 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 27 03:19:00.864389 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
May 27 03:19:00.864397 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
May 27 03:19:00.864405 kernel: Console: colour dummy device 80x25
May 27 03:19:00.864413 kernel: printk: legacy console [ttyS0] enabled
May 27 03:19:00.864420 kernel: ACPI: Core revision 20240827
May 27 03:19:00.864430 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
May 27 03:19:00.864438 kernel: APIC: Switch to symmetric I/O mode setup
May 27 03:19:00.864446 kernel: x2apic enabled
May 27 03:19:00.864454 kernel: APIC: Switched APIC routing to: physical x2apic
May 27 03:19:00.864462 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
May 27 03:19:00.864470 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
May 27 03:19:00.864477 kernel: kvm-guest: setup PV IPIs
May 27 03:19:00.864485 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
May 27 03:19:00.864493 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns
May 27 03:19:00.864503 kernel: Calibrating delay loop (skipped) preset value.. 5589.49 BogoMIPS (lpj=2794748)
May 27 03:19:00.864511 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
May 27 03:19:00.864519 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
May 27 03:19:00.864526 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
May 27 03:19:00.864537 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
May 27 03:19:00.864544 kernel: Spectre V2 : Mitigation: Retpolines
May 27 03:19:00.864552 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
May 27 03:19:00.864560 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
May 27 03:19:00.864570 kernel: RETBleed: Mitigation: untrained return thunk
May 27 03:19:00.864578 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
May 27 03:19:00.864586 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
May 27 03:19:00.864594 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
May 27 03:19:00.864602 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
May 27 03:19:00.864610 kernel: x86/bugs: return thunk changed
May 27 03:19:00.864617 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
May 27 03:19:00.864625 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
May 27 03:19:00.864633 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
May 27 03:19:00.864643 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
May 27 03:19:00.864651 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
May 27 03:19:00.864659 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
May 27 03:19:00.864666 kernel: Freeing SMP alternatives memory: 32K
May 27 03:19:00.864674 kernel: pid_max: default: 32768 minimum: 301
May 27 03:19:00.864682 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
May 27 03:19:00.864689 kernel: landlock: Up and running.
May 27 03:19:00.864697 kernel: SELinux: Initializing.
May 27 03:19:00.864705 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 27 03:19:00.864715 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 27 03:19:00.864723 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0)
May 27 03:19:00.864731 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
May 27 03:19:00.864738 kernel: ... version: 0
May 27 03:19:00.864746 kernel: ... bit width: 48
May 27 03:19:00.864756 kernel: ... generic registers: 6
May 27 03:19:00.864764 kernel: ... value mask: 0000ffffffffffff
May 27 03:19:00.864772 kernel: ... max period: 00007fffffffffff
May 27 03:19:00.864779 kernel: ... fixed-purpose events: 0
May 27 03:19:00.864789 kernel: ... event mask: 000000000000003f
May 27 03:19:00.864797 kernel: signal: max sigframe size: 1776
May 27 03:19:00.864805 kernel: rcu: Hierarchical SRCU implementation.
May 27 03:19:00.864813 kernel: rcu: Max phase no-delay instances is 400.
May 27 03:19:00.864820 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
May 27 03:19:00.864828 kernel: smp: Bringing up secondary CPUs ...
May 27 03:19:00.864836 kernel: smpboot: x86: Booting SMP configuration:
May 27 03:19:00.864843 kernel: .... node #0, CPUs: #1 #2 #3
May 27 03:19:00.864851 kernel: smp: Brought up 1 node, 4 CPUs
May 27 03:19:00.864861 kernel: smpboot: Total of 4 processors activated (22357.98 BogoMIPS)
May 27 03:19:00.864869 kernel: Memory: 2409212K/2552216K available (14336K kernel code, 2430K rwdata, 9952K rodata, 54416K init, 2552K bss, 137068K reserved, 0K cma-reserved)
May 27 03:19:00.864877 kernel: devtmpfs: initialized
May 27 03:19:00.864885 kernel: x86/mm: Memory block size: 128MB
May 27 03:19:00.864892 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9bb7f000-0x9bbfefff] (524288 bytes)
May 27 03:19:00.864900 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9bfb5000-0x9bfb6fff] (8192 bytes)
May 27 03:19:00.864908 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
May 27 03:19:00.864923 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
May 27 03:19:00.864930 kernel: pinctrl core: initialized pinctrl subsystem
May 27 03:19:00.864940 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
May 27 03:19:00.864960 kernel: audit: initializing netlink subsys (disabled)
May 27 03:19:00.864969 kernel: audit: type=2000 audit(1748315937.791:1): state=initialized audit_enabled=0 res=1
May 27 03:19:00.864977 kernel: thermal_sys: Registered thermal governor 'step_wise'
May 27 03:19:00.864984 kernel: thermal_sys: Registered thermal governor 'user_space'
May 27 03:19:00.864992 kernel: cpuidle: using governor menu
May 27 03:19:00.865000 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
May 27 03:19:00.865007 kernel: dca service started, version 1.12.1
May 27 03:19:00.865015 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
May 27 03:19:00.865028 kernel: PCI: Using configuration type 1 for base access
May 27 03:19:00.865036 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
May 27 03:19:00.865044 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
May 27 03:19:00.865052 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
May 27 03:19:00.865059 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
May 27 03:19:00.865067 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
May 27 03:19:00.865075 kernel: ACPI: Added _OSI(Module Device)
May 27 03:19:00.865083 kernel: ACPI: Added _OSI(Processor Device)
May 27 03:19:00.865090 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 27 03:19:00.865101 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 27 03:19:00.865109 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
May 27 03:19:00.865116 kernel: ACPI: Interpreter enabled
May 27 03:19:00.865124 kernel: ACPI: PM: (supports S0 S5)
May 27 03:19:00.865132 kernel: ACPI: Using IOAPIC for interrupt routing
May 27 03:19:00.865140 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
May 27 03:19:00.865147 kernel: PCI: Using E820 reservations for host bridge windows
May 27 03:19:00.865155 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
May 27 03:19:00.865163 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
May 27 03:19:00.865371 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
May 27 03:19:00.865499 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
May 27 03:19:00.865619 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
May 27 03:19:00.865629 kernel: PCI host bridge to bus 0000:00
May 27 03:19:00.865802 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
May 27 03:19:00.865997 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
May 27 03:19:00.866124 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
May 27 03:19:00.866234 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xdfffffff window]
May 27 03:19:00.866344 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
May 27 03:19:00.866457 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x3807ffffffff window]
May 27 03:19:00.866567 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
May 27 03:19:00.866731 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
May 27 03:19:00.866875 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint
May 27 03:19:00.867073 kernel: pci 0000:00:01.0: BAR 0 [mem 0xc0000000-0xc0ffffff pref]
May 27 03:19:00.867246 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc1044000-0xc1044fff]
May 27 03:19:00.867405 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
May 27 03:19:00.867530 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
May 27 03:19:00.867672 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
May 27 03:19:00.867797 kernel: pci 0000:00:02.0: BAR 0 [io 0x6100-0x611f]
May 27 03:19:00.867933 kernel: pci 0000:00:02.0: BAR 1 [mem 0xc1043000-0xc1043fff]
May 27 03:19:00.868076 kernel: pci 0000:00:02.0: BAR 4 [mem 0x380000000000-0x380000003fff 64bit pref]
May 27 03:19:00.868220 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
May 27 03:19:00.868343 kernel: pci 0000:00:03.0: BAR 0 [io 0x6000-0x607f]
May 27 03:19:00.868514 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc1042000-0xc1042fff]
May 27 03:19:00.868675 kernel: pci 0000:00:03.0: BAR 4 [mem 0x380000004000-0x380000007fff 64bit pref]
May 27 03:19:00.868816 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
May 27 03:19:00.868969 kernel: pci 0000:00:04.0: BAR 0 [io 0x60e0-0x60ff]
May 27 03:19:00.869095 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc1041000-0xc1041fff]
May 27 03:19:00.869220 kernel: pci 0000:00:04.0: BAR 4 [mem 0x380000008000-0x38000000bfff 64bit pref]
May 27 03:19:00.869340 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]
May 27 03:19:00.869522 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
May 27 03:19:00.869646 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
May 27 03:19:00.869842 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
May 27 03:19:00.870001 kernel: pci 0000:00:1f.2: BAR 4 [io 0x60c0-0x60df]
May 27 03:19:00.870122 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xc1040000-0xc1040fff]
May 27 03:19:00.870259 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
May 27 03:19:00.870381 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6080-0x60bf]
May 27 03:19:00.870392 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
May 27 03:19:00.870400 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
May 27 03:19:00.870408 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
May 27 03:19:00.870420 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
May 27 03:19:00.870428 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
May 27 03:19:00.870435 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
May 27 03:19:00.870443 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
May 27 03:19:00.870451 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
May 27 03:19:00.870459 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
May 27 03:19:00.870467 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
May 27 03:19:00.870474 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
May 27 03:19:00.870482 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
May 27 03:19:00.870492 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
May 27 03:19:00.870500 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
May 27 03:19:00.870508 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
May 27 03:19:00.870515 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
May 27 03:19:00.870523 kernel: iommu: Default domain type: Translated
May 27 03:19:00.870531 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
May 27 03:19:00.870539 kernel: efivars: Registered efivars operations
May 27 03:19:00.870546 kernel: PCI: Using ACPI for IRQ routing
May 27 03:19:00.870554 kernel: PCI: pci_cache_line_size set to 64 bytes
May 27 03:19:00.870564 kernel: e820: reserve RAM buffer [mem 0x0009f000-0x0009ffff]
May 27 03:19:00.870572 kernel: e820: reserve RAM buffer [mem 0x9a102018-0x9bffffff]
May 27 03:19:00.870579 kernel: e820: reserve RAM buffer [mem 0x9a13f018-0x9bffffff]
May 27 03:19:00.870587 kernel: e820: reserve RAM buffer [mem 0x9b8ed000-0x9bffffff]
May 27 03:19:00.870594 kernel: e820: reserve RAM buffer [mem 0x9bfb1000-0x9bffffff]
May 27 03:19:00.870715 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
May 27 03:19:00.870834 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
May 27 03:19:00.870998 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
May 27 03:19:00.871014 kernel: vgaarb: loaded
May 27 03:19:00.871022 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
May 27 03:19:00.871030 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
May 27 03:19:00.871038 kernel: clocksource: Switched to clocksource kvm-clock
May 27 03:19:00.871046 kernel: VFS: Disk quotas dquot_6.6.0
May 27 03:19:00.871054 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
May 27 03:19:00.871062 kernel: pnp: PnP ACPI init
May 27 03:19:00.871205 kernel: system 00:05: [mem 0xe0000000-0xefffffff window] has been reserved
May 27 03:19:00.871217 kernel: pnp: PnP ACPI: found 6 devices
May 27 03:19:00.871228 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
May 27 03:19:00.871236 kernel: NET: Registered PF_INET protocol family
May 27 03:19:00.871244 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
May 27 03:19:00.871252 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
May 27 03:19:00.871260 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
May 27 03:19:00.871268 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
May 27 03:19:00.871276 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
May 27 03:19:00.871283 kernel: TCP: Hash tables configured (established 32768 bind 32768)
May 27 03:19:00.871291 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 27 03:19:00.871301 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 27 03:19:00.871309 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
May 27 03:19:00.871316 kernel: NET: Registered PF_XDP protocol family
May 27 03:19:00.871441 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]: can't claim; no compatible bridge window
May 27 03:19:00.871565 kernel: pci 0000:00:04.0: ROM [mem 0x9d000000-0x9d03ffff pref]: assigned
May 27 03:19:00.871687 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
May 27 03:19:00.871800 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
May 27 03:19:00.871910 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
May 27 03:19:00.872054 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xdfffffff window]
May 27 03:19:00.872165 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window]
May 27 03:19:00.872274 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x3807ffffffff window]
May 27 03:19:00.872285 kernel: PCI: CLS 0 bytes, default 64
May 27 03:19:00.872293 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns
May 27 03:19:00.872301 kernel: Initialise system trusted keyrings
May 27 03:19:00.872309 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
May 27 03:19:00.872316 kernel: Key type asymmetric registered
May 27 03:19:00.872328 kernel: Asymmetric key parser 'x509' registered
May 27 03:19:00.872349 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
May 27 03:19:00.872359 kernel: io scheduler mq-deadline registered
May 27 03:19:00.872367 kernel: io scheduler kyber registered
May 27 03:19:00.872375 kernel: io scheduler bfq registered
May 27 03:19:00.872383 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
May 27 03:19:00.872392 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
May 27 03:19:00.872400 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
May 27 03:19:00.872408 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
May 27 03:19:00.872418 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
May 27 03:19:00.872426 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
May 27 03:19:00.872434 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
May 27 03:19:00.872442 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
May 27 03:19:00.872450 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
May 27 03:19:00.872586 kernel: rtc_cmos 00:04: RTC can wake from S4
May 27 03:19:00.872707 kernel: rtc_cmos 00:04: registered as rtc0
May 27 03:19:00.872821 kernel: rtc_cmos 00:04: setting system clock to 2025-05-27T03:19:00 UTC (1748315940)
May 27 03:19:00.872963 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
May 27 03:19:00.872987 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
May 27 03:19:00.872996 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
May 27 03:19:00.873005 kernel: efifb: probing for efifb
May 27 03:19:00.873013 kernel: efifb: framebuffer at 0xc0000000, using 4000k, total 4000k
May 27 03:19:00.873021 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1
May 27 03:19:00.873029 kernel: efifb: scrolling: redraw
May 27 03:19:00.873037 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
May 27 03:19:00.873046 kernel: Console: switching to colour frame buffer device 160x50
May 27 03:19:00.873057 kernel: fb0: EFI VGA frame buffer device
May 27 03:19:00.873066 kernel: pstore: Using crash dump compression: deflate
May 27 03:19:00.873076 kernel: pstore: Registered efi_pstore as persistent store backend
May 27 03:19:00.873084 kernel: NET: Registered PF_INET6 protocol family
May 27 03:19:00.873092 kernel: Segment Routing with IPv6
May 27 03:19:00.873100 kernel: In-situ OAM (IOAM) with IPv6
May 27 03:19:00.873110 kernel: NET: Registered PF_PACKET protocol family
May 27 03:19:00.873118 kernel: Key type dns_resolver registered
May 27 03:19:00.873126 kernel: IPI shorthand broadcast: enabled
May 27 03:19:00.873134 kernel: sched_clock: Marking stable (3511005308, 140422707)->(3716758576, -65330561)
May 27 03:19:00.873142 kernel: registered taskstats version 1
May 27 03:19:00.873151 kernel: Loading compiled-in X.509 certificates
May 27 03:19:00.873159 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.30-flatcar: ba9eddccb334a70147f3ddfe4fbde029feaa991d'
May 27 03:19:00.873167 kernel: Demotion targets for Node 0: null
May 27 03:19:00.873175 kernel: Key type .fscrypt registered
May 27 03:19:00.873185 kernel: Key type fscrypt-provisioning registered
May 27 03:19:00.873193 kernel: ima: No TPM chip found, activating TPM-bypass!
May 27 03:19:00.873201 kernel: ima: Allocated hash algorithm: sha1
May 27 03:19:00.873209 kernel: ima: No architecture policies found
May 27 03:19:00.873219 kernel: clk: Disabling unused clocks
May 27 03:19:00.873227 kernel: Warning: unable to open an initial console.
May 27 03:19:00.873236 kernel: Freeing unused kernel image (initmem) memory: 54416K May 27 03:19:00.873244 kernel: Write protecting the kernel read-only data: 24576k May 27 03:19:00.873254 kernel: Freeing unused kernel image (rodata/data gap) memory: 288K May 27 03:19:00.873262 kernel: Run /init as init process May 27 03:19:00.873270 kernel: with arguments: May 27 03:19:00.873278 kernel: /init May 27 03:19:00.873286 kernel: with environment: May 27 03:19:00.873294 kernel: HOME=/ May 27 03:19:00.873302 kernel: TERM=linux May 27 03:19:00.873310 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 27 03:19:00.873319 systemd[1]: Successfully made /usr/ read-only. May 27 03:19:00.873332 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 27 03:19:00.873341 systemd[1]: Detected virtualization kvm. May 27 03:19:00.873350 systemd[1]: Detected architecture x86-64. May 27 03:19:00.873358 systemd[1]: Running in initrd. May 27 03:19:00.873366 systemd[1]: No hostname configured, using default hostname. May 27 03:19:00.873375 systemd[1]: Hostname set to . May 27 03:19:00.873383 systemd[1]: Initializing machine ID from VM UUID. May 27 03:19:00.873394 systemd[1]: Queued start job for default target initrd.target. May 27 03:19:00.873403 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 03:19:00.873411 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 03:19:00.873420 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... 
May 27 03:19:00.873429 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 27 03:19:00.873438 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 27 03:19:00.873447 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 27 03:19:00.873459 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 27 03:19:00.873468 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 27 03:19:00.873476 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 03:19:00.873485 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 27 03:19:00.873494 systemd[1]: Reached target paths.target - Path Units. May 27 03:19:00.873502 systemd[1]: Reached target slices.target - Slice Units. May 27 03:19:00.873511 systemd[1]: Reached target swap.target - Swaps. May 27 03:19:00.873519 systemd[1]: Reached target timers.target - Timer Units. May 27 03:19:00.873530 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 27 03:19:00.873538 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 27 03:19:00.873547 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 27 03:19:00.873556 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 27 03:19:00.873564 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 27 03:19:00.873573 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 27 03:19:00.873581 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 27 03:19:00.873590 systemd[1]: Reached target sockets.target - Socket Units. 
May 27 03:19:00.873598 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 27 03:19:00.873609 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 27 03:19:00.873618 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 27 03:19:00.873627 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). May 27 03:19:00.873636 systemd[1]: Starting systemd-fsck-usr.service... May 27 03:19:00.873644 systemd[1]: Starting systemd-journald.service - Journal Service... May 27 03:19:00.873653 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 27 03:19:00.873661 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 03:19:00.873670 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 27 03:19:00.873681 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 27 03:19:00.873690 systemd[1]: Finished systemd-fsck-usr.service. May 27 03:19:00.873699 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 27 03:19:00.873730 systemd-journald[219]: Collecting audit messages is disabled. May 27 03:19:00.873753 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 03:19:00.873762 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 27 03:19:00.873771 systemd-journald[219]: Journal started May 27 03:19:00.873793 systemd-journald[219]: Runtime Journal (/run/log/journal/40158bd592cc4f049c3add3782d557cf) is 6M, max 48.2M, 42.2M free. May 27 03:19:00.867321 systemd-modules-load[222]: Inserted module 'overlay' May 27 03:19:00.878214 systemd[1]: Started systemd-journald.service - Journal Service. 
May 27 03:19:00.878290 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 03:19:00.885826 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 27 03:19:00.889162 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 27 03:19:00.898988 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 27 03:19:00.899474 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 03:19:00.902621 systemd-modules-load[222]: Inserted module 'br_netfilter' May 27 03:19:00.903271 kernel: Bridge firewalling registered May 27 03:19:00.902771 systemd-tmpfiles[242]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. May 27 03:19:00.903683 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 27 03:19:00.905434 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 27 03:19:00.908049 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 03:19:00.914911 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 27 03:19:00.917659 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 27 03:19:00.920396 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 27 03:19:00.930650 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
May 27 03:19:00.945268 dracut-cmdline[259]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=f6c186658a19d5a08471ef76df75f82494b37b46908f9237b2c3cf497da860c6 May 27 03:19:00.969461 systemd-resolved[262]: Positive Trust Anchors: May 27 03:19:00.969477 systemd-resolved[262]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 27 03:19:00.969511 systemd-resolved[262]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 27 03:19:00.972287 systemd-resolved[262]: Defaulting to hostname 'linux'. May 27 03:19:00.973552 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 27 03:19:00.979267 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 27 03:19:01.061982 kernel: SCSI subsystem initialized May 27 03:19:01.070976 kernel: Loading iSCSI transport class v2.0-870. May 27 03:19:01.081983 kernel: iscsi: registered transport (tcp) May 27 03:19:01.120986 kernel: iscsi: registered transport (qla4xxx) May 27 03:19:01.121042 kernel: QLogic iSCSI HBA Driver May 27 03:19:01.143281 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... 
May 27 03:19:01.176888 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 27 03:19:01.180537 systemd[1]: Reached target network-pre.target - Preparation for Network. May 27 03:19:01.237991 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 27 03:19:01.241497 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 27 03:19:01.296984 kernel: raid6: avx2x4 gen() 30342 MB/s May 27 03:19:01.313976 kernel: raid6: avx2x2 gen() 30726 MB/s May 27 03:19:01.331062 kernel: raid6: avx2x1 gen() 25809 MB/s May 27 03:19:01.331083 kernel: raid6: using algorithm avx2x2 gen() 30726 MB/s May 27 03:19:01.349073 kernel: raid6: .... xor() 19803 MB/s, rmw enabled May 27 03:19:01.349106 kernel: raid6: using avx2x2 recovery algorithm May 27 03:19:01.369981 kernel: xor: automatically using best checksumming function avx May 27 03:19:01.548999 kernel: Btrfs loaded, zoned=no, fsverity=no May 27 03:19:01.558452 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 27 03:19:01.562498 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 03:19:01.603964 systemd-udevd[473]: Using default interface naming scheme 'v255'. May 27 03:19:01.611050 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 03:19:01.615454 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 27 03:19:01.645094 dracut-pre-trigger[481]: rd.md=0: removing MD RAID activation May 27 03:19:01.676443 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 27 03:19:01.680229 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 27 03:19:01.760756 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 27 03:19:01.762515 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
May 27 03:19:01.799986 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues May 27 03:19:01.804021 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) May 27 03:19:01.817977 kernel: cryptd: max_cpu_qlen set to 1000 May 27 03:19:01.822982 kernel: libata version 3.00 loaded. May 27 03:19:01.831282 kernel: AES CTR mode by8 optimization enabled May 27 03:19:01.845231 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 27 03:19:01.845257 kernel: GPT:9289727 != 19775487 May 27 03:19:01.845268 kernel: GPT:Alternate GPT header not at the end of the disk. May 27 03:19:01.846302 kernel: GPT:9289727 != 19775487 May 27 03:19:01.846323 kernel: GPT: Use GNU Parted to correct GPT errors. May 27 03:19:01.847431 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 27 03:19:01.853082 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 May 27 03:19:01.859753 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 03:19:01.865745 kernel: ahci 0000:00:1f.2: version 3.0 May 27 03:19:01.865995 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 May 27 03:19:01.866010 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode May 27 03:19:01.859967 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 03:19:01.870004 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) May 27 03:19:01.870173 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only May 27 03:19:01.862693 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 27 03:19:01.866305 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 03:19:01.875598 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
May 27 03:19:01.879969 kernel: scsi host0: ahci May 27 03:19:01.885974 kernel: scsi host1: ahci May 27 03:19:01.889971 kernel: scsi host2: ahci May 27 03:19:01.892999 kernel: scsi host3: ahci May 27 03:19:01.894972 kernel: scsi host4: ahci May 27 03:19:01.902481 kernel: scsi host5: ahci May 27 03:19:01.902711 kernel: ata1: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040100 irq 34 lpm-pol 0 May 27 03:19:01.902729 kernel: ata2: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040180 irq 34 lpm-pol 0 May 27 03:19:01.902742 kernel: ata3: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040200 irq 34 lpm-pol 0 May 27 03:19:01.902759 kernel: ata4: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040280 irq 34 lpm-pol 0 May 27 03:19:01.902769 kernel: ata5: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040300 irq 34 lpm-pol 0 May 27 03:19:01.902445 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. May 27 03:19:01.905385 kernel: ata6: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040380 irq 34 lpm-pol 0 May 27 03:19:01.913977 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. May 27 03:19:01.914707 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 03:19:01.938587 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 27 03:19:01.945596 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. May 27 03:19:01.945871 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. May 27 03:19:01.947378 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 27 03:19:02.025756 disk-uuid[637]: Primary Header is updated. May 27 03:19:02.025756 disk-uuid[637]: Secondary Entries is updated. 
May 27 03:19:02.025756 disk-uuid[637]: Secondary Header is updated. May 27 03:19:02.030976 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 27 03:19:02.035993 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 27 03:19:02.210212 kernel: ata6: SATA link down (SStatus 0 SControl 300) May 27 03:19:02.210276 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) May 27 03:19:02.211072 kernel: ata1: SATA link down (SStatus 0 SControl 300) May 27 03:19:02.211159 kernel: ata5: SATA link down (SStatus 0 SControl 300) May 27 03:19:02.213004 kernel: ata2: SATA link down (SStatus 0 SControl 300) May 27 03:19:02.213100 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 May 27 03:19:02.214076 kernel: ata3.00: applying bridge limits May 27 03:19:02.214978 kernel: ata3.00: configured for UDMA/100 May 27 03:19:02.216988 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 May 27 03:19:02.219986 kernel: ata4: SATA link down (SStatus 0 SControl 300) May 27 03:19:02.272045 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray May 27 03:19:02.272319 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 May 27 03:19:02.294004 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 May 27 03:19:02.694673 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 27 03:19:02.695835 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 27 03:19:02.697375 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 03:19:02.700340 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 27 03:19:02.703655 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 27 03:19:02.741290 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 27 03:19:03.037978 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 27 03:19:03.038341 disk-uuid[638]: The operation has completed successfully. 
May 27 03:19:03.069659 systemd[1]: disk-uuid.service: Deactivated successfully. May 27 03:19:03.069782 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 27 03:19:03.101259 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 27 03:19:03.131754 sh[666]: Success May 27 03:19:03.150706 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 27 03:19:03.150762 kernel: device-mapper: uevent: version 1.0.3 May 27 03:19:03.150780 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 27 03:19:03.159994 kernel: device-mapper: verity: sha256 using shash "sha256-ni" May 27 03:19:03.190540 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 27 03:19:03.194854 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 27 03:19:03.213518 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. May 27 03:19:03.219986 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 27 03:19:03.220013 kernel: BTRFS: device fsid f0f66fe8-3990-49eb-980e-559a3dfd3522 devid 1 transid 40 /dev/mapper/usr (253:0) scanned by mount (678) May 27 03:19:03.221392 kernel: BTRFS info (device dm-0): first mount of filesystem f0f66fe8-3990-49eb-980e-559a3dfd3522 May 27 03:19:03.222293 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 27 03:19:03.222320 kernel: BTRFS info (device dm-0): using free-space-tree May 27 03:19:03.227581 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 27 03:19:03.229776 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 27 03:19:03.232047 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. 
May 27 03:19:03.234691 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 27 03:19:03.237580 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 27 03:19:03.265985 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 (254:6) scanned by mount (710) May 27 03:19:03.268507 kernel: BTRFS info (device vda6): first mount of filesystem fd7bb961-7a0f-4c90-a609-3bffeb956d05 May 27 03:19:03.268531 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 27 03:19:03.268542 kernel: BTRFS info (device vda6): using free-space-tree May 27 03:19:03.276988 kernel: BTRFS info (device vda6): last unmount of filesystem fd7bb961-7a0f-4c90-a609-3bffeb956d05 May 27 03:19:03.278458 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 27 03:19:03.280596 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 27 03:19:03.532156 ignition[754]: Ignition 2.21.0 May 27 03:19:03.532171 ignition[754]: Stage: fetch-offline May 27 03:19:03.532231 ignition[754]: no configs at "/usr/lib/ignition/base.d" May 27 03:19:03.532242 ignition[754]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 27 03:19:03.532374 ignition[754]: parsed url from cmdline: "" May 27 03:19:03.532378 ignition[754]: no config URL provided May 27 03:19:03.532383 ignition[754]: reading system config file "/usr/lib/ignition/user.ign" May 27 03:19:03.532394 ignition[754]: no config at "/usr/lib/ignition/user.ign" May 27 03:19:03.532438 ignition[754]: op(1): [started] loading QEMU firmware config module May 27 03:19:03.532444 ignition[754]: op(1): executing: "modprobe" "qemu_fw_cfg" May 27 03:19:03.539944 ignition[754]: op(1): [finished] loading QEMU firmware config module May 27 03:19:03.539996 ignition[754]: QEMU firmware config was not found. Ignoring... May 27 03:19:03.547359 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
May 27 03:19:03.550423 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 27 03:19:03.588667 ignition[754]: parsing config with SHA512: 216e250286acc0d98d0dd16424ebf5713d92629e9bedb7affbb07ecfa76fd9ccfbef9ec5c8e87bb9c5984d6e9e2d78354618da23f9cd8ac8abdfc15acd54de21 May 27 03:19:03.594522 unknown[754]: fetched base config from "system" May 27 03:19:03.594543 unknown[754]: fetched user config from "qemu" May 27 03:19:03.595252 ignition[754]: fetch-offline: fetch-offline passed May 27 03:19:03.596231 systemd-networkd[857]: lo: Link UP May 27 03:19:03.595384 ignition[754]: Ignition finished successfully May 27 03:19:03.596235 systemd-networkd[857]: lo: Gained carrier May 27 03:19:03.597934 systemd-networkd[857]: Enumeration completed May 27 03:19:03.598065 systemd[1]: Started systemd-networkd.service - Network Configuration. May 27 03:19:03.598427 systemd-networkd[857]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 03:19:03.598432 systemd-networkd[857]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 03:19:03.600634 systemd-networkd[857]: eth0: Link UP May 27 03:19:03.600638 systemd-networkd[857]: eth0: Gained carrier May 27 03:19:03.600646 systemd-networkd[857]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 03:19:03.601643 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 27 03:19:03.604241 systemd[1]: Reached target network.target - Network. May 27 03:19:03.606206 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). May 27 03:19:03.607473 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
May 27 03:19:03.619009 systemd-networkd[857]: eth0: DHCPv4 address 10.0.0.98/16, gateway 10.0.0.1 acquired from 10.0.0.1 May 27 03:19:03.642476 ignition[861]: Ignition 2.21.0 May 27 03:19:03.642492 ignition[861]: Stage: kargs May 27 03:19:03.642711 ignition[861]: no configs at "/usr/lib/ignition/base.d" May 27 03:19:03.642722 ignition[861]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 27 03:19:03.646437 ignition[861]: kargs: kargs passed May 27 03:19:03.646538 ignition[861]: Ignition finished successfully May 27 03:19:03.652991 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 27 03:19:03.655218 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 27 03:19:03.743337 ignition[870]: Ignition 2.21.0 May 27 03:19:03.743353 ignition[870]: Stage: disks May 27 03:19:03.743538 ignition[870]: no configs at "/usr/lib/ignition/base.d" May 27 03:19:03.743551 ignition[870]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 27 03:19:03.745078 ignition[870]: disks: disks passed May 27 03:19:03.745153 ignition[870]: Ignition finished successfully May 27 03:19:03.748823 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 27 03:19:03.751464 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 27 03:19:03.753628 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 27 03:19:03.756344 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 03:19:03.758393 systemd[1]: Reached target sysinit.target - System Initialization. May 27 03:19:03.760460 systemd[1]: Reached target basic.target - Basic System. May 27 03:19:03.763435 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
May 27 03:19:03.803437 systemd-fsck[880]: ROOT: clean, 15/553520 files, 52789/553472 blocks May 27 03:19:03.806419 systemd-resolved[262]: Detected conflict on linux IN A 10.0.0.98 May 27 03:19:03.806450 systemd-resolved[262]: Hostname conflict, changing published hostname from 'linux' to 'linux5'. May 27 03:19:03.812304 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 27 03:19:03.815353 systemd[1]: Mounting sysroot.mount - /sysroot... May 27 03:19:03.928120 kernel: EXT4-fs (vda9): mounted filesystem 18301365-b380-45d7-9677-e42472a122bc r/w with ordered data mode. Quota mode: none. May 27 03:19:03.928533 systemd[1]: Mounted sysroot.mount - /sysroot. May 27 03:19:03.929385 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 27 03:19:03.931916 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 03:19:03.934124 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 27 03:19:03.938048 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. May 27 03:19:03.938105 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 27 03:19:03.938137 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 27 03:19:03.955291 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 27 03:19:03.958039 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
May 27 03:19:03.962082 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 (254:6) scanned by mount (888) May 27 03:19:03.963983 kernel: BTRFS info (device vda6): first mount of filesystem fd7bb961-7a0f-4c90-a609-3bffeb956d05 May 27 03:19:03.964003 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 27 03:19:03.964983 kernel: BTRFS info (device vda6): using free-space-tree May 27 03:19:03.969069 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 27 03:19:04.014213 initrd-setup-root[912]: cut: /sysroot/etc/passwd: No such file or directory May 27 03:19:04.018807 initrd-setup-root[919]: cut: /sysroot/etc/group: No such file or directory May 27 03:19:04.023205 initrd-setup-root[926]: cut: /sysroot/etc/shadow: No such file or directory May 27 03:19:04.027395 initrd-setup-root[933]: cut: /sysroot/etc/gshadow: No such file or directory May 27 03:19:04.132317 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 27 03:19:04.133757 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 27 03:19:04.136579 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 27 03:19:04.154019 kernel: BTRFS info (device vda6): last unmount of filesystem fd7bb961-7a0f-4c90-a609-3bffeb956d05 May 27 03:19:04.171162 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 27 03:19:04.195342 ignition[1004]: INFO : Ignition 2.21.0 May 27 03:19:04.195342 ignition[1004]: INFO : Stage: mount May 27 03:19:04.198281 ignition[1004]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 03:19:04.199326 ignition[1004]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 27 03:19:04.201291 ignition[1004]: INFO : mount: mount passed May 27 03:19:04.202100 ignition[1004]: INFO : Ignition finished successfully May 27 03:19:04.205624 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 27 03:19:04.207467 systemd[1]: Starting ignition-files.service - Ignition (files)... 
May 27 03:19:04.219119 systemd[1]: sysroot-oem.mount: Deactivated successfully.
May 27 03:19:04.232114 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 27 03:19:04.257972 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 (254:6) scanned by mount (1016)
May 27 03:19:04.260091 kernel: BTRFS info (device vda6): first mount of filesystem fd7bb961-7a0f-4c90-a609-3bffeb956d05
May 27 03:19:04.260120 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
May 27 03:19:04.260132 kernel: BTRFS info (device vda6): using free-space-tree
May 27 03:19:04.264097 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 27 03:19:04.310148 ignition[1033]: INFO : Ignition 2.21.0
May 27 03:19:04.310148 ignition[1033]: INFO : Stage: files
May 27 03:19:04.311880 ignition[1033]: INFO : no configs at "/usr/lib/ignition/base.d"
May 27 03:19:04.311880 ignition[1033]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 27 03:19:04.311880 ignition[1033]: DEBUG : files: compiled without relabeling support, skipping
May 27 03:19:04.311880 ignition[1033]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
May 27 03:19:04.311880 ignition[1033]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
May 27 03:19:04.318315 ignition[1033]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
May 27 03:19:04.318315 ignition[1033]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
May 27 03:19:04.318315 ignition[1033]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
May 27 03:19:04.318315 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
May 27 03:19:04.318315 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
May 27 03:19:04.314657 unknown[1033]: wrote ssh authorized keys file for user: core
May 27 03:19:04.357498 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
May 27 03:19:04.640624 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
May 27 03:19:04.643926 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
May 27 03:19:04.643926 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
May 27 03:19:04.643926 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
May 27 03:19:04.643926 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
May 27 03:19:04.643926 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
May 27 03:19:04.643926 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
May 27 03:19:04.643926 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 27 03:19:04.657665 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 27 03:19:04.657665 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
May 27 03:19:04.657665 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
May 27 03:19:04.657665 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
May 27 03:19:04.657665 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
May 27 03:19:04.657665 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
May 27 03:19:04.657665 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1
May 27 03:19:05.379170 systemd-networkd[857]: eth0: Gained IPv6LL
May 27 03:19:05.408409 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
May 27 03:19:05.831831 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
May 27 03:19:05.831831 ignition[1033]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
May 27 03:19:05.836202 ignition[1033]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 27 03:19:05.842411 ignition[1033]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 27 03:19:05.842411 ignition[1033]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
May 27 03:19:05.842411 ignition[1033]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
May 27 03:19:05.848442 ignition[1033]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
May 27 03:19:05.848442 ignition[1033]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
May 27 03:19:05.848442 ignition[1033]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
May 27 03:19:05.848442 ignition[1033]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
May 27 03:19:05.870657 ignition[1033]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
May 27 03:19:05.875921 ignition[1033]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
May 27 03:19:05.877683 ignition[1033]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
May 27 03:19:05.877683 ignition[1033]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
May 27 03:19:05.877683 ignition[1033]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
May 27 03:19:05.877683 ignition[1033]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
May 27 03:19:05.877683 ignition[1033]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
May 27 03:19:05.877683 ignition[1033]: INFO : files: files passed
May 27 03:19:05.877683 ignition[1033]: INFO : Ignition finished successfully
May 27 03:19:05.880135 systemd[1]: Finished ignition-files.service - Ignition (files).
May 27 03:19:05.883071 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
May 27 03:19:05.902861 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
May 27 03:19:05.907086 systemd[1]: ignition-quench.service: Deactivated successfully.
May 27 03:19:05.907259 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
May 27 03:19:05.917690 initrd-setup-root-after-ignition[1062]: grep: /sysroot/oem/oem-release: No such file or directory
May 27 03:19:05.922113 initrd-setup-root-after-ignition[1064]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 27 03:19:05.922113 initrd-setup-root-after-ignition[1064]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
May 27 03:19:05.925680 initrd-setup-root-after-ignition[1068]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 27 03:19:05.926647 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
May 27 03:19:05.928083 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
May 27 03:19:05.931377 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
May 27 03:19:06.002717 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
May 27 03:19:06.002873 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
May 27 03:19:06.005182 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
May 27 03:19:06.007243 systemd[1]: Reached target initrd.target - Initrd Default Target.
May 27 03:19:06.009424 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
May 27 03:19:06.010350 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
May 27 03:19:06.051248 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 27 03:19:06.053159 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
May 27 03:19:06.083232 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
May 27 03:19:06.083657 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
May 27 03:19:06.084077 systemd[1]: Stopped target timers.target - Timer Units.
May 27 03:19:06.084358 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
May 27 03:19:06.084487 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 27 03:19:06.085214 systemd[1]: Stopped target initrd.target - Initrd Default Target.
May 27 03:19:06.085522 systemd[1]: Stopped target basic.target - Basic System.
May 27 03:19:06.085857 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
May 27 03:19:06.128520 ignition[1088]: INFO : Ignition 2.21.0
May 27 03:19:06.128520 ignition[1088]: INFO : Stage: umount
May 27 03:19:06.128520 ignition[1088]: INFO : no configs at "/usr/lib/ignition/base.d"
May 27 03:19:06.128520 ignition[1088]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 27 03:19:06.128520 ignition[1088]: INFO : umount: umount passed
May 27 03:19:06.128520 ignition[1088]: INFO : Ignition finished successfully
May 27 03:19:06.086361 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
May 27 03:19:06.086695 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
May 27 03:19:06.087215 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
May 27 03:19:06.087547 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
May 27 03:19:06.087901 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
May 27 03:19:06.088259 systemd[1]: Stopped target sysinit.target - System Initialization.
May 27 03:19:06.088567 systemd[1]: Stopped target local-fs.target - Local File Systems.
May 27 03:19:06.088905 systemd[1]: Stopped target swap.target - Swaps.
May 27 03:19:06.089223 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
May 27 03:19:06.089334 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
May 27 03:19:06.089912 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
May 27 03:19:06.090257 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 03:19:06.090568 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
May 27 03:19:06.090723 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 03:19:06.091281 systemd[1]: dracut-initqueue.service: Deactivated successfully.
May 27 03:19:06.091405 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
May 27 03:19:06.092129 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
May 27 03:19:06.092254 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
May 27 03:19:06.092703 systemd[1]: Stopped target paths.target - Path Units.
May 27 03:19:06.092971 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
May 27 03:19:06.098015 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 03:19:06.098460 systemd[1]: Stopped target slices.target - Slice Units.
May 27 03:19:06.098783 systemd[1]: Stopped target sockets.target - Socket Units.
May 27 03:19:06.099303 systemd[1]: iscsid.socket: Deactivated successfully.
May 27 03:19:06.099408 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
May 27 03:19:06.099825 systemd[1]: iscsiuio.socket: Deactivated successfully.
May 27 03:19:06.099923 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 27 03:19:06.100373 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
May 27 03:19:06.100515 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
May 27 03:19:06.100869 systemd[1]: ignition-files.service: Deactivated successfully.
May 27 03:19:06.101016 systemd[1]: Stopped ignition-files.service - Ignition (files).
May 27 03:19:06.102319 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
May 27 03:19:06.103555 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
May 27 03:19:06.104216 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
May 27 03:19:06.104362 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
May 27 03:19:06.104829 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
May 27 03:19:06.104977 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
May 27 03:19:06.111494 systemd[1]: initrd-cleanup.service: Deactivated successfully.
May 27 03:19:06.111638 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
May 27 03:19:06.130944 systemd[1]: ignition-mount.service: Deactivated successfully.
May 27 03:19:06.131120 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
May 27 03:19:06.134279 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 27 03:19:06.134828 systemd[1]: Stopped target network.target - Network.
May 27 03:19:06.136719 systemd[1]: ignition-disks.service: Deactivated successfully.
May 27 03:19:06.136784 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
May 27 03:19:06.139007 systemd[1]: ignition-kargs.service: Deactivated successfully.
May 27 03:19:06.139066 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
May 27 03:19:06.141484 systemd[1]: ignition-setup.service: Deactivated successfully.
May 27 03:19:06.141551 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
May 27 03:19:06.143680 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
May 27 03:19:06.143737 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
May 27 03:19:06.145931 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
May 27 03:19:06.148392 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
May 27 03:19:06.156409 systemd[1]: systemd-resolved.service: Deactivated successfully.
May 27 03:19:06.156548 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
May 27 03:19:06.161510 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
May 27 03:19:06.161854 systemd[1]: systemd-networkd.service: Deactivated successfully.
May 27 03:19:06.162241 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
May 27 03:19:06.166350 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
May 27 03:19:06.167335 systemd[1]: Stopped target network-pre.target - Preparation for Network.
May 27 03:19:06.169998 systemd[1]: systemd-networkd.socket: Deactivated successfully.
May 27 03:19:06.170050 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
May 27 03:19:06.173117 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
May 27 03:19:06.174681 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
May 27 03:19:06.174755 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 27 03:19:06.176983 systemd[1]: systemd-sysctl.service: Deactivated successfully.
May 27 03:19:06.177058 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
May 27 03:19:06.180111 systemd[1]: systemd-modules-load.service: Deactivated successfully.
May 27 03:19:06.180180 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
May 27 03:19:06.182259 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
May 27 03:19:06.182312 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
May 27 03:19:06.185723 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 27 03:19:06.189307 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
May 27 03:19:06.189404 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
May 27 03:19:06.216638 systemd[1]: network-cleanup.service: Deactivated successfully.
May 27 03:19:06.216774 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 27 03:19:06.230711 systemd[1]: systemd-udevd.service: Deactivated successfully.
May 27 03:19:06.230906 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 27 03:19:06.232464 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
May 27 03:19:06.232510 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
May 27 03:19:06.234698 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
May 27 03:19:06.234734 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 03:19:06.236872 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
May 27 03:19:06.236923 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
May 27 03:19:06.241520 systemd[1]: dracut-cmdline.service: Deactivated successfully.
May 27 03:19:06.241571 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
May 27 03:19:06.245376 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 27 03:19:06.245427 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 27 03:19:06.250425 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
May 27 03:19:06.250869 systemd[1]: systemd-network-generator.service: Deactivated successfully.
May 27 03:19:06.250922 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
May 27 03:19:06.255428 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
May 27 03:19:06.255478 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 27 03:19:06.258721 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
May 27 03:19:06.258767 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 27 03:19:06.262104 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
May 27 03:19:06.262154 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 03:19:06.262642 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 27 03:19:06.262683 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 27 03:19:06.269006 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
May 27 03:19:06.269063 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
May 27 03:19:06.269125 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
May 27 03:19:06.269173 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 27 03:19:06.285066 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 27 03:19:06.285223 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 27 03:19:06.332154 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 27 03:19:06.332306 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 27 03:19:06.332934 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 27 03:19:06.337477 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 27 03:19:06.337545 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 27 03:19:06.341378 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 27 03:19:06.369276 systemd[1]: Switching root.
May 27 03:19:06.407631 systemd-journald[219]: Journal stopped
May 27 03:19:07.928050 systemd-journald[219]: Received SIGTERM from PID 1 (systemd).
May 27 03:19:07.928153 kernel: SELinux: policy capability network_peer_controls=1
May 27 03:19:07.928175 kernel: SELinux: policy capability open_perms=1
May 27 03:19:07.928193 kernel: SELinux: policy capability extended_socket_class=1
May 27 03:19:07.928210 kernel: SELinux: policy capability always_check_network=0
May 27 03:19:07.928235 kernel: SELinux: policy capability cgroup_seclabel=1
May 27 03:19:07.928256 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 27 03:19:07.928275 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 27 03:19:07.928290 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 27 03:19:07.928305 kernel: SELinux: policy capability userspace_initial_context=0
May 27 03:19:07.928354 kernel: audit: type=1403 audit(1748315946.864:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 27 03:19:07.928371 systemd[1]: Successfully loaded SELinux policy in 59.939ms.
May 27 03:19:07.928409 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 13.949ms.
May 27 03:19:07.928432 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 27 03:19:07.928452 systemd[1]: Detected virtualization kvm.
May 27 03:19:07.928468 systemd[1]: Detected architecture x86-64.
May 27 03:19:07.928484 systemd[1]: Detected first boot.
May 27 03:19:07.928500 systemd[1]: Initializing machine ID from VM UUID.
May 27 03:19:07.928516 zram_generator::config[1135]: No configuration found.
May 27 03:19:07.928534 kernel: Guest personality initialized and is inactive
May 27 03:19:07.928549 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
May 27 03:19:07.928564 kernel: Initialized host personality
May 27 03:19:07.928583 kernel: NET: Registered PF_VSOCK protocol family
May 27 03:19:07.928598 systemd[1]: Populated /etc with preset unit settings.
May 27 03:19:07.928616 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 27 03:19:07.928632 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 27 03:19:07.928648 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 27 03:19:07.928664 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 27 03:19:07.928680 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 27 03:19:07.928697 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 27 03:19:07.928718 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 27 03:19:07.928748 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 27 03:19:07.928764 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 27 03:19:07.928797 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 27 03:19:07.928824 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 27 03:19:07.928843 systemd[1]: Created slice user.slice - User and Session Slice.
May 27 03:19:07.928859 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 03:19:07.928878 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 03:19:07.928894 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 27 03:19:07.928910 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 27 03:19:07.928932 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 27 03:19:07.929012 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 27 03:19:07.929031 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 27 03:19:07.929046 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 03:19:07.929063 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 27 03:19:07.930145 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 27 03:19:07.930170 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 27 03:19:07.930193 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 27 03:19:07.930209 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 27 03:19:07.930233 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 27 03:19:07.930251 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 27 03:19:07.930268 systemd[1]: Reached target slices.target - Slice Units.
May 27 03:19:07.930285 systemd[1]: Reached target swap.target - Swaps.
May 27 03:19:07.930302 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 27 03:19:07.930318 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 27 03:19:07.930335 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 27 03:19:07.930352 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 27 03:19:07.930374 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 27 03:19:07.930390 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 03:19:07.930415 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 27 03:19:07.930434 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 27 03:19:07.930451 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 27 03:19:07.930467 systemd[1]: Mounting media.mount - External Media Directory...
May 27 03:19:07.930483 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 03:19:07.930515 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 27 03:19:07.930536 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 27 03:19:07.930553 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 27 03:19:07.930569 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 27 03:19:07.930586 systemd[1]: Reached target machines.target - Containers.
May 27 03:19:07.930602 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 27 03:19:07.930619 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 03:19:07.930635 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 27 03:19:07.930651 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 27 03:19:07.930670 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 27 03:19:07.930690 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 27 03:19:07.930706 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 27 03:19:07.930726 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 27 03:19:07.930742 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 27 03:19:07.930758 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 27 03:19:07.930774 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 27 03:19:07.930800 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 27 03:19:07.930816 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 27 03:19:07.930836 systemd[1]: Stopped systemd-fsck-usr.service.
May 27 03:19:07.930853 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 03:19:07.930869 systemd[1]: Starting systemd-journald.service - Journal Service...
May 27 03:19:07.930896 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 27 03:19:07.930912 kernel: fuse: init (API version 7.41)
May 27 03:19:07.930927 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 27 03:19:07.930943 kernel: loop: module loaded
May 27 03:19:07.930981 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 27 03:19:07.930999 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 27 03:19:07.931019 kernel: ACPI: bus type drm_connector registered
May 27 03:19:07.931044 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 27 03:19:07.931063 systemd[1]: verity-setup.service: Deactivated successfully.
May 27 03:19:07.931093 systemd[1]: Stopped verity-setup.service.
May 27 03:19:07.931112 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 03:19:07.931174 systemd-journald[1199]: Collecting audit messages is disabled.
May 27 03:19:07.931218 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 27 03:19:07.931236 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 27 03:19:07.931252 systemd[1]: Mounted media.mount - External Media Directory.
May 27 03:19:07.931272 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 27 03:19:07.931288 systemd-journald[1199]: Journal started
May 27 03:19:07.931318 systemd-journald[1199]: Runtime Journal (/run/log/journal/40158bd592cc4f049c3add3782d557cf) is 6M, max 48.2M, 42.2M free.
May 27 03:19:07.549914 systemd[1]: Queued start job for default target multi-user.target.
May 27 03:19:07.932869 systemd[1]: Started systemd-journald.service - Journal Service.
May 27 03:19:07.572230 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
May 27 03:19:07.572735 systemd[1]: systemd-journald.service: Deactivated successfully.
May 27 03:19:07.935773 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 27 03:19:07.938353 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 27 03:19:07.939796 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 03:19:07.941560 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 27 03:19:07.941788 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 27 03:19:07.943462 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 27 03:19:07.943684 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 27 03:19:07.945320 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 27 03:19:07.945539 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 27 03:19:07.947910 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 27 03:19:07.948137 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 27 03:19:07.949896 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 27 03:19:07.950133 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 27 03:19:07.951671 systemd[1]: modprobe@loop.service: Deactivated successfully. May 27 03:19:07.951892 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 27 03:19:07.953475 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 27 03:19:07.955261 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 27 03:19:07.957033 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 27 03:19:07.958911 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 27 03:19:07.972578 systemd[1]: Reached target network-pre.target - Preparation for Network. May 27 03:19:07.978058 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 27 03:19:07.981110 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 27 03:19:07.983032 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 27 03:19:07.983070 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 03:19:07.985110 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 27 03:19:08.008586 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... 
May 27 03:19:08.009790 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 03:19:08.025927 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 27 03:19:08.028977 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 27 03:19:08.030841 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 27 03:19:08.040167 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 27 03:19:08.042237 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 27 03:19:08.043382 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 27 03:19:08.047179 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 27 03:19:08.048432 systemd-journald[1199]: Time spent on flushing to /var/log/journal/40158bd592cc4f049c3add3782d557cf is 21.396ms for 1039 entries.
May 27 03:19:08.048432 systemd-journald[1199]: System Journal (/var/log/journal/40158bd592cc4f049c3add3782d557cf) is 8M, max 195.6M, 187.6M free.
May 27 03:19:08.431227 systemd-journald[1199]: Received client request to flush runtime journal.
May 27 03:19:08.431303 kernel: loop0: detected capacity change from 0 to 146240
May 27 03:19:08.431331 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 27 03:19:08.431352 kernel: loop1: detected capacity change from 0 to 224512
May 27 03:19:08.053153 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 27 03:19:08.060361 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 27 03:19:08.097503 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 27 03:19:08.121080 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 27 03:19:08.155197 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 27 03:19:08.159219 systemd-tmpfiles[1247]: ACLs are not supported, ignoring.
May 27 03:19:08.159232 systemd-tmpfiles[1247]: ACLs are not supported, ignoring.
May 27 03:19:08.165296 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 27 03:19:08.317222 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 27 03:19:08.318693 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 27 03:19:08.323157 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
May 27 03:19:08.338466 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 27 03:19:08.342637 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 27 03:19:08.436032 kernel: loop2: detected capacity change from 0 to 113872
May 27 03:19:08.435963 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 27 03:19:08.449371 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 27 03:19:08.455392 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 27 03:19:08.511519 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
May 27 03:19:08.532003 kernel: loop3: detected capacity change from 0 to 146240
May 27 03:19:08.532189 systemd-tmpfiles[1275]: ACLs are not supported, ignoring.
May 27 03:19:08.532203 systemd-tmpfiles[1275]: ACLs are not supported, ignoring.
May 27 03:19:08.538393 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 27 03:19:08.549983 kernel: loop4: detected capacity change from 0 to 224512
May 27 03:19:08.560003 kernel: loop5: detected capacity change from 0 to 113872
May 27 03:19:08.566370 (sd-merge)[1279]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
May 27 03:19:08.567332 (sd-merge)[1279]: Merged extensions into '/usr'.
May 27 03:19:08.584448 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
May 27 03:19:08.589136 systemd[1]: Reload requested from client PID 1246 ('systemd-sysext') (unit systemd-sysext.service)...
May 27 03:19:08.589158 systemd[1]: Reloading...
May 27 03:19:08.704020 zram_generator::config[1307]: No configuration found.
May 27 03:19:08.859408 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 27 03:19:08.940899 ldconfig[1241]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
May 27 03:19:08.952639 systemd[1]: Reloading finished in 362 ms.
May 27 03:19:08.979344 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
May 27 03:19:08.981213 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
May 27 03:19:09.003435 systemd[1]: Starting ensure-sysext.service...
May 27 03:19:09.005376 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 27 03:19:09.016719 systemd[1]: Reload requested from client PID 1343 ('systemctl') (unit ensure-sysext.service)...
May 27 03:19:09.016737 systemd[1]: Reloading...
May 27 03:19:09.036150 systemd-tmpfiles[1344]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
May 27 03:19:09.036191 systemd-tmpfiles[1344]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
May 27 03:19:09.036497 systemd-tmpfiles[1344]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
May 27 03:19:09.036747 systemd-tmpfiles[1344]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
May 27 03:19:09.037664 systemd-tmpfiles[1344]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
May 27 03:19:09.037974 systemd-tmpfiles[1344]: ACLs are not supported, ignoring.
May 27 03:19:09.038050 systemd-tmpfiles[1344]: ACLs are not supported, ignoring.
May 27 03:19:09.042728 systemd-tmpfiles[1344]: Detected autofs mount point /boot during canonicalization of boot.
May 27 03:19:09.042741 systemd-tmpfiles[1344]: Skipping /boot
May 27 03:19:09.063863 systemd-tmpfiles[1344]: Detected autofs mount point /boot during canonicalization of boot.
May 27 03:19:09.064045 systemd-tmpfiles[1344]: Skipping /boot
May 27 03:19:09.199002 zram_generator::config[1374]: No configuration found.
May 27 03:19:09.403577 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 27 03:19:09.485286 systemd[1]: Reloading finished in 468 ms.
May 27 03:19:09.512802 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
May 27 03:19:09.552138 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 27 03:19:09.562592 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 27 03:19:09.565505 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
May 27 03:19:09.590881 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
May 27 03:19:09.594608 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 27 03:19:09.600221 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 27 03:19:09.603430 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
May 27 03:19:09.608171 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 03:19:09.608358 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 03:19:09.610603 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 27 03:19:09.613240 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 27 03:19:09.617128 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 27 03:19:09.618294 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 03:19:09.618402 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 03:19:09.620359 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
May 27 03:19:09.621601 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 03:19:09.622924 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 27 03:19:09.623175 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 27 03:19:09.626387 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 27 03:19:09.626605 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 27 03:19:09.655015 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
May 27 03:19:09.657162 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 27 03:19:09.657393 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 27 03:19:09.668784 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
May 27 03:19:09.675421 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 03:19:09.675718 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 03:19:09.677256 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 27 03:19:09.678252 systemd-udevd[1415]: Using default interface naming scheme 'v255'.
May 27 03:19:09.711240 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 27 03:19:09.714152 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 27 03:19:09.717444 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 27 03:19:09.718717 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 03:19:09.718861 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 03:19:09.725270 systemd[1]: Starting systemd-update-done.service - Update is Completed...
May 27 03:19:09.726392 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 03:19:09.727717 systemd[1]: Started systemd-userdbd.service - User Database Manager.
May 27 03:19:09.729633 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 27 03:19:09.729874 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 27 03:19:09.731526 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 27 03:19:09.731755 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 27 03:19:09.734112 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 27 03:19:09.734331 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 27 03:19:09.736076 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 27 03:19:09.736306 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 27 03:19:09.740474 systemd[1]: Finished ensure-sysext.service.
May 27 03:19:09.748694 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 27 03:19:09.748770 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 27 03:19:09.759089 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
May 27 03:19:09.762311 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 27 03:19:09.764150 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
May 27 03:19:09.779372 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 27 03:19:09.780422 augenrules[1474]: No rules
May 27 03:19:09.781510 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
May 27 03:19:09.784928 systemd[1]: audit-rules.service: Deactivated successfully.
May 27 03:19:09.785324 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 27 03:19:09.790217 systemd[1]: Finished systemd-update-done.service - Update is Completed.
May 27 03:19:09.870153 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
May 27 03:19:09.982982 kernel: mousedev: PS/2 mouse device common for all mice
May 27 03:19:10.006729 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
May 27 03:19:10.009970 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
May 27 03:19:10.027006 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4
May 27 03:19:10.033732 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
May 27 03:19:10.050986 kernel: ACPI: button: Power Button [PWRF]
May 27 03:19:10.121619 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
May 27 03:19:10.123248 systemd[1]: Reached target time-set.target - System Time Set.
May 27 03:19:10.173350 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device
May 27 03:19:10.173686 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
May 27 03:19:10.190144 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
May 27 03:19:10.191631 systemd-networkd[1475]: lo: Link UP
May 27 03:19:10.191649 systemd-networkd[1475]: lo: Gained carrier
May 27 03:19:10.194136 systemd-networkd[1475]: Enumeration completed
May 27 03:19:10.194667 systemd-networkd[1475]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 27 03:19:10.194680 systemd-networkd[1475]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 27 03:19:10.195201 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 27 03:19:10.198313 systemd-networkd[1475]: eth0: Link UP
May 27 03:19:10.198525 systemd-networkd[1475]: eth0: Gained carrier
May 27 03:19:10.198547 systemd-networkd[1475]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 27 03:19:10.199054 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
May 27 03:19:10.204251 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
May 27 03:19:10.221000 systemd-networkd[1475]: eth0: DHCPv4 address 10.0.0.98/16, gateway 10.0.0.1 acquired from 10.0.0.1
May 27 03:19:10.222005 systemd-timesyncd[1458]: Network configuration changed, trying to establish connection.
May 27 03:19:10.969070 systemd-resolved[1413]: Positive Trust Anchors:
May 27 03:19:10.969088 systemd-resolved[1413]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 27 03:19:10.969104 systemd-timesyncd[1458]: Contacted time server 10.0.0.1:123 (10.0.0.1).
May 27 03:19:10.969120 systemd-resolved[1413]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 27 03:19:10.969175 systemd-timesyncd[1458]: Initial clock synchronization to Tue 2025-05-27 03:19:10.968912 UTC.
May 27 03:19:10.976714 systemd-resolved[1413]: Defaulting to hostname 'linux'.
May 27 03:19:10.979524 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 27 03:19:10.980942 systemd[1]: Reached target network.target - Network.
May 27 03:19:10.981928 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 27 03:19:10.983196 systemd[1]: Reached target sysinit.target - System Initialization.
May 27 03:19:10.984538 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
May 27 03:19:10.986553 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
May 27 03:19:10.987854 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
May 27 03:19:10.989307 systemd[1]: Started logrotate.timer - Daily rotation of log files.
May 27 03:19:11.004811 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
May 27 03:19:11.022916 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
May 27 03:19:11.024324 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
May 27 03:19:11.024378 systemd[1]: Reached target paths.target - Path Units.
May 27 03:19:11.025340 systemd[1]: Reached target timers.target - Timer Units.
May 27 03:19:11.027470 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
May 27 03:19:11.031208 systemd[1]: Starting docker.socket - Docker Socket for the API...
May 27 03:19:11.038203 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
May 27 03:19:11.039675 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
May 27 03:19:11.040985 systemd[1]: Reached target ssh-access.target - SSH Access Available.
May 27 03:19:11.052744 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
May 27 03:19:11.054674 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
May 27 03:19:11.057417 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
May 27 03:19:11.059999 systemd[1]: Listening on docker.socket - Docker Socket for the API.
May 27 03:19:11.079467 systemd[1]: Reached target sockets.target - Socket Units.
May 27 03:19:11.081073 systemd[1]: Reached target basic.target - Basic System.
May 27 03:19:11.082652 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
May 27 03:19:11.083158 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
May 27 03:19:11.085798 systemd[1]: Starting containerd.service - containerd container runtime...
May 27 03:19:11.088139 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
May 27 03:19:11.093060 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
May 27 03:19:11.095427 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
May 27 03:19:11.104027 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
May 27 03:19:11.105259 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
May 27 03:19:11.108535 kernel: kvm_amd: TSC scaling supported
May 27 03:19:11.108571 kernel: kvm_amd: Nested Virtualization enabled
May 27 03:19:11.108584 kernel: kvm_amd: Nested Paging enabled
May 27 03:19:11.108597 kernel: kvm_amd: LBR virtualization supported
May 27 03:19:11.109734 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
May 27 03:19:11.109801 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported
May 27 03:19:11.109817 kernel: kvm_amd: Virtual GIF supported
May 27 03:19:11.114989 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
May 27 03:19:11.118509 jq[1535]: false
May 27 03:19:11.119711 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
May 27 03:19:11.123498 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
May 27 03:19:11.137457 extend-filesystems[1536]: Found loop3
May 27 03:19:11.137457 extend-filesystems[1536]: Found loop4
May 27 03:19:11.137457 extend-filesystems[1536]: Found loop5
May 27 03:19:11.137457 extend-filesystems[1536]: Found sr0
May 27 03:19:11.137457 extend-filesystems[1536]: Found vda
May 27 03:19:11.137457 extend-filesystems[1536]: Found vda1
May 27 03:19:11.137457 extend-filesystems[1536]: Found vda2
May 27 03:19:11.137457 extend-filesystems[1536]: Found vda3
May 27 03:19:11.137457 extend-filesystems[1536]: Found usr
May 27 03:19:11.137457 extend-filesystems[1536]: Found vda4
May 27 03:19:11.137457 extend-filesystems[1536]: Found vda6
May 27 03:19:11.137457 extend-filesystems[1536]: Found vda7
May 27 03:19:11.137457 extend-filesystems[1536]: Found vda9
May 27 03:19:11.137457 extend-filesystems[1536]: Checking size of /dev/vda9
May 27 03:19:11.216097 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks
May 27 03:19:11.198606 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
May 27 03:19:11.216209 extend-filesystems[1536]: Resized partition /dev/vda9
May 27 03:19:11.215934 systemd[1]: Starting systemd-logind.service - User Login Management...
May 27 03:19:11.219819 extend-filesystems[1547]: resize2fs 1.47.2 (1-Jan-2025)
May 27 03:19:11.223720 google_oslogin_nss_cache[1537]: oslogin_cache_refresh[1537]: Refreshing passwd entry cache
May 27 03:19:11.219561 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 03:19:11.222989 oslogin_cache_refresh[1537]: Refreshing passwd entry cache
May 27 03:19:11.224540 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
May 27 03:19:11.225160 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
May 27 03:19:11.228702 systemd[1]: Starting update-engine.service - Update Engine...
May 27 03:19:11.231693 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
May 27 03:19:11.235464 kernel: EDAC MC: Ver: 3.0.0
May 27 03:19:11.241456 kernel: EXT4-fs (vda9): resized filesystem to 1864699
May 27 03:19:11.271028 extend-filesystems[1547]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
May 27 03:19:11.271028 extend-filesystems[1547]: old_desc_blocks = 1, new_desc_blocks = 1
May 27 03:19:11.271028 extend-filesystems[1547]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
May 27 03:19:11.273862 oslogin_cache_refresh[1537]: Failure getting users, quitting
May 27 03:19:11.290171 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
May 27 03:19:11.291976 google_oslogin_nss_cache[1537]: oslogin_cache_refresh[1537]: Failure getting users, quitting
May 27 03:19:11.291976 google_oslogin_nss_cache[1537]: oslogin_cache_refresh[1537]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
May 27 03:19:11.291976 google_oslogin_nss_cache[1537]: oslogin_cache_refresh[1537]: Refreshing group entry cache
May 27 03:19:11.291976 google_oslogin_nss_cache[1537]: oslogin_cache_refresh[1537]: Failure getting groups, quitting
May 27 03:19:11.291976 google_oslogin_nss_cache[1537]: oslogin_cache_refresh[1537]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
May 27 03:19:11.292096 extend-filesystems[1536]: Resized filesystem in /dev/vda9
May 27 03:19:11.273887 oslogin_cache_refresh[1537]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
May 27 03:19:11.291314 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
May 27 03:19:11.273944 oslogin_cache_refresh[1537]: Refreshing group entry cache
May 27 03:19:11.281038 oslogin_cache_refresh[1537]: Failure getting groups, quitting
May 27 03:19:11.281052 oslogin_cache_refresh[1537]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
May 27 03:19:11.292674 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
May 27 03:19:11.299019 jq[1556]: true
May 27 03:19:11.293278 systemd[1]: extend-filesystems.service: Deactivated successfully.
May 27 03:19:11.293578 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
May 27 03:19:11.294853 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
May 27 03:19:11.295087 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
May 27 03:19:11.297150 systemd[1]: motdgen.service: Deactivated successfully.
May 27 03:19:11.297394 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
May 27 03:19:11.300806 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
May 27 03:19:11.301343 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
May 27 03:19:11.309535 update_engine[1555]: I20250527 03:19:11.308641 1555 main.cc:92] Flatcar Update Engine starting
May 27 03:19:11.312237 (ntainerd)[1564]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
May 27 03:19:11.315754 jq[1563]: true
May 27 03:19:11.343675 tar[1562]: linux-amd64/LICENSE
May 27 03:19:11.349554 tar[1562]: linux-amd64/helm
May 27 03:19:11.382567 dbus-daemon[1533]: [system] SELinux support is enabled
May 27 03:19:11.383159 systemd[1]: Started dbus.service - D-Bus System Message Bus.
May 27 03:19:11.386890 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
May 27 03:19:11.386929 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
May 27 03:19:11.387008 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
May 27 03:19:11.387031 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
May 27 03:19:11.397919 update_engine[1555]: I20250527 03:19:11.397835 1555 update_check_scheduler.cc:74] Next update check in 9m45s
May 27 03:19:11.398122 systemd[1]: Started update-engine.service - Update Engine.
May 27 03:19:11.400744 systemd[1]: Started locksmithd.service - Cluster reboot manager.
May 27 03:19:11.431168 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 27 03:19:11.451628 systemd-logind[1549]: Watching system buttons on /dev/input/event2 (Power Button)
May 27 03:19:11.451659 systemd-logind[1549]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
May 27 03:19:11.453130 systemd-logind[1549]: New seat seat0.
May 27 03:19:11.456072 systemd[1]: Started systemd-logind.service - User Login Management.
May 27 03:19:11.459426 bash[1599]: Updated "/home/core/.ssh/authorized_keys"
May 27 03:19:11.465798 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
May 27 03:19:11.470351 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
May 27 03:19:11.514824 locksmithd[1591]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
May 27 03:19:11.579685 sshd_keygen[1579]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
May 27 03:19:11.676756 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
May 27 03:19:11.681786 systemd[1]: Starting issuegen.service - Generate /run/issue...
May 27 03:19:11.704536 systemd[1]: issuegen.service: Deactivated successfully.
May 27 03:19:11.704856 systemd[1]: Finished issuegen.service - Generate /run/issue.
May 27 03:19:11.705039 containerd[1564]: time="2025-05-27T03:19:11Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
May 27 03:19:11.710460 containerd[1564]: time="2025-05-27T03:19:11.708608413Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4
May 27 03:19:11.709664 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
May 27 03:19:11.798798 containerd[1564]: time="2025-05-27T03:19:11.797144641Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="24.676µs"
May 27 03:19:11.798798 containerd[1564]: time="2025-05-27T03:19:11.797222878Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
May 27 03:19:11.798798 containerd[1564]: time="2025-05-27T03:19:11.797270588Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
May 27 03:19:11.798798 containerd[1564]: time="2025-05-27T03:19:11.797537027Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
May 27 03:19:11.798798 containerd[1564]: time="2025-05-27T03:19:11.797554911Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
May 27 03:19:11.798798 containerd[1564]: time="2025-05-27T03:19:11.797590117Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 27 03:19:11.798798 containerd[1564]: time="2025-05-27T03:19:11.797666971Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 27 03:19:11.798798 containerd[1564]: time="2025-05-27T03:19:11.797678373Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 27 03:19:11.798798 containerd[1564]: time="2025-05-27T03:19:11.798037266Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 27 03:19:11.798798 containerd[1564]: time="2025-05-27T03:19:11.798053266Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 27 03:19:11.798798 containerd[1564]: time="2025-05-27T03:19:11.798064737Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 27 03:19:11.798798 containerd[1564]: time="2025-05-27T03:19:11.798072492Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native
May 27 03:19:11.799122 containerd[1564]: time="2025-05-27T03:19:11.798188800Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
May 27 03:19:11.799122 containerd[1564]: time="2025-05-27T03:19:11.798503109Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 27 03:19:11.799122 containerd[1564]: time="2025-05-27T03:19:11.798538486Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 27 03:19:11.799122 containerd[1564]: time="2025-05-27T03:19:11.798548274Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
May 27 03:19:11.799122 containerd[1564]: time="2025-05-27T03:19:11.798601244Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
May 27 03:19:11.799689 containerd[1564]: time="2025-05-27T03:19:11.799654860Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
May 27 03:19:11.799781 containerd[1564]: time="2025-05-27T03:19:11.799751842Z" level=info msg="metadata content store policy set" policy=shared
May 27 03:19:11.810180 containerd[1564]: time="2025-05-27T03:19:11.810147681Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
May 27 03:19:11.810229 containerd[1564]: time="2025-05-27T03:19:11.810208886Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
May 27 03:19:11.810229 containerd[1564]: time="2025-05-27T03:19:11.810223413Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
May 27 03:19:11.810267 containerd[1564]: time="2025-05-27T03:19:11.810234755Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
May 27 03:19:11.810267 containerd[1564]: time="2025-05-27T03:19:11.810247569Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
May 27 03:19:11.810267 containerd[1564]: time="2025-05-27T03:19:11.810259431Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
May 27 03:19:11.810335 containerd[1564]: time="2025-05-27T03:19:11.810272235Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
May 27 03:19:11.810335 containerd[1564]: time="2025-05-27T03:19:11.810285961Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
May 27 03:19:11.810335 containerd[1564]: time="2025-05-27T03:19:11.810297813Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
May 27 03:19:11.810335 containerd[1564]: time="2025-05-27T03:19:11.810307261Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
May 27 03:19:11.810335 containerd[1564]: time="2025-05-27T03:19:11.810315797Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
May 27 03:19:11.810335 containerd[1564]: time="2025-05-27T03:19:11.810328440Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
May 27 03:19:11.810515 containerd[1564]: time="2025-05-27T03:19:11.810490985Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
May 27 03:19:11.810542 containerd[1564]: time="2025-05-27T03:19:11.810518507Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
May 27 03:19:11.810542 containerd[1564]: time="2025-05-27T03:19:11.810532453Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
May 27 03:19:11.810579 containerd[1564]: time="2025-05-27T03:19:11.810547311Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
May 27 03:19:11.810579 containerd[1564]: time="2025-05-27T03:19:11.810568401Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
May 27 03:19:11.810615 containerd[1564]: time="2025-05-27T03:19:11.810580032Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
May 27 03:19:11.810615 containerd[1564]: time="2025-05-27T03:19:11.810592355Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
May 27 03:19:11.810615 containerd[1564]: time="2025-05-27T03:19:11.810605981Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
May 27 03:19:11.810682 containerd[1564]: time="2025-05-27T03:19:11.810618094Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
May 27 03:19:11.810682 containerd[1564]: time="2025-05-27T03:19:11.810629215Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
May 27 03:19:11.810682 containerd[1564]: time="2025-05-27T03:19:11.810640215Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
May 27 03:19:11.810761 containerd[1564]: time="2025-05-27T03:19:11.810739021Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
May 27 03:19:11.810761 containerd[1564]: time="2025-05-27T03:19:11.810758968Z" level=info msg="Start snapshots syncer"
May 27 03:19:11.810804 containerd[1564]: time="2025-05-27T03:19:11.810790116Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
May 27 03:19:11.811146 containerd[1564]: time="2025-05-27T03:19:11.811093025Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
May 27 03:19:11.811324 containerd[1564]: time="2025-05-27T03:19:11.811169187Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
May 27 03:19:11.811324 containerd[1564]: time="2025-05-27T03:19:11.811269966Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
May 27 03:19:11.811957 containerd[1564]: time="2025-05-27T03:19:11.811388980Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
May 27 03:19:11.811957 containerd[1564]: time="2025-05-27T03:19:11.811428534Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
May 27 03:19:11.811957 containerd[1564]: time="2025-05-27T03:19:11.811452459Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
May 27 03:19:11.811957 containerd[1564]: time="2025-05-27T03:19:11.811463299Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
May 27 03:19:11.811957 containerd[1564]: time="2025-05-27T03:19:11.811475241Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
May 27 03:19:11.811957 containerd[1564]: time="2025-05-27T03:19:11.811485330Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
May 27 03:19:11.811957 containerd[1564]: time="2025-05-27T03:19:11.811495109Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
May 27 03:19:11.811957 containerd[1564]: time="2025-05-27T03:19:11.811517110Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
May 27 03:19:11.811957 containerd[1564]: time="2025-05-27T03:19:11.811526678Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
May 27 03:19:11.811957 containerd[1564]: time="2025-05-27T03:19:11.811536326Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
May 27 03:19:11.812782 containerd[1564]: time="2025-05-27T03:19:11.812752728Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
May 27 03:19:11.812782 containerd[1564]: time="2025-05-27T03:19:11.812776552Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
May 27 03:19:11.812849 containerd[1564]: time="2025-05-27T03:19:11.812786531Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
May 27 03:19:11.812871 containerd[1564]: time="2025-05-27T03:19:11.812861351Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
May 27 03:19:11.812896 containerd[1564]: time="2025-05-27T03:19:11.812872683Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
May 27 03:19:11.812896 containerd[1564]: time="2025-05-27T03:19:11.812882411Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
May 27 03:19:11.812941 containerd[1564]: time="2025-05-27T03:19:11.812895495Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
May 27 03:19:11.812941 containerd[1564]: time="2025-05-27T03:19:11.812918148Z" level=info msg="runtime interface created"
May 27 03:19:11.812941 containerd[1564]: time="2025-05-27T03:19:11.812923939Z" level=info msg="created NRI interface"
May 27 03:19:11.812941 containerd[1564]: time="2025-05-27T03:19:11.812932595Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
May 27 03:19:11.813012 containerd[1564]: time="2025-05-27T03:19:11.812945690Z" level=info msg="Connect containerd service"
May 27 03:19:11.813012 containerd[1564]: time="2025-05-27T03:19:11.812982388Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
May 27 03:19:11.816661 containerd[1564]: time="2025-05-27T03:19:11.816618930Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
May 27 03:19:11.823173 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
May 27 03:19:11.829714 systemd[1]: Started getty@tty1.service - Getty on tty1.
May 27 03:19:11.833680 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
May 27 03:19:11.834937 systemd[1]: Reached target getty.target - Login Prompts.
May 27 03:19:12.064316 containerd[1564]: time="2025-05-27T03:19:12.064204549Z" level=info msg="Start subscribing containerd event"
May 27 03:19:12.064531 containerd[1564]: time="2025-05-27T03:19:12.064391310Z" level=info msg="Start recovering state"
May 27 03:19:12.064861 containerd[1564]: time="2025-05-27T03:19:12.064784898Z" level=info msg="Start event monitor"
May 27 03:19:12.064861 containerd[1564]: time="2025-05-27T03:19:12.064860149Z" level=info msg="Start cni network conf syncer for default"
May 27 03:19:12.065079 containerd[1564]: time="2025-05-27T03:19:12.064910323Z" level=info msg="Start streaming server"
May 27 03:19:12.065079 containerd[1564]: time="2025-05-27T03:19:12.064995713Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
May 27 03:19:12.065079 containerd[1564]: time="2025-05-27T03:19:12.065019909Z" level=info msg="runtime interface starting up..."
May 27 03:19:12.065079 containerd[1564]: time="2025-05-27T03:19:12.065040618Z" level=info msg="starting plugins..."
May 27 03:19:12.065222 containerd[1564]: time="2025-05-27T03:19:12.065091643Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
May 27 03:19:12.065470 containerd[1564]: time="2025-05-27T03:19:12.065091433Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
May 27 03:19:12.065717 containerd[1564]: time="2025-05-27T03:19:12.065690216Z" level=info msg=serving... address=/run/containerd/containerd.sock
May 27 03:19:12.066039 systemd[1]: Started containerd.service - containerd container runtime.
May 27 03:19:12.067472 containerd[1564]: time="2025-05-27T03:19:12.066879006Z" level=info msg="containerd successfully booted in 0.363060s"
May 27 03:19:12.111857 tar[1562]: linux-amd64/README.md
May 27 03:19:12.140095 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
May 27 03:19:12.396747 systemd-networkd[1475]: eth0: Gained IPv6LL
May 27 03:19:12.400181 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
May 27 03:19:12.402305 systemd[1]: Reached target network-online.target - Network is Online.
May 27 03:19:12.405380 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent...
May 27 03:19:12.407934 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 03:19:12.410199 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
May 27 03:19:12.443208 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
May 27 03:19:12.456332 systemd[1]: coreos-metadata.service: Deactivated successfully.
May 27 03:19:12.456691 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent.
May 27 03:19:12.458371 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
May 27 03:19:13.855834 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 03:19:13.857575 systemd[1]: Reached target multi-user.target - Multi-User System.
May 27 03:19:13.859733 (kubelet)[1670]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 27 03:19:13.859833 systemd[1]: Startup finished in 3.590s (kernel) + 6.202s (initrd) + 6.307s (userspace) = 16.100s.
May 27 03:19:14.519670 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
May 27 03:19:14.521390 systemd[1]: Started sshd@0-10.0.0.98:22-10.0.0.1:59440.service - OpenSSH per-connection server daemon (10.0.0.1:59440).
May 27 03:19:14.568090 kubelet[1670]: E0527 03:19:14.568001 1670 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 27 03:19:14.572295 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 27 03:19:14.572576 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 27 03:19:14.573096 systemd[1]: kubelet.service: Consumed 1.968s CPU time, 265.4M memory peak.
May 27 03:19:14.599390 sshd[1682]: Accepted publickey for core from 10.0.0.1 port 59440 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:19:14.613810 sshd-session[1682]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:19:14.629811 systemd-logind[1549]: New session 1 of user core.
May 27 03:19:14.631467 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
May 27 03:19:14.633087 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
May 27 03:19:14.671874 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
May 27 03:19:14.675471 systemd[1]: Starting user@500.service - User Manager for UID 500...
May 27 03:19:14.693429 (systemd)[1687]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
May 27 03:19:14.696344 systemd-logind[1549]: New session c1 of user core.
May 27 03:19:14.863285 systemd[1687]: Queued start job for default target default.target.
May 27 03:19:14.883508 systemd[1687]: Created slice app.slice - User Application Slice.
May 27 03:19:14.883534 systemd[1687]: Reached target paths.target - Paths.
May 27 03:19:14.883575 systemd[1687]: Reached target timers.target - Timers.
May 27 03:19:14.885208 systemd[1687]: Starting dbus.socket - D-Bus User Message Bus Socket...
May 27 03:19:14.899270 systemd[1687]: Listening on dbus.socket - D-Bus User Message Bus Socket.
May 27 03:19:14.899397 systemd[1687]: Reached target sockets.target - Sockets.
May 27 03:19:14.899456 systemd[1687]: Reached target basic.target - Basic System.
May 27 03:19:14.899500 systemd[1687]: Reached target default.target - Main User Target.
May 27 03:19:14.899540 systemd[1687]: Startup finished in 195ms.
May 27 03:19:14.900269 systemd[1]: Started user@500.service - User Manager for UID 500.
May 27 03:19:14.902234 systemd[1]: Started session-1.scope - Session 1 of User core.
May 27 03:19:14.969160 systemd[1]: Started sshd@1-10.0.0.98:22-10.0.0.1:59442.service - OpenSSH per-connection server daemon (10.0.0.1:59442).
May 27 03:19:15.029676 sshd[1698]: Accepted publickey for core from 10.0.0.1 port 59442 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:19:15.031588 sshd-session[1698]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:19:15.038070 systemd-logind[1549]: New session 2 of user core.
May 27 03:19:15.048599 systemd[1]: Started session-2.scope - Session 2 of User core.
May 27 03:19:15.104567 sshd[1700]: Connection closed by 10.0.0.1 port 59442
May 27 03:19:15.104973 sshd-session[1698]: pam_unix(sshd:session): session closed for user core
May 27 03:19:15.116360 systemd[1]: sshd@1-10.0.0.98:22-10.0.0.1:59442.service: Deactivated successfully.
May 27 03:19:15.118363 systemd[1]: session-2.scope: Deactivated successfully.
May 27 03:19:15.119127 systemd-logind[1549]: Session 2 logged out. Waiting for processes to exit.
May 27 03:19:15.122192 systemd[1]: Started sshd@2-10.0.0.98:22-10.0.0.1:59456.service - OpenSSH per-connection server daemon (10.0.0.1:59456).
May 27 03:19:15.123024 systemd-logind[1549]: Removed session 2.
May 27 03:19:15.180952 sshd[1706]: Accepted publickey for core from 10.0.0.1 port 59456 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:19:15.182587 sshd-session[1706]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:19:15.187157 systemd-logind[1549]: New session 3 of user core.
May 27 03:19:15.195653 systemd[1]: Started session-3.scope - Session 3 of User core.
May 27 03:19:15.246397 sshd[1708]: Connection closed by 10.0.0.1 port 59456
May 27 03:19:15.246824 sshd-session[1706]: pam_unix(sshd:session): session closed for user core
May 27 03:19:15.257355 systemd[1]: sshd@2-10.0.0.98:22-10.0.0.1:59456.service: Deactivated successfully.
May 27 03:19:15.259468 systemd[1]: session-3.scope: Deactivated successfully.
May 27 03:19:15.260326 systemd-logind[1549]: Session 3 logged out. Waiting for processes to exit.
May 27 03:19:15.263620 systemd[1]: Started sshd@3-10.0.0.98:22-10.0.0.1:59466.service - OpenSSH per-connection server daemon (10.0.0.1:59466).
May 27 03:19:15.264245 systemd-logind[1549]: Removed session 3.
May 27 03:19:15.315991 sshd[1714]: Accepted publickey for core from 10.0.0.1 port 59466 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:19:15.317423 sshd-session[1714]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:19:15.321972 systemd-logind[1549]: New session 4 of user core.
May 27 03:19:15.331566 systemd[1]: Started session-4.scope - Session 4 of User core.
May 27 03:19:15.386899 sshd[1716]: Connection closed by 10.0.0.1 port 59466
May 27 03:19:15.387690 sshd-session[1714]: pam_unix(sshd:session): session closed for user core
May 27 03:19:15.401142 systemd[1]: sshd@3-10.0.0.98:22-10.0.0.1:59466.service: Deactivated successfully.
May 27 03:19:15.402985 systemd[1]: session-4.scope: Deactivated successfully.
May 27 03:19:15.403796 systemd-logind[1549]: Session 4 logged out. Waiting for processes to exit.
May 27 03:19:15.406883 systemd[1]: Started sshd@4-10.0.0.98:22-10.0.0.1:59474.service - OpenSSH per-connection server daemon (10.0.0.1:59474).
May 27 03:19:15.407558 systemd-logind[1549]: Removed session 4.
May 27 03:19:15.463296 sshd[1722]: Accepted publickey for core from 10.0.0.1 port 59474 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:19:15.464758 sshd-session[1722]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:19:15.469148 systemd-logind[1549]: New session 5 of user core.
May 27 03:19:15.476643 systemd[1]: Started session-5.scope - Session 5 of User core.
May 27 03:19:15.536801 sudo[1725]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
May 27 03:19:15.537138 sudo[1725]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 27 03:19:15.563287 sudo[1725]: pam_unix(sudo:session): session closed for user root
May 27 03:19:15.565166 sshd[1724]: Connection closed by 10.0.0.1 port 59474
May 27 03:19:15.565658 sshd-session[1722]: pam_unix(sshd:session): session closed for user core
May 27 03:19:15.591144 systemd[1]: sshd@4-10.0.0.98:22-10.0.0.1:59474.service: Deactivated successfully.
May 27 03:19:15.593013 systemd[1]: session-5.scope: Deactivated successfully.
May 27 03:19:15.593766 systemd-logind[1549]: Session 5 logged out. Waiting for processes to exit.
May 27 03:19:15.596864 systemd[1]: Started sshd@5-10.0.0.98:22-10.0.0.1:59476.service - OpenSSH per-connection server daemon (10.0.0.1:59476).
May 27 03:19:15.597739 systemd-logind[1549]: Removed session 5.
May 27 03:19:15.656367 sshd[1731]: Accepted publickey for core from 10.0.0.1 port 59476 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:19:15.657833 sshd-session[1731]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:19:15.662507 systemd-logind[1549]: New session 6 of user core.
May 27 03:19:15.669595 systemd[1]: Started session-6.scope - Session 6 of User core.
May 27 03:19:15.723417 sudo[1735]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
May 27 03:19:15.723753 sudo[1735]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 27 03:19:15.730931 sudo[1735]: pam_unix(sudo:session): session closed for user root
May 27 03:19:15.737695 sudo[1734]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
May 27 03:19:15.738000 sudo[1734]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 27 03:19:15.748230 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 27 03:19:15.801465 augenrules[1757]: No rules
May 27 03:19:15.803389 systemd[1]: audit-rules.service: Deactivated successfully.
May 27 03:19:15.803713 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 27 03:19:15.804988 sudo[1734]: pam_unix(sudo:session): session closed for user root
May 27 03:19:15.806721 sshd[1733]: Connection closed by 10.0.0.1 port 59476
May 27 03:19:15.807118 sshd-session[1731]: pam_unix(sshd:session): session closed for user core
May 27 03:19:15.817196 systemd[1]: sshd@5-10.0.0.98:22-10.0.0.1:59476.service: Deactivated successfully.
May 27 03:19:15.819470 systemd[1]: session-6.scope: Deactivated successfully.
May 27 03:19:15.820279 systemd-logind[1549]: Session 6 logged out. Waiting for processes to exit.
May 27 03:19:15.824233 systemd[1]: Started sshd@6-10.0.0.98:22-10.0.0.1:59484.service - OpenSSH per-connection server daemon (10.0.0.1:59484).
May 27 03:19:15.825105 systemd-logind[1549]: Removed session 6.
May 27 03:19:15.887102 sshd[1766]: Accepted publickey for core from 10.0.0.1 port 59484 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:19:15.888739 sshd-session[1766]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:19:15.893945 systemd-logind[1549]: New session 7 of user core.
May 27 03:19:15.910679 systemd[1]: Started session-7.scope - Session 7 of User core.
May 27 03:19:15.965164 sudo[1769]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
May 27 03:19:15.965506 sudo[1769]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 27 03:19:16.453922 systemd[1]: Starting docker.service - Docker Application Container Engine...
May 27 03:19:16.475822 (dockerd)[1789]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
May 27 03:19:16.917789 dockerd[1789]: time="2025-05-27T03:19:16.917621863Z" level=info msg="Starting up"
May 27 03:19:16.918518 dockerd[1789]: time="2025-05-27T03:19:16.918481005Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
May 27 03:19:17.472157 dockerd[1789]: time="2025-05-27T03:19:17.472072524Z" level=info msg="Loading containers: start."
May 27 03:19:17.483471 kernel: Initializing XFRM netlink socket
May 27 03:19:17.746219 systemd-networkd[1475]: docker0: Link UP
May 27 03:19:17.752092 dockerd[1789]: time="2025-05-27T03:19:17.752050022Z" level=info msg="Loading containers: done."
May 27 03:19:17.774452 dockerd[1789]: time="2025-05-27T03:19:17.774387751Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
May 27 03:19:17.774601 dockerd[1789]: time="2025-05-27T03:19:17.774505642Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1
May 27 03:19:17.774634 dockerd[1789]: time="2025-05-27T03:19:17.774620558Z" level=info msg="Initializing buildkit"
May 27 03:19:17.810248 dockerd[1789]: time="2025-05-27T03:19:17.810190176Z" level=info msg="Completed buildkit initialization"
May 27 03:19:17.816821 dockerd[1789]: time="2025-05-27T03:19:17.816752906Z" level=info msg="Daemon has completed initialization"
May 27 03:19:17.816959 dockerd[1789]: time="2025-05-27T03:19:17.816860398Z" level=info msg="API listen on /run/docker.sock"
May 27 03:19:17.817053 systemd[1]: Started docker.service - Docker Application Container Engine.
May 27 03:19:18.663565 containerd[1564]: time="2025-05-27T03:19:18.663513336Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.5\""
May 27 03:19:19.333028 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4141593279.mount: Deactivated successfully.
May 27 03:19:20.525184 containerd[1564]: time="2025-05-27T03:19:20.525110761Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:19:20.525902 containerd[1564]: time="2025-05-27T03:19:20.525867731Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.5: active requests=0, bytes read=28797811"
May 27 03:19:20.527124 containerd[1564]: time="2025-05-27T03:19:20.527088320Z" level=info msg="ImageCreate event name:\"sha256:495c5ce47cf7c8b58655ef50d0f0a9b43c5ae18492059dc9af4c9aacae82a5a4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:19:20.529468 containerd[1564]: time="2025-05-27T03:19:20.529432317Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0bee1bf751fe06009678c0cde7545443ba3a8d2edf71cea4c69cbb5774b9bf47\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:19:20.530276 containerd[1564]: time="2025-05-27T03:19:20.530215236Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.5\" with image id \"sha256:495c5ce47cf7c8b58655ef50d0f0a9b43c5ae18492059dc9af4c9aacae82a5a4\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0bee1bf751fe06009678c0cde7545443ba3a8d2edf71cea4c69cbb5774b9bf47\", size \"28794611\" in 1.866652506s"
May 27 03:19:20.530276 containerd[1564]: time="2025-05-27T03:19:20.530268365Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.5\" returns image reference \"sha256:495c5ce47cf7c8b58655ef50d0f0a9b43c5ae18492059dc9af4c9aacae82a5a4\""
May 27 03:19:20.531338 containerd[1564]: time="2025-05-27T03:19:20.531297335Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.5\""
May 27 03:19:21.932856 containerd[1564]: time="2025-05-27T03:19:21.932787240Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:19:21.934181 containerd[1564]: time="2025-05-27T03:19:21.934152431Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.5: active requests=0, bytes read=24782523"
May 27 03:19:21.935655 containerd[1564]: time="2025-05-27T03:19:21.935584276Z" level=info msg="ImageCreate event name:\"sha256:85dcaf69f000132c34fa34452e0fd8444bdf360b593fe06b1103680f6ecc7e00\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:19:21.938885 containerd[1564]: time="2025-05-27T03:19:21.938814946Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:79bcf2f5e614c336c02dcea9dfcdf485d7297aed6a21239a99c87f7164f9baca\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:19:21.939684 containerd[1564]: time="2025-05-27T03:19:21.939633171Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.5\" with image id \"sha256:85dcaf69f000132c34fa34452e0fd8444bdf360b593fe06b1103680f6ecc7e00\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:79bcf2f5e614c336c02dcea9dfcdf485d7297aed6a21239a99c87f7164f9baca\", size \"26384363\" in 1.408303835s"
May 27 03:19:21.939684 containerd[1564]: time="2025-05-27T03:19:21.939680109Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.5\" returns image reference \"sha256:85dcaf69f000132c34fa34452e0fd8444bdf360b593fe06b1103680f6ecc7e00\""
May 27 03:19:21.940368 containerd[1564]: time="2025-05-27T03:19:21.940338273Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.5\""
May 27 03:19:23.837493 containerd[1564]: time="2025-05-27T03:19:23.837410178Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:19:23.838559 containerd[1564]: time="2025-05-27T03:19:23.838535098Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.5: active requests=0, bytes read=19176063"
May 27 03:19:23.840116 containerd[1564]: time="2025-05-27T03:19:23.840061882Z" level=info msg="ImageCreate event name:\"sha256:2729fb488407e634105c62238a45a599db1692680526e20844060a7a8197b45a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:19:23.842649 containerd[1564]: time="2025-05-27T03:19:23.842597548Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f0f39d8b9808c407cacb3a46a5a9ce4d4a4a7cf3b674ba4bd221f5bc90051d2a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:19:23.843491 containerd[1564]: time="2025-05-27T03:19:23.843448985Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.5\" with image id \"sha256:2729fb488407e634105c62238a45a599db1692680526e20844060a7a8197b45a\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f0f39d8b9808c407cacb3a46a5a9ce4d4a4a7cf3b674ba4bd221f5bc90051d2a\", size \"20777921\" in 1.903064175s"
May 27 03:19:23.843491 containerd[1564]: time="2025-05-27T03:19:23.843484832Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.5\" returns image reference \"sha256:2729fb488407e634105c62238a45a599db1692680526e20844060a7a8197b45a\""
May 27 03:19:23.844010 containerd[1564]: time="2025-05-27T03:19:23.843984920Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.5\""
May 27 03:19:24.823332 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
May 27 03:19:24.825573 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 03:19:25.205837 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 03:19:25.227802 (kubelet)[2072]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 03:19:25.356235 kubelet[2072]: E0527 03:19:25.356145 2072 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 03:19:25.364386 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 03:19:25.364696 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 03:19:25.365153 systemd[1]: kubelet.service: Consumed 360ms CPU time, 111.2M memory peak. May 27 03:19:25.660770 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount185703222.mount: Deactivated successfully. May 27 03:19:26.481551 containerd[1564]: time="2025-05-27T03:19:26.481482973Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:19:26.482232 containerd[1564]: time="2025-05-27T03:19:26.482185961Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.5: active requests=0, bytes read=30892872" May 27 03:19:26.483246 containerd[1564]: time="2025-05-27T03:19:26.483210443Z" level=info msg="ImageCreate event name:\"sha256:f532b7356fac4d7c4e4f6763bb5a15a43e3bb740c9fb26c85b906a4d971f2363\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:19:26.485120 containerd[1564]: time="2025-05-27T03:19:26.485087033Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:9dc6553459c3319525ba4090a780db1a133d5dee68c08e07f9b9d6ba83b42a0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:19:26.485600 containerd[1564]: time="2025-05-27T03:19:26.485569979Z" level=info 
msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.5\" with image id \"sha256:f532b7356fac4d7c4e4f6763bb5a15a43e3bb740c9fb26c85b906a4d971f2363\", repo tag \"registry.k8s.io/kube-proxy:v1.32.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:9dc6553459c3319525ba4090a780db1a133d5dee68c08e07f9b9d6ba83b42a0b\", size \"30891891\" in 2.641560122s" May 27 03:19:26.485629 containerd[1564]: time="2025-05-27T03:19:26.485600366Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.5\" returns image reference \"sha256:f532b7356fac4d7c4e4f6763bb5a15a43e3bb740c9fb26c85b906a4d971f2363\"" May 27 03:19:26.486249 containerd[1564]: time="2025-05-27T03:19:26.486203387Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" May 27 03:19:26.994797 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3223432558.mount: Deactivated successfully. May 27 03:19:28.234454 containerd[1564]: time="2025-05-27T03:19:28.234367623Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:19:28.235114 containerd[1564]: time="2025-05-27T03:19:28.235088646Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" May 27 03:19:28.236219 containerd[1564]: time="2025-05-27T03:19:28.236185734Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:19:28.239365 containerd[1564]: time="2025-05-27T03:19:28.239321355Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:19:28.240378 containerd[1564]: time="2025-05-27T03:19:28.240340286Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id 
\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.754093689s" May 27 03:19:28.240467 containerd[1564]: time="2025-05-27T03:19:28.240377606Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" May 27 03:19:28.240967 containerd[1564]: time="2025-05-27T03:19:28.240938519Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 27 03:19:28.705561 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2645257029.mount: Deactivated successfully. May 27 03:19:28.711726 containerd[1564]: time="2025-05-27T03:19:28.711676488Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 03:19:28.712470 containerd[1564]: time="2025-05-27T03:19:28.712421315Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" May 27 03:19:28.713503 containerd[1564]: time="2025-05-27T03:19:28.713465294Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 03:19:28.716730 containerd[1564]: time="2025-05-27T03:19:28.716692818Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 475.724292ms" May 27 03:19:28.716730 
containerd[1564]: time="2025-05-27T03:19:28.716728505Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" May 27 03:19:28.717120 containerd[1564]: time="2025-05-27T03:19:28.717075876Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 03:19:28.717224 containerd[1564]: time="2025-05-27T03:19:28.717172357Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" May 27 03:19:29.266111 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2761713740.mount: Deactivated successfully. May 27 03:19:31.913760 containerd[1564]: time="2025-05-27T03:19:31.913672087Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:19:31.914462 containerd[1564]: time="2025-05-27T03:19:31.914344188Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551360" May 27 03:19:31.915626 containerd[1564]: time="2025-05-27T03:19:31.915599562Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:19:31.918322 containerd[1564]: time="2025-05-27T03:19:31.918283386Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:19:31.919265 containerd[1564]: time="2025-05-27T03:19:31.919221636Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag 
\"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 3.202020715s" May 27 03:19:31.919265 containerd[1564]: time="2025-05-27T03:19:31.919260309Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" May 27 03:19:35.131611 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 03:19:35.131856 systemd[1]: kubelet.service: Consumed 360ms CPU time, 111.2M memory peak. May 27 03:19:35.134294 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 03:19:35.161964 systemd[1]: Reload requested from client PID 2229 ('systemctl') (unit session-7.scope)... May 27 03:19:35.161981 systemd[1]: Reloading... May 27 03:19:35.258463 zram_generator::config[2274]: No configuration found. May 27 03:19:35.510685 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 03:19:35.631011 systemd[1]: Reloading finished in 468 ms. May 27 03:19:35.699179 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 27 03:19:35.699281 systemd[1]: kubelet.service: Failed with result 'signal'. May 27 03:19:35.699610 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 03:19:35.699653 systemd[1]: kubelet.service: Consumed 152ms CPU time, 98.3M memory peak. May 27 03:19:35.701317 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 03:19:35.880970 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 27 03:19:35.896897 (kubelet)[2319]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 03:19:35.942762 kubelet[2319]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 03:19:35.942762 kubelet[2319]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 27 03:19:35.942762 kubelet[2319]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 03:19:35.943202 kubelet[2319]: I0527 03:19:35.942858 2319 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 03:19:36.127548 kubelet[2319]: I0527 03:19:36.127502 2319 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" May 27 03:19:36.128499 kubelet[2319]: I0527 03:19:36.127693 2319 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 03:19:36.128499 kubelet[2319]: I0527 03:19:36.128348 2319 server.go:954] "Client rotation is on, will bootstrap in background" May 27 03:19:36.159552 kubelet[2319]: E0527 03:19:36.159397 2319 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.98:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.98:6443: connect: connection refused" logger="UnhandledError" May 27 03:19:36.163315 kubelet[2319]: I0527 03:19:36.163256 
2319 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 03:19:36.169548 kubelet[2319]: I0527 03:19:36.169515 2319 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 03:19:36.175281 kubelet[2319]: I0527 03:19:36.175240 2319 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 27 03:19:36.176488 kubelet[2319]: I0527 03:19:36.176427 2319 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 03:19:36.176670 kubelet[2319]: I0527 03:19:36.176482 2319 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CP
UManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 03:19:36.176799 kubelet[2319]: I0527 03:19:36.176682 2319 topology_manager.go:138] "Creating topology manager with none policy" May 27 03:19:36.176799 kubelet[2319]: I0527 03:19:36.176692 2319 container_manager_linux.go:304] "Creating device plugin manager" May 27 03:19:36.176909 kubelet[2319]: I0527 03:19:36.176883 2319 state_mem.go:36] "Initialized new in-memory state store" May 27 03:19:36.179771 kubelet[2319]: I0527 03:19:36.179734 2319 kubelet.go:446] "Attempting to sync node with API server" May 27 03:19:36.179771 kubelet[2319]: I0527 03:19:36.179768 2319 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 03:19:36.179853 kubelet[2319]: I0527 03:19:36.179796 2319 kubelet.go:352] "Adding apiserver pod source" May 27 03:19:36.179853 kubelet[2319]: I0527 03:19:36.179817 2319 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 03:19:36.185827 kubelet[2319]: W0527 03:19:36.185736 2319 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.98:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.98:6443: connect: connection refused May 27 03:19:36.185962 kubelet[2319]: E0527 03:19:36.185939 2319 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.98:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.98:6443: connect: connection refused" logger="UnhandledError" May 27 03:19:36.186043 kubelet[2319]: W0527 03:19:36.185799 2319 reflector.go:569] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.98:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.98:6443: connect: connection refused May 27 03:19:36.186186 kubelet[2319]: E0527 03:19:36.186162 2319 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.98:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.98:6443: connect: connection refused" logger="UnhandledError" May 27 03:19:36.187771 kubelet[2319]: I0527 03:19:36.187715 2319 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 03:19:36.188414 kubelet[2319]: I0527 03:19:36.188372 2319 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 27 03:19:36.189122 kubelet[2319]: W0527 03:19:36.189095 2319 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
May 27 03:19:36.191365 kubelet[2319]: I0527 03:19:36.191327 2319 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 27 03:19:36.191419 kubelet[2319]: I0527 03:19:36.191373 2319 server.go:1287] "Started kubelet" May 27 03:19:36.192562 kubelet[2319]: I0527 03:19:36.191967 2319 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 03:19:36.192562 kubelet[2319]: I0527 03:19:36.192400 2319 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 03:19:36.192562 kubelet[2319]: I0527 03:19:36.192493 2319 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 27 03:19:36.193152 kubelet[2319]: I0527 03:19:36.193109 2319 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 03:19:36.193673 kubelet[2319]: I0527 03:19:36.193649 2319 server.go:479] "Adding debug handlers to kubelet server" May 27 03:19:36.193798 kubelet[2319]: I0527 03:19:36.193771 2319 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 03:19:36.197063 kubelet[2319]: E0527 03:19:36.196838 2319 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" May 27 03:19:36.197063 kubelet[2319]: I0527 03:19:36.196882 2319 volume_manager.go:297] "Starting Kubelet Volume Manager" May 27 03:19:36.197063 kubelet[2319]: I0527 03:19:36.197052 2319 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 27 03:19:36.197176 kubelet[2319]: I0527 03:19:36.197113 2319 reconciler.go:26] "Reconciler: start to sync state" May 27 03:19:36.197471 kubelet[2319]: W0527 03:19:36.197414 2319 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.98:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.98:6443: 
connect: connection refused May 27 03:19:36.197522 kubelet[2319]: E0527 03:19:36.197478 2319 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.98:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.98:6443: connect: connection refused" logger="UnhandledError" May 27 03:19:36.198424 kubelet[2319]: E0527 03:19:36.197957 2319 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 27 03:19:36.198424 kubelet[2319]: I0527 03:19:36.198110 2319 factory.go:221] Registration of the systemd container factory successfully May 27 03:19:36.198424 kubelet[2319]: I0527 03:19:36.198175 2319 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 03:19:36.198819 kubelet[2319]: E0527 03:19:36.198767 2319 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.98:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.98:6443: connect: connection refused" interval="200ms" May 27 03:19:36.199402 kubelet[2319]: I0527 03:19:36.199371 2319 factory.go:221] Registration of the containerd container factory successfully May 27 03:19:36.200282 kubelet[2319]: E0527 03:19:36.199191 2319 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.98:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.98:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1843441e0c068818 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-05-27 03:19:36.191346712 +0000 UTC m=+0.287526248,LastTimestamp:2025-05-27 03:19:36.191346712 +0000 UTC m=+0.287526248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" May 27 03:19:36.213602 kubelet[2319]: I0527 03:19:36.213397 2319 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 27 03:19:36.214787 kubelet[2319]: I0527 03:19:36.214763 2319 cpu_manager.go:221] "Starting CPU manager" policy="none" May 27 03:19:36.215028 kubelet[2319]: I0527 03:19:36.214974 2319 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 27 03:19:36.215028 kubelet[2319]: I0527 03:19:36.215003 2319 state_mem.go:36] "Initialized new in-memory state store" May 27 03:19:36.215209 kubelet[2319]: I0527 03:19:36.215119 2319 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 27 03:19:36.215209 kubelet[2319]: I0527 03:19:36.215179 2319 status_manager.go:227] "Starting to sync pod status with apiserver" May 27 03:19:36.215404 kubelet[2319]: I0527 03:19:36.215384 2319 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
May 27 03:19:36.215404 kubelet[2319]: I0527 03:19:36.215403 2319 kubelet.go:2382] "Starting kubelet main sync loop" May 27 03:19:36.215531 kubelet[2319]: E0527 03:19:36.215513 2319 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 03:19:36.216284 kubelet[2319]: W0527 03:19:36.216240 2319 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.98:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.98:6443: connect: connection refused May 27 03:19:36.216350 kubelet[2319]: E0527 03:19:36.216303 2319 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.98:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.98:6443: connect: connection refused" logger="UnhandledError" May 27 03:19:36.296965 kubelet[2319]: E0527 03:19:36.296921 2319 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" May 27 03:19:36.316341 kubelet[2319]: E0527 03:19:36.316295 2319 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 27 03:19:36.397635 kubelet[2319]: E0527 03:19:36.397616 2319 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" May 27 03:19:36.400225 kubelet[2319]: E0527 03:19:36.400177 2319 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.98:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.98:6443: connect: connection refused" interval="400ms" May 27 03:19:36.498142 kubelet[2319]: E0527 03:19:36.498026 2319 kubelet_node_status.go:466] "Error getting the 
current node from lister" err="node \"localhost\" not found" May 27 03:19:36.517419 kubelet[2319]: E0527 03:19:36.517377 2319 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 27 03:19:36.598815 kubelet[2319]: E0527 03:19:36.598759 2319 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" May 27 03:19:36.611736 kubelet[2319]: I0527 03:19:36.611712 2319 policy_none.go:49] "None policy: Start" May 27 03:19:36.611807 kubelet[2319]: I0527 03:19:36.611751 2319 memory_manager.go:186] "Starting memorymanager" policy="None" May 27 03:19:36.611807 kubelet[2319]: I0527 03:19:36.611772 2319 state_mem.go:35] "Initializing new in-memory state store" May 27 03:19:36.618996 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 27 03:19:36.631592 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 27 03:19:36.634592 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 27 03:19:36.651414 kubelet[2319]: I0527 03:19:36.651375 2319 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 27 03:19:36.651791 kubelet[2319]: I0527 03:19:36.651666 2319 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 03:19:36.651791 kubelet[2319]: I0527 03:19:36.651693 2319 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 03:19:36.652022 kubelet[2319]: I0527 03:19:36.651992 2319 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 03:19:36.653020 kubelet[2319]: E0527 03:19:36.652999 2319 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" May 27 03:19:36.653074 kubelet[2319]: E0527 03:19:36.653055 2319 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" May 27 03:19:36.754051 kubelet[2319]: I0527 03:19:36.753924 2319 kubelet_node_status.go:75] "Attempting to register node" node="localhost" May 27 03:19:36.754341 kubelet[2319]: E0527 03:19:36.754298 2319 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.98:6443/api/v1/nodes\": dial tcp 10.0.0.98:6443: connect: connection refused" node="localhost" May 27 03:19:36.801142 kubelet[2319]: E0527 03:19:36.801094 2319 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.98:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.98:6443: connect: connection refused" interval="800ms" May 27 03:19:36.926614 systemd[1]: Created slice kubepods-burstable-podfb923a5bef11701fd32845e830b20b4d.slice - libcontainer container kubepods-burstable-podfb923a5bef11701fd32845e830b20b4d.slice. May 27 03:19:36.941267 kubelet[2319]: E0527 03:19:36.941243 2319 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" May 27 03:19:36.943769 systemd[1]: Created slice kubepods-burstable-pod7c751acbcd1525da2f1a64e395f86bdd.slice - libcontainer container kubepods-burstable-pod7c751acbcd1525da2f1a64e395f86bdd.slice. 
May 27 03:19:36.955317 kubelet[2319]: I0527 03:19:36.955276 2319 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
May 27 03:19:36.955773 kubelet[2319]: E0527 03:19:36.955740 2319 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.98:6443/api/v1/nodes\": dial tcp 10.0.0.98:6443: connect: connection refused" node="localhost"
May 27 03:19:36.958696 kubelet[2319]: E0527 03:19:36.958678 2319 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 27 03:19:36.961499 systemd[1]: Created slice kubepods-burstable-pod447e79232307504a6964f3be51e3d64d.slice - libcontainer container kubepods-burstable-pod447e79232307504a6964f3be51e3d64d.slice.
May 27 03:19:36.963210 kubelet[2319]: E0527 03:19:36.963180 2319 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 27 03:19:37.002811 kubelet[2319]: I0527 03:19:37.002783 2319 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fb923a5bef11701fd32845e830b20b4d-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"fb923a5bef11701fd32845e830b20b4d\") " pod="kube-system/kube-apiserver-localhost"
May 27 03:19:37.002874 kubelet[2319]: I0527 03:19:37.002813 2319 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7c751acbcd1525da2f1a64e395f86bdd-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"7c751acbcd1525da2f1a64e395f86bdd\") " pod="kube-system/kube-controller-manager-localhost"
May 27 03:19:37.002874 kubelet[2319]: I0527 03:19:37.002836 2319 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7c751acbcd1525da2f1a64e395f86bdd-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"7c751acbcd1525da2f1a64e395f86bdd\") " pod="kube-system/kube-controller-manager-localhost"
May 27 03:19:37.002874 kubelet[2319]: I0527 03:19:37.002852 2319 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7c751acbcd1525da2f1a64e395f86bdd-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"7c751acbcd1525da2f1a64e395f86bdd\") " pod="kube-system/kube-controller-manager-localhost"
May 27 03:19:37.002874 kubelet[2319]: I0527 03:19:37.002868 2319 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fb923a5bef11701fd32845e830b20b4d-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"fb923a5bef11701fd32845e830b20b4d\") " pod="kube-system/kube-apiserver-localhost"
May 27 03:19:37.003056 kubelet[2319]: I0527 03:19:37.002908 2319 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fb923a5bef11701fd32845e830b20b4d-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"fb923a5bef11701fd32845e830b20b4d\") " pod="kube-system/kube-apiserver-localhost"
May 27 03:19:37.003056 kubelet[2319]: I0527 03:19:37.002933 2319 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7c751acbcd1525da2f1a64e395f86bdd-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"7c751acbcd1525da2f1a64e395f86bdd\") " pod="kube-system/kube-controller-manager-localhost"
May 27 03:19:37.003056 kubelet[2319]: I0527 03:19:37.002958 2319 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7c751acbcd1525da2f1a64e395f86bdd-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"7c751acbcd1525da2f1a64e395f86bdd\") " pod="kube-system/kube-controller-manager-localhost"
May 27 03:19:37.003056 kubelet[2319]: I0527 03:19:37.002976 2319 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/447e79232307504a6964f3be51e3d64d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"447e79232307504a6964f3be51e3d64d\") " pod="kube-system/kube-scheduler-localhost"
May 27 03:19:37.204118 kubelet[2319]: W0527 03:19:37.203990 2319 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.98:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.98:6443: connect: connection refused
May 27 03:19:37.204118 kubelet[2319]: E0527 03:19:37.204049 2319 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.98:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.98:6443: connect: connection refused" logger="UnhandledError"
May 27 03:19:37.242941 containerd[1564]: time="2025-05-27T03:19:37.242900747Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:fb923a5bef11701fd32845e830b20b4d,Namespace:kube-system,Attempt:0,}"
May 27 03:19:37.259284 containerd[1564]: time="2025-05-27T03:19:37.259262236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:7c751acbcd1525da2f1a64e395f86bdd,Namespace:kube-system,Attempt:0,}"
May 27 03:19:37.263753 containerd[1564]: time="2025-05-27T03:19:37.263726760Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:447e79232307504a6964f3be51e3d64d,Namespace:kube-system,Attempt:0,}"
May 27 03:19:37.357314 kubelet[2319]: I0527 03:19:37.357266 2319 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
May 27 03:19:37.357640 kubelet[2319]: E0527 03:19:37.357607 2319 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.98:6443/api/v1/nodes\": dial tcp 10.0.0.98:6443: connect: connection refused" node="localhost"
May 27 03:19:37.618881 kubelet[2319]: E0527 03:19:37.618713 2319 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.98:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.98:6443: connect: connection refused" interval="1.6s"
May 27 03:19:37.619177 kubelet[2319]: W0527 03:19:37.619079 2319 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.98:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.98:6443: connect: connection refused
May 27 03:19:37.619278 kubelet[2319]: E0527 03:19:37.619238 2319 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.98:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.98:6443: connect: connection refused" logger="UnhandledError"
May 27 03:19:37.628926 containerd[1564]: time="2025-05-27T03:19:37.628852632Z" level=info msg="connecting to shim 8d1a31899ddfdfd13ac63b9ab79d136a400fac788cc0783e3e26f7fb0b1cab4e" address="unix:///run/containerd/s/8dc7486290786f05144004d574741b4efb81d1faf6173dc77fa72dfc50fc544f" namespace=k8s.io protocol=ttrpc version=3
May 27 03:19:37.629732 containerd[1564]: time="2025-05-27T03:19:37.629688871Z" level=info msg="connecting to shim 6026a346c2251385f4fa5b38cd2a9d4f83ff7353bd3905635d14147633a1db34" address="unix:///run/containerd/s/1b8600ffa0a21924502213ef56dbba44dce6ef4cf2965ee7107286f2897818ca" namespace=k8s.io protocol=ttrpc version=3
May 27 03:19:37.634958 containerd[1564]: time="2025-05-27T03:19:37.634737591Z" level=info msg="connecting to shim 102995f3bf406f3aeebfd021138906991bbc3306783619e7808e1dfb5f687989" address="unix:///run/containerd/s/580cfa4a30c519540ba36e94a39e1544b9accbe07c4fa64a8118b695b835048d" namespace=k8s.io protocol=ttrpc version=3
May 27 03:19:37.699116 kubelet[2319]: W0527 03:19:37.699000 2319 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.98:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.98:6443: connect: connection refused
May 27 03:19:37.699116 kubelet[2319]: E0527 03:19:37.699114 2319 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.98:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.98:6443: connect: connection refused" logger="UnhandledError"
May 27 03:19:37.752731 systemd[1]: Started cri-containerd-6026a346c2251385f4fa5b38cd2a9d4f83ff7353bd3905635d14147633a1db34.scope - libcontainer container 6026a346c2251385f4fa5b38cd2a9d4f83ff7353bd3905635d14147633a1db34.
May 27 03:19:37.758518 systemd[1]: Started cri-containerd-102995f3bf406f3aeebfd021138906991bbc3306783619e7808e1dfb5f687989.scope - libcontainer container 102995f3bf406f3aeebfd021138906991bbc3306783619e7808e1dfb5f687989.
May 27 03:19:37.760140 systemd[1]: Started cri-containerd-8d1a31899ddfdfd13ac63b9ab79d136a400fac788cc0783e3e26f7fb0b1cab4e.scope - libcontainer container 8d1a31899ddfdfd13ac63b9ab79d136a400fac788cc0783e3e26f7fb0b1cab4e.
May 27 03:19:37.785497 kubelet[2319]: W0527 03:19:37.785334 2319 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.98:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.98:6443: connect: connection refused
May 27 03:19:37.785626 kubelet[2319]: E0527 03:19:37.785564 2319 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.98:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.98:6443: connect: connection refused" logger="UnhandledError"
May 27 03:19:37.849558 containerd[1564]: time="2025-05-27T03:19:37.849503260Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:fb923a5bef11701fd32845e830b20b4d,Namespace:kube-system,Attempt:0,} returns sandbox id \"8d1a31899ddfdfd13ac63b9ab79d136a400fac788cc0783e3e26f7fb0b1cab4e\""
May 27 03:19:37.852249 containerd[1564]: time="2025-05-27T03:19:37.852220366Z" level=info msg="CreateContainer within sandbox \"8d1a31899ddfdfd13ac63b9ab79d136a400fac788cc0783e3e26f7fb0b1cab4e\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
May 27 03:19:37.894517 containerd[1564]: time="2025-05-27T03:19:37.894396887Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:447e79232307504a6964f3be51e3d64d,Namespace:kube-system,Attempt:0,} returns sandbox id \"6026a346c2251385f4fa5b38cd2a9d4f83ff7353bd3905635d14147633a1db34\""
May 27 03:19:37.896244 containerd[1564]: time="2025-05-27T03:19:37.896214657Z" level=info msg="CreateContainer within sandbox \"6026a346c2251385f4fa5b38cd2a9d4f83ff7353bd3905635d14147633a1db34\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
May 27 03:19:37.932174 containerd[1564]: time="2025-05-27T03:19:37.932125946Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:7c751acbcd1525da2f1a64e395f86bdd,Namespace:kube-system,Attempt:0,} returns sandbox id \"102995f3bf406f3aeebfd021138906991bbc3306783619e7808e1dfb5f687989\""
May 27 03:19:37.934045 containerd[1564]: time="2025-05-27T03:19:37.934014909Z" level=info msg="CreateContainer within sandbox \"102995f3bf406f3aeebfd021138906991bbc3306783619e7808e1dfb5f687989\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
May 27 03:19:38.067868 containerd[1564]: time="2025-05-27T03:19:38.067818677Z" level=info msg="Container aa94b2b037724586b217c58fb6f04acb1c87e93dc95153564fcdb985c2860618: CDI devices from CRI Config.CDIDevices: []"
May 27 03:19:38.089551 containerd[1564]: time="2025-05-27T03:19:38.089519471Z" level=info msg="Container 2a5895bcc8444dd6ba3805c73ea262939ad530d09b40c068eefd1e9d7d18086e: CDI devices from CRI Config.CDIDevices: []"
May 27 03:19:38.121968 containerd[1564]: time="2025-05-27T03:19:38.121939853Z" level=info msg="Container eec8832864994fc4e72a6c0ca9ed404bda01593bb861e203a378eea0e8ee4b63: CDI devices from CRI Config.CDIDevices: []"
May 27 03:19:38.159399 kubelet[2319]: I0527 03:19:38.159316 2319 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
May 27 03:19:38.159819 kubelet[2319]: E0527 03:19:38.159737 2319 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.98:6443/api/v1/nodes\": dial tcp 10.0.0.98:6443: connect: connection refused" node="localhost"
May 27 03:19:38.185641 containerd[1564]: time="2025-05-27T03:19:38.185605862Z" level=info msg="CreateContainer within sandbox \"8d1a31899ddfdfd13ac63b9ab79d136a400fac788cc0783e3e26f7fb0b1cab4e\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"aa94b2b037724586b217c58fb6f04acb1c87e93dc95153564fcdb985c2860618\""
May 27 03:19:38.186157 containerd[1564]: time="2025-05-27T03:19:38.186133311Z" level=info msg="StartContainer for \"aa94b2b037724586b217c58fb6f04acb1c87e93dc95153564fcdb985c2860618\""
May 27 03:19:38.187218 containerd[1564]: time="2025-05-27T03:19:38.187188501Z" level=info msg="connecting to shim aa94b2b037724586b217c58fb6f04acb1c87e93dc95153564fcdb985c2860618" address="unix:///run/containerd/s/8dc7486290786f05144004d574741b4efb81d1faf6173dc77fa72dfc50fc544f" protocol=ttrpc version=3
May 27 03:19:38.207569 systemd[1]: Started cri-containerd-aa94b2b037724586b217c58fb6f04acb1c87e93dc95153564fcdb985c2860618.scope - libcontainer container aa94b2b037724586b217c58fb6f04acb1c87e93dc95153564fcdb985c2860618.
May 27 03:19:38.252329 containerd[1564]: time="2025-05-27T03:19:38.252206796Z" level=info msg="CreateContainer within sandbox \"102995f3bf406f3aeebfd021138906991bbc3306783619e7808e1dfb5f687989\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"eec8832864994fc4e72a6c0ca9ed404bda01593bb861e203a378eea0e8ee4b63\""
May 27 03:19:38.253233 containerd[1564]: time="2025-05-27T03:19:38.253207002Z" level=info msg="StartContainer for \"eec8832864994fc4e72a6c0ca9ed404bda01593bb861e203a378eea0e8ee4b63\""
May 27 03:19:38.254450 containerd[1564]: time="2025-05-27T03:19:38.254410380Z" level=info msg="connecting to shim eec8832864994fc4e72a6c0ca9ed404bda01593bb861e203a378eea0e8ee4b63" address="unix:///run/containerd/s/580cfa4a30c519540ba36e94a39e1544b9accbe07c4fa64a8118b695b835048d" protocol=ttrpc version=3
May 27 03:19:38.269976 kubelet[2319]: E0527 03:19:38.269858 2319 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.98:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.98:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1843441e0c068818 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-05-27 03:19:36.191346712 +0000 UTC m=+0.287526248,LastTimestamp:2025-05-27 03:19:36.191346712 +0000 UTC m=+0.287526248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
May 27 03:19:38.278864 systemd[1]: Started cri-containerd-eec8832864994fc4e72a6c0ca9ed404bda01593bb861e203a378eea0e8ee4b63.scope - libcontainer container eec8832864994fc4e72a6c0ca9ed404bda01593bb861e203a378eea0e8ee4b63.
May 27 03:19:38.309153 containerd[1564]: time="2025-05-27T03:19:38.309105272Z" level=info msg="CreateContainer within sandbox \"6026a346c2251385f4fa5b38cd2a9d4f83ff7353bd3905635d14147633a1db34\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"2a5895bcc8444dd6ba3805c73ea262939ad530d09b40c068eefd1e9d7d18086e\""
May 27 03:19:38.309679 containerd[1564]: time="2025-05-27T03:19:38.309645575Z" level=info msg="StartContainer for \"aa94b2b037724586b217c58fb6f04acb1c87e93dc95153564fcdb985c2860618\" returns successfully"
May 27 03:19:38.310752 containerd[1564]: time="2025-05-27T03:19:38.310722886Z" level=info msg="StartContainer for \"2a5895bcc8444dd6ba3805c73ea262939ad530d09b40c068eefd1e9d7d18086e\""
May 27 03:19:38.311948 containerd[1564]: time="2025-05-27T03:19:38.311920262Z" level=info msg="connecting to shim 2a5895bcc8444dd6ba3805c73ea262939ad530d09b40c068eefd1e9d7d18086e" address="unix:///run/containerd/s/1b8600ffa0a21924502213ef56dbba44dce6ef4cf2965ee7107286f2897818ca" protocol=ttrpc version=3
May 27 03:19:38.359694 systemd[1]: Started cri-containerd-2a5895bcc8444dd6ba3805c73ea262939ad530d09b40c068eefd1e9d7d18086e.scope - libcontainer container 2a5895bcc8444dd6ba3805c73ea262939ad530d09b40c068eefd1e9d7d18086e.
May 27 03:19:38.380226 containerd[1564]: time="2025-05-27T03:19:38.380173936Z" level=info msg="StartContainer for \"eec8832864994fc4e72a6c0ca9ed404bda01593bb861e203a378eea0e8ee4b63\" returns successfully"
May 27 03:19:38.451415 containerd[1564]: time="2025-05-27T03:19:38.451229496Z" level=info msg="StartContainer for \"2a5895bcc8444dd6ba3805c73ea262939ad530d09b40c068eefd1e9d7d18086e\" returns successfully"
May 27 03:19:39.228744 kubelet[2319]: E0527 03:19:39.228509 2319 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 27 03:19:39.230042 kubelet[2319]: E0527 03:19:39.230021 2319 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 27 03:19:39.230542 kubelet[2319]: E0527 03:19:39.230518 2319 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 27 03:19:39.666011 kubelet[2319]: E0527 03:19:39.665892 2319 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost"
May 27 03:19:39.761781 kubelet[2319]: I0527 03:19:39.761742 2319 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
May 27 03:19:39.771677 kubelet[2319]: I0527 03:19:39.771613 2319 kubelet_node_status.go:78] "Successfully registered node" node="localhost"
May 27 03:19:39.771677 kubelet[2319]: E0527 03:19:39.771647 2319 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found"
May 27 03:19:39.779412 kubelet[2319]: E0527 03:19:39.779375 2319 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
May 27 03:19:39.880239 kubelet[2319]: E0527 03:19:39.880174 2319 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
May 27 03:19:39.980624 kubelet[2319]: E0527 03:19:39.980469 2319 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
May 27 03:19:40.081590 kubelet[2319]: E0527 03:19:40.081537 2319 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
May 27 03:19:40.182419 kubelet[2319]: E0527 03:19:40.182356 2319 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
May 27 03:19:40.231873 kubelet[2319]: I0527 03:19:40.231762 2319 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
May 27 03:19:40.231873 kubelet[2319]: I0527 03:19:40.231786 2319 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
May 27 03:19:40.232310 kubelet[2319]: I0527 03:19:40.231929 2319 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
May 27 03:19:40.238114 kubelet[2319]: E0527 03:19:40.238068 2319 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost"
May 27 03:19:40.238326 kubelet[2319]: E0527 03:19:40.238289 2319 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost"
May 27 03:19:40.238475 kubelet[2319]: E0527 03:19:40.238068 2319 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost"
May 27 03:19:40.298666 kubelet[2319]: I0527 03:19:40.298609 2319 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
May 27 03:19:40.301069 kubelet[2319]: E0527 03:19:40.301034 2319 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost"
May 27 03:19:40.301069 kubelet[2319]: I0527 03:19:40.301059 2319 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
May 27 03:19:40.302785 kubelet[2319]: E0527 03:19:40.302755 2319 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost"
May 27 03:19:40.302785 kubelet[2319]: I0527 03:19:40.302774 2319 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
May 27 03:19:40.304146 kubelet[2319]: E0527 03:19:40.304089 2319 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost"
May 27 03:19:41.183961 kubelet[2319]: I0527 03:19:41.183891 2319 apiserver.go:52] "Watching apiserver"
May 27 03:19:41.197718 kubelet[2319]: I0527 03:19:41.197675 2319 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
May 27 03:19:41.233219 kubelet[2319]: I0527 03:19:41.233171 2319 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
May 27 03:19:41.234804 kubelet[2319]: I0527 03:19:41.233243 2319 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
May 27 03:19:43.827790 systemd[1]: Reload requested from client PID 2596 ('systemctl') (unit session-7.scope)...
May 27 03:19:43.827806 systemd[1]: Reloading...
May 27 03:19:43.955471 zram_generator::config[2635]: No configuration found.
May 27 03:19:44.122257 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 27 03:19:44.252591 systemd[1]: Reloading finished in 424 ms.
May 27 03:19:44.283245 kubelet[2319]: I0527 03:19:44.283200 2319 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
May 27 03:19:44.283770 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 03:19:44.311310 systemd[1]: kubelet.service: Deactivated successfully.
May 27 03:19:44.311718 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 03:19:44.311791 systemd[1]: kubelet.service: Consumed 832ms CPU time, 131.4M memory peak.
May 27 03:19:44.314201 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 03:19:44.533718 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 03:19:44.537972 (kubelet)[2684]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
May 27 03:19:44.577173 kubelet[2684]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 27 03:19:44.577173 kubelet[2684]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
May 27 03:19:44.577173 kubelet[2684]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 27 03:19:44.577636 kubelet[2684]: I0527 03:19:44.577247 2684 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 27 03:19:44.586202 kubelet[2684]: I0527 03:19:44.586145 2684 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
May 27 03:19:44.586202 kubelet[2684]: I0527 03:19:44.586175 2684 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 27 03:19:44.586473 kubelet[2684]: I0527 03:19:44.586430 2684 server.go:954] "Client rotation is on, will bootstrap in background"
May 27 03:19:44.587675 kubelet[2684]: I0527 03:19:44.587645 2684 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
May 27 03:19:44.590068 kubelet[2684]: I0527 03:19:44.589971 2684 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
May 27 03:19:44.596729 kubelet[2684]: I0527 03:19:44.596690 2684 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
May 27 03:19:44.601742 kubelet[2684]: I0527 03:19:44.601698 2684 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
May 27 03:19:44.602385 kubelet[2684]: I0527 03:19:44.602341 2684 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 27 03:19:44.602758 kubelet[2684]: I0527 03:19:44.602477 2684 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 27 03:19:44.602758 kubelet[2684]: I0527 03:19:44.602747 2684 topology_manager.go:138] "Creating topology manager with none policy"
May 27 03:19:44.602758 kubelet[2684]: I0527 03:19:44.602759 2684 container_manager_linux.go:304] "Creating device plugin manager"
May 27 03:19:44.603037 kubelet[2684]: I0527 03:19:44.602827 2684 state_mem.go:36] "Initialized new in-memory state store"
May 27 03:19:44.603064 kubelet[2684]: I0527 03:19:44.603041 2684 kubelet.go:446] "Attempting to sync node with API server"
May 27 03:19:44.603089 kubelet[2684]: I0527 03:19:44.603070 2684 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
May 27 03:19:44.603110 kubelet[2684]: I0527 03:19:44.603100 2684 kubelet.go:352] "Adding apiserver pod source"
May 27 03:19:44.603134 kubelet[2684]: I0527 03:19:44.603115 2684 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 27 03:19:44.604236 kubelet[2684]: I0527 03:19:44.604001 2684 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
May 27 03:19:44.604733 kubelet[2684]: I0527 03:19:44.604706 2684 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
May 27 03:19:44.605496 kubelet[2684]: I0527 03:19:44.605470 2684 watchdog_linux.go:99] "Systemd watchdog is not enabled"
May 27 03:19:44.605716 kubelet[2684]: I0527 03:19:44.605689 2684 server.go:1287] "Started kubelet"
May 27 03:19:44.609782 kubelet[2684]: I0527 03:19:44.609749 2684 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 27 03:19:44.611563 kubelet[2684]: I0527 03:19:44.611299 2684 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
May 27 03:19:44.612427 kubelet[2684]: E0527 03:19:44.612369 2684 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
May 27 03:19:44.612427 kubelet[2684]: I0527 03:19:44.612430 2684 volume_manager.go:297] "Starting Kubelet Volume Manager"
May 27 03:19:44.613371 kubelet[2684]: I0527 03:19:44.612771 2684 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
May 27 03:19:44.613371 kubelet[2684]: I0527 03:19:44.613030 2684 reconciler.go:26] "Reconciler: start to sync state"
May 27 03:19:44.614025 kubelet[2684]: I0527 03:19:44.613996 2684 factory.go:221] Registration of the systemd container factory successfully
May 27 03:19:44.614153 kubelet[2684]: I0527 03:19:44.614125 2684 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
May 27 03:19:44.614971 kubelet[2684]: I0527 03:19:44.614935 2684 server.go:479] "Adding debug handlers to kubelet server"
May 27 03:19:44.615847 kubelet[2684]: I0527 03:19:44.615807 2684 factory.go:221] Registration of the containerd container factory successfully
May 27 03:19:44.617263 kubelet[2684]: I0527 03:19:44.617181 2684 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 27 03:19:44.617712 kubelet[2684]: I0527 03:19:44.617672 2684 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 27 03:19:44.618103 kubelet[2684]: I0527 03:19:44.618067 2684 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
May 27 03:19:44.628335 kubelet[2684]: I0527 03:19:44.626645 2684 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
May 27 03:19:44.628883 kubelet[2684]: I0527 03:19:44.628826 2684 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
May 27 03:19:44.628883 kubelet[2684]: I0527 03:19:44.628867 2684 status_manager.go:227] "Starting to sync pod status with apiserver"
May 27 03:19:44.628883 kubelet[2684]: I0527 03:19:44.628894 2684 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
May 27 03:19:44.628883 kubelet[2684]: I0527 03:19:44.628904 2684 kubelet.go:2382] "Starting kubelet main sync loop"
May 27 03:19:44.629370 kubelet[2684]: E0527 03:19:44.628964 2684 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
May 27 03:19:44.660617 kubelet[2684]: I0527 03:19:44.660556 2684 cpu_manager.go:221] "Starting CPU manager" policy="none"
May 27 03:19:44.660617 kubelet[2684]: I0527 03:19:44.660577 2684 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
May 27 03:19:44.660617 kubelet[2684]: I0527 03:19:44.660605 2684 state_mem.go:36] "Initialized new in-memory state store"
May 27 03:19:44.660872 kubelet[2684]: I0527 03:19:44.660788 2684 state_mem.go:88] "Updated default CPUSet" cpuSet=""
May 27 03:19:44.660872 kubelet[2684]: I0527 03:19:44.660801 2684 state_mem.go:96] "Updated CPUSet assignments" assignments={}
May 27 03:19:44.660872 kubelet[2684]: I0527 03:19:44.660822 2684 policy_none.go:49] "None policy: Start"
May 27 03:19:44.660872 kubelet[2684]: I0527 03:19:44.660833 2684 memory_manager.go:186] "Starting memorymanager" policy="None"
May 27 03:19:44.660872 kubelet[2684]: I0527 03:19:44.660843 2684 state_mem.go:35] "Initializing new in-memory state store"
May 27 03:19:44.661005 kubelet[2684]: I0527 03:19:44.660962 2684 state_mem.go:75] "Updated machine memory state"
May 27 03:19:44.665706 kubelet[2684]: I0527 03:19:44.665676 2684 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
May 27 03:19:44.666129 kubelet[2684]: I0527 03:19:44.666098 2684 eviction_manager.go:189] "Eviction manager: starting control loop"
May 27 03:19:44.666272 kubelet[2684]: I0527 03:19:44.666118 2684 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 27 03:19:44.666376 kubelet[2684]: I0527 03:19:44.666356 2684 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 27 03:19:44.667843 kubelet[2684]: E0527 03:19:44.667634 2684 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
May 27 03:19:44.730140 kubelet[2684]: I0527 03:19:44.730102 2684 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
May 27 03:19:44.730418 kubelet[2684]: I0527 03:19:44.730228 2684 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
May 27 03:19:44.730635 kubelet[2684]: I0527 03:19:44.730251 2684 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
May 27 03:19:44.768131 kubelet[2684]: I0527 03:19:44.768084 2684 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
May 27 03:19:44.804494 kubelet[2684]: E0527 03:19:44.804251 2684 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
May 27 03:19:44.804494 kubelet[2684]: E0527 03:19:44.804284 2684 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost"
May 27 03:19:44.814291 kubelet[2684]: I0527 03:19:44.814242 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fb923a5bef11701fd32845e830b20b4d-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"fb923a5bef11701fd32845e830b20b4d\") " pod="kube-system/kube-apiserver-localhost"
May 27 03:19:44.814291 kubelet[2684]: I0527 03:19:44.814273 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fb923a5bef11701fd32845e830b20b4d-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"fb923a5bef11701fd32845e830b20b4d\") " pod="kube-system/kube-apiserver-localhost"
May 27 03:19:44.814482 kubelet[2684]: I0527 03:19:44.814310 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fb923a5bef11701fd32845e830b20b4d-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"fb923a5bef11701fd32845e830b20b4d\") " pod="kube-system/kube-apiserver-localhost"
May 27 03:19:44.814482 kubelet[2684]: I0527 03:19:44.814352 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7c751acbcd1525da2f1a64e395f86bdd-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"7c751acbcd1525da2f1a64e395f86bdd\") " pod="kube-system/kube-controller-manager-localhost"
May 27 03:19:44.814482 kubelet[2684]: I0527 03:19:44.814384 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7c751acbcd1525da2f1a64e395f86bdd-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"7c751acbcd1525da2f1a64e395f86bdd\") " pod="kube-system/kube-controller-manager-localhost"
May 27 03:19:44.814482 kubelet[2684]: I0527 03:19:44.814403 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7c751acbcd1525da2f1a64e395f86bdd-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"7c751acbcd1525da2f1a64e395f86bdd\") " pod="kube-system/kube-controller-manager-localhost"
May 27 03:19:44.814482 kubelet[2684]: I0527 03:19:44.814473 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7c751acbcd1525da2f1a64e395f86bdd-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"7c751acbcd1525da2f1a64e395f86bdd\") " pod="kube-system/kube-controller-manager-localhost"
May 27 03:19:44.814652 kubelet[2684]: I0527 03:19:44.814495 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7c751acbcd1525da2f1a64e395f86bdd-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"7c751acbcd1525da2f1a64e395f86bdd\") " pod="kube-system/kube-controller-manager-localhost"
May 27 03:19:44.814652 kubelet[2684]: I0527 03:19:44.814559 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/447e79232307504a6964f3be51e3d64d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"447e79232307504a6964f3be51e3d64d\") " pod="kube-system/kube-scheduler-localhost"
May 27 03:19:44.818928 kubelet[2684]: I0527 03:19:44.818906 2684 kubelet_node_status.go:124] "Node was previously registered" node="localhost"
May 27 03:19:44.819005 kubelet[2684]: I0527 03:19:44.818986 2684 kubelet_node_status.go:78] "Successfully registered node" node="localhost"
May 27 03:19:45.604553 kubelet[2684]: I0527 03:19:45.604510 2684 apiserver.go:52] "Watching apiserver"
May 27 03:19:45.613876 kubelet[2684]: I0527 03:19:45.613836 2684 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
May 27 03:19:45.643937 kubelet[2684]: I0527 03:19:45.643850 2684 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
May 27 03:19:45.644151 kubelet[2684]: I0527 03:19:45.643850 2684 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
May 27 03:19:45.770593 kubelet[2684]: E0527 03:19:45.770537 2684 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists"
pod="kube-system/kube-apiserver-localhost" May 27 03:19:45.770809 kubelet[2684]: E0527 03:19:45.770773 2684 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" May 27 03:19:45.770895 kubelet[2684]: I0527 03:19:45.770781 2684 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.770764055 podStartE2EDuration="1.770764055s" podCreationTimestamp="2025-05-27 03:19:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:19:45.77051539 +0000 UTC m=+1.228555617" watchObservedRunningTime="2025-05-27 03:19:45.770764055 +0000 UTC m=+1.228804282" May 27 03:19:45.814653 kubelet[2684]: I0527 03:19:45.814566 2684 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=4.814541643 podStartE2EDuration="4.814541643s" podCreationTimestamp="2025-05-27 03:19:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:19:45.814114007 +0000 UTC m=+1.272154234" watchObservedRunningTime="2025-05-27 03:19:45.814541643 +0000 UTC m=+1.272581870" May 27 03:19:45.953387 kubelet[2684]: I0527 03:19:45.953121 2684 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=4.953100933 podStartE2EDuration="4.953100933s" podCreationTimestamp="2025-05-27 03:19:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:19:45.93363863 +0000 UTC m=+1.391678857" watchObservedRunningTime="2025-05-27 03:19:45.953100933 +0000 UTC m=+1.411141160" May 27 03:19:48.223271 kubelet[2684]: I0527 03:19:48.223222 
2684 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 27 03:19:48.223729 containerd[1564]: time="2025-05-27T03:19:48.223613172Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 27 03:19:48.223986 kubelet[2684]: I0527 03:19:48.223803 2684 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 27 03:19:49.098217 systemd[1]: Created slice kubepods-besteffort-podcd2768b6_fded_47a1_8ba1_b70ae5a23bb9.slice - libcontainer container kubepods-besteffort-podcd2768b6_fded_47a1_8ba1_b70ae5a23bb9.slice. May 27 03:19:49.143659 kubelet[2684]: I0527 03:19:49.143533 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cnpb\" (UniqueName: \"kubernetes.io/projected/cd2768b6-fded-47a1-8ba1-b70ae5a23bb9-kube-api-access-5cnpb\") pod \"kube-proxy-4zk4m\" (UID: \"cd2768b6-fded-47a1-8ba1-b70ae5a23bb9\") " pod="kube-system/kube-proxy-4zk4m" May 27 03:19:49.143659 kubelet[2684]: I0527 03:19:49.143679 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd2768b6-fded-47a1-8ba1-b70ae5a23bb9-lib-modules\") pod \"kube-proxy-4zk4m\" (UID: \"cd2768b6-fded-47a1-8ba1-b70ae5a23bb9\") " pod="kube-system/kube-proxy-4zk4m" May 27 03:19:49.143857 kubelet[2684]: I0527 03:19:49.143708 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/cd2768b6-fded-47a1-8ba1-b70ae5a23bb9-kube-proxy\") pod \"kube-proxy-4zk4m\" (UID: \"cd2768b6-fded-47a1-8ba1-b70ae5a23bb9\") " pod="kube-system/kube-proxy-4zk4m" May 27 03:19:49.143857 kubelet[2684]: I0527 03:19:49.143831 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: 
\"kubernetes.io/host-path/cd2768b6-fded-47a1-8ba1-b70ae5a23bb9-xtables-lock\") pod \"kube-proxy-4zk4m\" (UID: \"cd2768b6-fded-47a1-8ba1-b70ae5a23bb9\") " pod="kube-system/kube-proxy-4zk4m" May 27 03:19:49.325737 systemd[1]: Created slice kubepods-besteffort-podb42fcb37_41c7_4c23_96c9_e822b3e8419e.slice - libcontainer container kubepods-besteffort-podb42fcb37_41c7_4c23_96c9_e822b3e8419e.slice. May 27 03:19:49.345535 kubelet[2684]: I0527 03:19:49.345469 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b42fcb37-41c7-4c23-96c9-e822b3e8419e-var-lib-calico\") pod \"tigera-operator-844669ff44-xjv4g\" (UID: \"b42fcb37-41c7-4c23-96c9-e822b3e8419e\") " pod="tigera-operator/tigera-operator-844669ff44-xjv4g" May 27 03:19:49.345535 kubelet[2684]: I0527 03:19:49.345526 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9t48\" (UniqueName: \"kubernetes.io/projected/b42fcb37-41c7-4c23-96c9-e822b3e8419e-kube-api-access-b9t48\") pod \"tigera-operator-844669ff44-xjv4g\" (UID: \"b42fcb37-41c7-4c23-96c9-e822b3e8419e\") " pod="tigera-operator/tigera-operator-844669ff44-xjv4g" May 27 03:19:49.410200 containerd[1564]: time="2025-05-27T03:19:49.410147674Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4zk4m,Uid:cd2768b6-fded-47a1-8ba1-b70ae5a23bb9,Namespace:kube-system,Attempt:0,}" May 27 03:19:49.435207 containerd[1564]: time="2025-05-27T03:19:49.435166496Z" level=info msg="connecting to shim 669ef59fe45261606ef1f89b6a559e6255fba4ada0ca55826ceee8cb896b5d12" address="unix:///run/containerd/s/06e30eefadcfb43b795c8507df2c88217154446c422176ff132c781fba1deb84" namespace=k8s.io protocol=ttrpc version=3 May 27 03:19:49.475586 systemd[1]: Started cri-containerd-669ef59fe45261606ef1f89b6a559e6255fba4ada0ca55826ceee8cb896b5d12.scope - libcontainer container 
669ef59fe45261606ef1f89b6a559e6255fba4ada0ca55826ceee8cb896b5d12. May 27 03:19:49.503074 containerd[1564]: time="2025-05-27T03:19:49.503028563Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4zk4m,Uid:cd2768b6-fded-47a1-8ba1-b70ae5a23bb9,Namespace:kube-system,Attempt:0,} returns sandbox id \"669ef59fe45261606ef1f89b6a559e6255fba4ada0ca55826ceee8cb896b5d12\"" May 27 03:19:49.506423 containerd[1564]: time="2025-05-27T03:19:49.506381137Z" level=info msg="CreateContainer within sandbox \"669ef59fe45261606ef1f89b6a559e6255fba4ada0ca55826ceee8cb896b5d12\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 27 03:19:49.518510 containerd[1564]: time="2025-05-27T03:19:49.518462384Z" level=info msg="Container b7a9b500936902a9cb7d54009f070f10e379bbacd0da8c4deacddd7664034b2a: CDI devices from CRI Config.CDIDevices: []" May 27 03:19:49.528123 containerd[1564]: time="2025-05-27T03:19:49.528084386Z" level=info msg="CreateContainer within sandbox \"669ef59fe45261606ef1f89b6a559e6255fba4ada0ca55826ceee8cb896b5d12\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"b7a9b500936902a9cb7d54009f070f10e379bbacd0da8c4deacddd7664034b2a\"" May 27 03:19:49.528565 containerd[1564]: time="2025-05-27T03:19:49.528533940Z" level=info msg="StartContainer for \"b7a9b500936902a9cb7d54009f070f10e379bbacd0da8c4deacddd7664034b2a\"" May 27 03:19:49.529825 containerd[1564]: time="2025-05-27T03:19:49.529790309Z" level=info msg="connecting to shim b7a9b500936902a9cb7d54009f070f10e379bbacd0da8c4deacddd7664034b2a" address="unix:///run/containerd/s/06e30eefadcfb43b795c8507df2c88217154446c422176ff132c781fba1deb84" protocol=ttrpc version=3 May 27 03:19:49.557586 systemd[1]: Started cri-containerd-b7a9b500936902a9cb7d54009f070f10e379bbacd0da8c4deacddd7664034b2a.scope - libcontainer container b7a9b500936902a9cb7d54009f070f10e379bbacd0da8c4deacddd7664034b2a. 
May 27 03:19:49.600820 containerd[1564]: time="2025-05-27T03:19:49.600762359Z" level=info msg="StartContainer for \"b7a9b500936902a9cb7d54009f070f10e379bbacd0da8c4deacddd7664034b2a\" returns successfully"
May 27 03:19:49.629647 containerd[1564]: time="2025-05-27T03:19:49.629585313Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-xjv4g,Uid:b42fcb37-41c7-4c23-96c9-e822b3e8419e,Namespace:tigera-operator,Attempt:0,}"
May 27 03:19:49.651477 containerd[1564]: time="2025-05-27T03:19:49.650959095Z" level=info msg="connecting to shim 4a9eba9de521bbc0128a97f0cc62cd271604b2d2f0611a3ac52ab1dad83af9fe" address="unix:///run/containerd/s/ec660f70a3b3a8f75523122e013b2e6bdb65196bda2c4d7a7e5aef548f805513" namespace=k8s.io protocol=ttrpc version=3
May 27 03:19:49.685577 systemd[1]: Started cri-containerd-4a9eba9de521bbc0128a97f0cc62cd271604b2d2f0611a3ac52ab1dad83af9fe.scope - libcontainer container 4a9eba9de521bbc0128a97f0cc62cd271604b2d2f0611a3ac52ab1dad83af9fe.
May 27 03:19:49.739784 containerd[1564]: time="2025-05-27T03:19:49.739736204Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-xjv4g,Uid:b42fcb37-41c7-4c23-96c9-e822b3e8419e,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"4a9eba9de521bbc0128a97f0cc62cd271604b2d2f0611a3ac52ab1dad83af9fe\""
May 27 03:19:49.742016 containerd[1564]: time="2025-05-27T03:19:49.741909135Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\""
May 27 03:19:49.961092 kubelet[2684]: I0527 03:19:49.960912 2684 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-4zk4m" podStartSLOduration=0.960892169 podStartE2EDuration="960.892169ms" podCreationTimestamp="2025-05-27 03:19:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:19:49.665489889 +0000 UTC m=+5.123530116" watchObservedRunningTime="2025-05-27 03:19:49.960892169 +0000 UTC m=+5.418932396"
May 27 03:19:50.256684 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2206370393.mount: Deactivated successfully.
May 27 03:19:51.375896 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1063887758.mount: Deactivated successfully.
May 27 03:19:52.633310 containerd[1564]: time="2025-05-27T03:19:52.633254489Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:19:52.634051 containerd[1564]: time="2025-05-27T03:19:52.633989763Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.0: active requests=0, bytes read=25055451"
May 27 03:19:52.635301 containerd[1564]: time="2025-05-27T03:19:52.635249592Z" level=info msg="ImageCreate event name:\"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:19:52.637271 containerd[1564]: time="2025-05-27T03:19:52.637214079Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:19:52.637994 containerd[1564]: time="2025-05-27T03:19:52.637946908Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.0\" with image id \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\", repo tag \"quay.io/tigera/operator:v1.38.0\", repo digest \"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\", size \"25051446\" in 2.896007186s"
May 27 03:19:52.637994 containerd[1564]: time="2025-05-27T03:19:52.637991944Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\" returns image reference \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\""
May 27 03:19:52.640211 containerd[1564]: time="2025-05-27T03:19:52.640169763Z" level=info msg="CreateContainer within sandbox \"4a9eba9de521bbc0128a97f0cc62cd271604b2d2f0611a3ac52ab1dad83af9fe\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
May 27 03:19:52.649334 containerd[1564]: time="2025-05-27T03:19:52.649280173Z" level=info msg="Container a3d7d8446da9e94d1e8c5b6c84ad0daac607bfe1c764d4df7e7555e20b14ee4d: CDI devices from CRI Config.CDIDevices: []"
May 27 03:19:52.655706 containerd[1564]: time="2025-05-27T03:19:52.655668718Z" level=info msg="CreateContainer within sandbox \"4a9eba9de521bbc0128a97f0cc62cd271604b2d2f0611a3ac52ab1dad83af9fe\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"a3d7d8446da9e94d1e8c5b6c84ad0daac607bfe1c764d4df7e7555e20b14ee4d\""
May 27 03:19:52.656234 containerd[1564]: time="2025-05-27T03:19:52.656143900Z" level=info msg="StartContainer for \"a3d7d8446da9e94d1e8c5b6c84ad0daac607bfe1c764d4df7e7555e20b14ee4d\""
May 27 03:19:52.656996 containerd[1564]: time="2025-05-27T03:19:52.656965007Z" level=info msg="connecting to shim a3d7d8446da9e94d1e8c5b6c84ad0daac607bfe1c764d4df7e7555e20b14ee4d" address="unix:///run/containerd/s/ec660f70a3b3a8f75523122e013b2e6bdb65196bda2c4d7a7e5aef548f805513" protocol=ttrpc version=3
May 27 03:19:52.709575 systemd[1]: Started cri-containerd-a3d7d8446da9e94d1e8c5b6c84ad0daac607bfe1c764d4df7e7555e20b14ee4d.scope - libcontainer container a3d7d8446da9e94d1e8c5b6c84ad0daac607bfe1c764d4df7e7555e20b14ee4d.
May 27 03:19:52.742065 containerd[1564]: time="2025-05-27T03:19:52.741996306Z" level=info msg="StartContainer for \"a3d7d8446da9e94d1e8c5b6c84ad0daac607bfe1c764d4df7e7555e20b14ee4d\" returns successfully"
May 27 03:19:53.703817 kubelet[2684]: I0527 03:19:53.703728 2684 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-844669ff44-xjv4g" podStartSLOduration=1.805037012 podStartE2EDuration="4.702721382s" podCreationTimestamp="2025-05-27 03:19:49 +0000 UTC" firstStartedPulling="2025-05-27 03:19:49.74113364 +0000 UTC m=+5.199173867" lastFinishedPulling="2025-05-27 03:19:52.638818 +0000 UTC m=+8.096858237" observedRunningTime="2025-05-27 03:19:53.702616954 +0000 UTC m=+9.160657181" watchObservedRunningTime="2025-05-27 03:19:53.702721382 +0000 UTC m=+9.160761609"
May 27 03:19:56.762611 update_engine[1555]: I20250527 03:19:56.762496 1555 update_attempter.cc:509] Updating boot flags...
May 27 03:19:58.511467 sudo[1769]: pam_unix(sudo:session): session closed for user root
May 27 03:19:58.513232 sshd[1768]: Connection closed by 10.0.0.1 port 59484
May 27 03:19:58.513868 sshd-session[1766]: pam_unix(sshd:session): session closed for user core
May 27 03:19:58.519782 systemd-logind[1549]: Session 7 logged out. Waiting for processes to exit.
May 27 03:19:58.520584 systemd[1]: sshd@6-10.0.0.98:22-10.0.0.1:59484.service: Deactivated successfully.
May 27 03:19:58.524468 systemd[1]: session-7.scope: Deactivated successfully.
May 27 03:19:58.524802 systemd[1]: session-7.scope: Consumed 5.582s CPU time, 226.7M memory peak.
May 27 03:19:58.527716 systemd-logind[1549]: Removed session 7.
May 27 03:20:01.799651 systemd[1]: Created slice kubepods-besteffort-podc6db46dd_4aa2_4602_9250_ae156d5ddb5a.slice - libcontainer container kubepods-besteffort-podc6db46dd_4aa2_4602_9250_ae156d5ddb5a.slice.
May 27 03:20:01.825012 kubelet[2684]: I0527 03:20:01.824953 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/c6db46dd-4aa2-4602-9250-ae156d5ddb5a-typha-certs\") pod \"calico-typha-657779c6d5-kdtjc\" (UID: \"c6db46dd-4aa2-4602-9250-ae156d5ddb5a\") " pod="calico-system/calico-typha-657779c6d5-kdtjc"
May 27 03:20:01.825012 kubelet[2684]: I0527 03:20:01.825003 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6db46dd-4aa2-4602-9250-ae156d5ddb5a-tigera-ca-bundle\") pod \"calico-typha-657779c6d5-kdtjc\" (UID: \"c6db46dd-4aa2-4602-9250-ae156d5ddb5a\") " pod="calico-system/calico-typha-657779c6d5-kdtjc"
May 27 03:20:01.825012 kubelet[2684]: I0527 03:20:01.825030 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h6ns\" (UniqueName: \"kubernetes.io/projected/c6db46dd-4aa2-4602-9250-ae156d5ddb5a-kube-api-access-9h6ns\") pod \"calico-typha-657779c6d5-kdtjc\" (UID: \"c6db46dd-4aa2-4602-9250-ae156d5ddb5a\") " pod="calico-system/calico-typha-657779c6d5-kdtjc"
May 27 03:20:02.075459 systemd[1]: Created slice kubepods-besteffort-pod381e90e6_b254_4672_9ef1_8abc6af35c34.slice - libcontainer container kubepods-besteffort-pod381e90e6_b254_4672_9ef1_8abc6af35c34.slice.
May 27 03:20:02.107091 containerd[1564]: time="2025-05-27T03:20:02.107033512Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-657779c6d5-kdtjc,Uid:c6db46dd-4aa2-4602-9250-ae156d5ddb5a,Namespace:calico-system,Attempt:0,}"
May 27 03:20:02.127556 kubelet[2684]: I0527 03:20:02.127506 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/381e90e6-b254-4672-9ef1-8abc6af35c34-tigera-ca-bundle\") pod \"calico-node-kzpnz\" (UID: \"381e90e6-b254-4672-9ef1-8abc6af35c34\") " pod="calico-system/calico-node-kzpnz"
May 27 03:20:02.127556 kubelet[2684]: I0527 03:20:02.127547 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/381e90e6-b254-4672-9ef1-8abc6af35c34-var-lib-calico\") pod \"calico-node-kzpnz\" (UID: \"381e90e6-b254-4672-9ef1-8abc6af35c34\") " pod="calico-system/calico-node-kzpnz"
May 27 03:20:02.127676 kubelet[2684]: I0527 03:20:02.127565 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/381e90e6-b254-4672-9ef1-8abc6af35c34-cni-bin-dir\") pod \"calico-node-kzpnz\" (UID: \"381e90e6-b254-4672-9ef1-8abc6af35c34\") " pod="calico-system/calico-node-kzpnz"
May 27 03:20:02.127676 kubelet[2684]: I0527 03:20:02.127580 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/381e90e6-b254-4672-9ef1-8abc6af35c34-xtables-lock\") pod \"calico-node-kzpnz\" (UID: \"381e90e6-b254-4672-9ef1-8abc6af35c34\") " pod="calico-system/calico-node-kzpnz"
May 27 03:20:02.127676 kubelet[2684]: I0527 03:20:02.127596 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjx5z\" (UniqueName: \"kubernetes.io/projected/381e90e6-b254-4672-9ef1-8abc6af35c34-kube-api-access-rjx5z\") pod \"calico-node-kzpnz\" (UID: \"381e90e6-b254-4672-9ef1-8abc6af35c34\") " pod="calico-system/calico-node-kzpnz"
May 27 03:20:02.127676 kubelet[2684]: I0527 03:20:02.127622 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/381e90e6-b254-4672-9ef1-8abc6af35c34-var-run-calico\") pod \"calico-node-kzpnz\" (UID: \"381e90e6-b254-4672-9ef1-8abc6af35c34\") " pod="calico-system/calico-node-kzpnz"
May 27 03:20:02.127676 kubelet[2684]: I0527 03:20:02.127663 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/381e90e6-b254-4672-9ef1-8abc6af35c34-cni-net-dir\") pod \"calico-node-kzpnz\" (UID: \"381e90e6-b254-4672-9ef1-8abc6af35c34\") " pod="calico-system/calico-node-kzpnz"
May 27 03:20:02.127856 kubelet[2684]: I0527 03:20:02.127678 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/381e90e6-b254-4672-9ef1-8abc6af35c34-flexvol-driver-host\") pod \"calico-node-kzpnz\" (UID: \"381e90e6-b254-4672-9ef1-8abc6af35c34\") " pod="calico-system/calico-node-kzpnz"
May 27 03:20:02.127856 kubelet[2684]: I0527 03:20:02.127691 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/381e90e6-b254-4672-9ef1-8abc6af35c34-policysync\") pod \"calico-node-kzpnz\" (UID: \"381e90e6-b254-4672-9ef1-8abc6af35c34\") " pod="calico-system/calico-node-kzpnz"
May 27 03:20:02.127856 kubelet[2684]: I0527 03:20:02.127706 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/381e90e6-b254-4672-9ef1-8abc6af35c34-cni-log-dir\") pod \"calico-node-kzpnz\" (UID: \"381e90e6-b254-4672-9ef1-8abc6af35c34\") " pod="calico-system/calico-node-kzpnz"
May 27 03:20:02.127856 kubelet[2684]: I0527 03:20:02.127720 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/381e90e6-b254-4672-9ef1-8abc6af35c34-lib-modules\") pod \"calico-node-kzpnz\" (UID: \"381e90e6-b254-4672-9ef1-8abc6af35c34\") " pod="calico-system/calico-node-kzpnz"
May 27 03:20:02.127856 kubelet[2684]: I0527 03:20:02.127773 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/381e90e6-b254-4672-9ef1-8abc6af35c34-node-certs\") pod \"calico-node-kzpnz\" (UID: \"381e90e6-b254-4672-9ef1-8abc6af35c34\") " pod="calico-system/calico-node-kzpnz"
May 27 03:20:02.311428 kubelet[2684]: E0527 03:20:02.311342 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-86zzn" podUID="e359ce44-55b5-46e1-a044-9a6f462b1bb9"
May 27 03:20:02.314500 kubelet[2684]: E0527 03:20:02.314152 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 03:20:02.314500 kubelet[2684]: W0527 03:20:02.314188 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 03:20:02.314500 kubelet[2684]: E0527 03:20:02.314265 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 03:20:02.314956 kubelet[2684]: E0527 03:20:02.314528 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 03:20:02.314956 kubelet[2684]: W0527 03:20:02.314539 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 03:20:02.314956 kubelet[2684]: E0527 03:20:02.314548 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 03:20:02.314956 kubelet[2684]: E0527 03:20:02.314922 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 03:20:02.314956 kubelet[2684]: W0527 03:20:02.314934 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 03:20:02.314956 kubelet[2684]: E0527 03:20:02.314947 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 03:20:02.315940 kubelet[2684]: E0527 03:20:02.315919 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 03:20:02.315940 kubelet[2684]: W0527 03:20:02.315937 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 03:20:02.316032 kubelet[2684]: E0527 03:20:02.315951 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 03:20:02.316570 kubelet[2684]: E0527 03:20:02.316547 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 03:20:02.316631 kubelet[2684]: W0527 03:20:02.316568 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 03:20:02.316631 kubelet[2684]: E0527 03:20:02.316584 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 03:20:02.317522 kubelet[2684]: E0527 03:20:02.317498 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 03:20:02.317559 kubelet[2684]: W0527 03:20:02.317523 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 03:20:02.317559 kubelet[2684]: E0527 03:20:02.317541 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 03:20:02.317829 kubelet[2684]: E0527 03:20:02.317806 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 03:20:02.317829 kubelet[2684]: W0527 03:20:02.317827 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 03:20:02.317921 kubelet[2684]: E0527 03:20:02.317842 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 03:20:02.318782 kubelet[2684]: E0527 03:20:02.318756 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 03:20:02.318782 kubelet[2684]: W0527 03:20:02.318778 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 03:20:02.318869 kubelet[2684]: E0527 03:20:02.318794 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 03:20:02.320408 kubelet[2684]: E0527 03:20:02.320362 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 03:20:02.320408 kubelet[2684]: W0527 03:20:02.320390 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 03:20:02.320408 kubelet[2684]: E0527 03:20:02.320406 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 03:20:02.321373 kubelet[2684]: E0527 03:20:02.321345 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 03:20:02.321373 kubelet[2684]: W0527 03:20:02.321367 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 03:20:02.321485 kubelet[2684]: E0527 03:20:02.321383 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 03:20:02.322477 kubelet[2684]: E0527 03:20:02.322431 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 03:20:02.322477 kubelet[2684]: W0527 03:20:02.322473 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 03:20:02.322687 kubelet[2684]: E0527 03:20:02.322486 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 03:20:02.326722 kubelet[2684]: E0527 03:20:02.326574 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 03:20:02.326722 kubelet[2684]: W0527 03:20:02.326618 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 03:20:02.326722 kubelet[2684]: E0527 03:20:02.326651 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 03:20:02.328505 kubelet[2684]: E0527 03:20:02.328398 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 03:20:02.328505 kubelet[2684]: W0527 03:20:02.328420 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 03:20:02.328806 kubelet[2684]: E0527 03:20:02.328552 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 03:20:02.329933 kubelet[2684]: E0527 03:20:02.329481 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 03:20:02.329933 kubelet[2684]: W0527 03:20:02.329502 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 03:20:02.329933 kubelet[2684]: E0527 03:20:02.329518 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 03:20:02.330936 containerd[1564]: time="2025-05-27T03:20:02.330873983Z" level=info msg="connecting to shim 43528414385075c5d230078aa4dea29b5170b177c266d62685b62fd7a64521f6" address="unix:///run/containerd/s/1f909b33cfaa4c0acd092b9e7c5d5b306d38098506339c222d2823439c3566f0" namespace=k8s.io protocol=ttrpc version=3
May 27 03:20:02.331520 kubelet[2684]: E0527 03:20:02.331492 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 03:20:02.331520 kubelet[2684]: W0527 03:20:02.331516 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 03:20:02.331645 kubelet[2684]: E0527 03:20:02.331533 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.331801 kubelet[2684]: E0527 03:20:02.331775 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.331801 kubelet[2684]: W0527 03:20:02.331798 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.331860 kubelet[2684]: E0527 03:20:02.331813 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.332109 kubelet[2684]: E0527 03:20:02.332083 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.332109 kubelet[2684]: W0527 03:20:02.332104 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.332200 kubelet[2684]: E0527 03:20:02.332117 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.332375 kubelet[2684]: E0527 03:20:02.332349 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.332375 kubelet[2684]: W0527 03:20:02.332373 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.332473 kubelet[2684]: E0527 03:20:02.332388 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.332653 kubelet[2684]: E0527 03:20:02.332631 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.332653 kubelet[2684]: W0527 03:20:02.332645 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.332732 kubelet[2684]: E0527 03:20:02.332657 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.333751 kubelet[2684]: E0527 03:20:02.333702 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.333751 kubelet[2684]: W0527 03:20:02.333718 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.333751 kubelet[2684]: E0527 03:20:02.333728 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.334066 kubelet[2684]: E0527 03:20:02.334046 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.334066 kubelet[2684]: W0527 03:20:02.334062 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.334066 kubelet[2684]: E0527 03:20:02.334073 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.334186 kubelet[2684]: I0527 03:20:02.334102 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/e359ce44-55b5-46e1-a044-9a6f462b1bb9-varrun\") pod \"csi-node-driver-86zzn\" (UID: \"e359ce44-55b5-46e1-a044-9a6f462b1bb9\") " pod="calico-system/csi-node-driver-86zzn" May 27 03:20:02.335418 kubelet[2684]: E0527 03:20:02.335381 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.335418 kubelet[2684]: W0527 03:20:02.335401 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.335608 kubelet[2684]: E0527 03:20:02.335424 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.335608 kubelet[2684]: I0527 03:20:02.335595 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e359ce44-55b5-46e1-a044-9a6f462b1bb9-kubelet-dir\") pod \"csi-node-driver-86zzn\" (UID: \"e359ce44-55b5-46e1-a044-9a6f462b1bb9\") " pod="calico-system/csi-node-driver-86zzn" May 27 03:20:02.336092 kubelet[2684]: E0527 03:20:02.336064 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.336092 kubelet[2684]: W0527 03:20:02.336089 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.336190 kubelet[2684]: E0527 03:20:02.336107 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.338481 kubelet[2684]: E0527 03:20:02.338410 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.338481 kubelet[2684]: W0527 03:20:02.338476 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.338620 kubelet[2684]: E0527 03:20:02.338545 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.339090 kubelet[2684]: E0527 03:20:02.339043 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.339090 kubelet[2684]: W0527 03:20:02.339058 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.339090 kubelet[2684]: E0527 03:20:02.339081 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.339365 kubelet[2684]: I0527 03:20:02.339104 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e359ce44-55b5-46e1-a044-9a6f462b1bb9-socket-dir\") pod \"csi-node-driver-86zzn\" (UID: \"e359ce44-55b5-46e1-a044-9a6f462b1bb9\") " pod="calico-system/csi-node-driver-86zzn" May 27 03:20:02.340867 kubelet[2684]: E0527 03:20:02.340842 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.340867 kubelet[2684]: W0527 03:20:02.340862 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.344507 kubelet[2684]: E0527 03:20:02.344467 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.346604 kubelet[2684]: E0527 03:20:02.346540 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.346604 kubelet[2684]: W0527 03:20:02.346573 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.347176 kubelet[2684]: E0527 03:20:02.347130 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.347604 kubelet[2684]: E0527 03:20:02.347319 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.347604 kubelet[2684]: W0527 03:20:02.347588 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.348628 kubelet[2684]: E0527 03:20:02.348567 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.350625 kubelet[2684]: E0527 03:20:02.350579 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.350625 kubelet[2684]: W0527 03:20:02.350619 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.350766 kubelet[2684]: E0527 03:20:02.350739 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.351128 kubelet[2684]: I0527 03:20:02.351100 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q6zd\" (UniqueName: \"kubernetes.io/projected/e359ce44-55b5-46e1-a044-9a6f462b1bb9-kube-api-access-4q6zd\") pod \"csi-node-driver-86zzn\" (UID: \"e359ce44-55b5-46e1-a044-9a6f462b1bb9\") " pod="calico-system/csi-node-driver-86zzn" May 27 03:20:02.351897 kubelet[2684]: E0527 03:20:02.351865 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.351897 kubelet[2684]: W0527 03:20:02.351893 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.352121 kubelet[2684]: E0527 03:20:02.351910 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.354947 kubelet[2684]: E0527 03:20:02.354915 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.354947 kubelet[2684]: W0527 03:20:02.354944 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.355031 kubelet[2684]: E0527 03:20:02.354981 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.355248 kubelet[2684]: E0527 03:20:02.355222 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.355248 kubelet[2684]: W0527 03:20:02.355243 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.355296 kubelet[2684]: E0527 03:20:02.355262 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.355491 kubelet[2684]: E0527 03:20:02.355467 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.355491 kubelet[2684]: W0527 03:20:02.355487 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.355552 kubelet[2684]: E0527 03:20:02.355498 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.355552 kubelet[2684]: I0527 03:20:02.355541 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e359ce44-55b5-46e1-a044-9a6f462b1bb9-registration-dir\") pod \"csi-node-driver-86zzn\" (UID: \"e359ce44-55b5-46e1-a044-9a6f462b1bb9\") " pod="calico-system/csi-node-driver-86zzn" May 27 03:20:02.356470 kubelet[2684]: E0527 03:20:02.355784 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.356470 kubelet[2684]: W0527 03:20:02.355800 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.356470 kubelet[2684]: E0527 03:20:02.355812 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.356470 kubelet[2684]: E0527 03:20:02.356000 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.356470 kubelet[2684]: W0527 03:20:02.356008 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.356470 kubelet[2684]: E0527 03:20:02.356018 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.384314 containerd[1564]: time="2025-05-27T03:20:02.384262939Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kzpnz,Uid:381e90e6-b254-4672-9ef1-8abc6af35c34,Namespace:calico-system,Attempt:0,}" May 27 03:20:02.398507 systemd[1]: Started cri-containerd-43528414385075c5d230078aa4dea29b5170b177c266d62685b62fd7a64521f6.scope - libcontainer container 43528414385075c5d230078aa4dea29b5170b177c266d62685b62fd7a64521f6. May 27 03:20:02.456460 kubelet[2684]: E0527 03:20:02.456391 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.456460 kubelet[2684]: W0527 03:20:02.456414 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.457586 kubelet[2684]: E0527 03:20:02.456754 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.457895 kubelet[2684]: E0527 03:20:02.457713 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.457895 kubelet[2684]: W0527 03:20:02.457726 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.457895 kubelet[2684]: E0527 03:20:02.457752 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.458330 kubelet[2684]: E0527 03:20:02.458068 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.458330 kubelet[2684]: W0527 03:20:02.458081 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.458330 kubelet[2684]: E0527 03:20:02.458098 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.458548 kubelet[2684]: E0527 03:20:02.458483 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.458548 kubelet[2684]: W0527 03:20:02.458501 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.458691 kubelet[2684]: E0527 03:20:02.458637 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.458940 kubelet[2684]: E0527 03:20:02.458926 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.459007 kubelet[2684]: W0527 03:20:02.458995 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.459154 kubelet[2684]: E0527 03:20:02.459109 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.459269 kubelet[2684]: E0527 03:20:02.459248 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.459269 kubelet[2684]: W0527 03:20:02.459266 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.459365 kubelet[2684]: E0527 03:20:02.459302 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.459542 kubelet[2684]: E0527 03:20:02.459524 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.459542 kubelet[2684]: W0527 03:20:02.459539 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.459617 kubelet[2684]: E0527 03:20:02.459577 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.459776 kubelet[2684]: E0527 03:20:02.459760 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.459776 kubelet[2684]: W0527 03:20:02.459774 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.459851 kubelet[2684]: E0527 03:20:02.459789 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.460028 kubelet[2684]: E0527 03:20:02.460014 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.460028 kubelet[2684]: W0527 03:20:02.460025 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.460084 kubelet[2684]: E0527 03:20:02.460040 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.460264 kubelet[2684]: E0527 03:20:02.460248 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.460264 kubelet[2684]: W0527 03:20:02.460261 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.460345 kubelet[2684]: E0527 03:20:02.460276 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.460492 kubelet[2684]: E0527 03:20:02.460475 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.460492 kubelet[2684]: W0527 03:20:02.460488 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.460576 kubelet[2684]: E0527 03:20:02.460529 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.460679 kubelet[2684]: E0527 03:20:02.460662 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.460679 kubelet[2684]: W0527 03:20:02.460676 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.460768 kubelet[2684]: E0527 03:20:02.460743 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.460932 kubelet[2684]: E0527 03:20:02.460916 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.460964 kubelet[2684]: W0527 03:20:02.460930 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.460989 kubelet[2684]: E0527 03:20:02.460974 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.461194 kubelet[2684]: E0527 03:20:02.461111 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.461194 kubelet[2684]: W0527 03:20:02.461123 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.461194 kubelet[2684]: E0527 03:20:02.461154 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.461366 kubelet[2684]: E0527 03:20:02.461336 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.461366 kubelet[2684]: W0527 03:20:02.461362 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.461416 kubelet[2684]: E0527 03:20:02.461393 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.461625 kubelet[2684]: E0527 03:20:02.461609 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.461625 kubelet[2684]: W0527 03:20:02.461623 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.461688 kubelet[2684]: E0527 03:20:02.461638 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.461893 kubelet[2684]: E0527 03:20:02.461867 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.461893 kubelet[2684]: W0527 03:20:02.461884 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.461949 kubelet[2684]: E0527 03:20:02.461904 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.462187 kubelet[2684]: E0527 03:20:02.462172 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.462187 kubelet[2684]: W0527 03:20:02.462185 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.462248 kubelet[2684]: E0527 03:20:02.462202 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.462412 kubelet[2684]: E0527 03:20:02.462398 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.462412 kubelet[2684]: W0527 03:20:02.462409 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.462589 kubelet[2684]: E0527 03:20:02.462458 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.462653 kubelet[2684]: E0527 03:20:02.462629 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.462653 kubelet[2684]: W0527 03:20:02.462638 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.462739 kubelet[2684]: E0527 03:20:02.462660 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.463018 kubelet[2684]: E0527 03:20:02.462997 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.463066 kubelet[2684]: W0527 03:20:02.463047 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.463090 kubelet[2684]: E0527 03:20:02.463071 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.463318 kubelet[2684]: E0527 03:20:02.463303 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.463366 kubelet[2684]: W0527 03:20:02.463349 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.463412 kubelet[2684]: E0527 03:20:02.463379 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.463636 kubelet[2684]: E0527 03:20:02.463621 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.463636 kubelet[2684]: W0527 03:20:02.463635 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.463713 kubelet[2684]: E0527 03:20:02.463674 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.463873 kubelet[2684]: E0527 03:20:02.463856 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.463873 kubelet[2684]: W0527 03:20:02.463867 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.463971 kubelet[2684]: E0527 03:20:02.463877 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.470473 kubelet[2684]: E0527 03:20:02.470423 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.470544 kubelet[2684]: W0527 03:20:02.470484 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.470544 kubelet[2684]: E0527 03:20:02.470499 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:02.485128 kubelet[2684]: E0527 03:20:02.485103 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:02.485128 kubelet[2684]: W0527 03:20:02.485122 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:02.485233 kubelet[2684]: E0527 03:20:02.485134 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:02.496048 containerd[1564]: time="2025-05-27T03:20:02.495990300Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-657779c6d5-kdtjc,Uid:c6db46dd-4aa2-4602-9250-ae156d5ddb5a,Namespace:calico-system,Attempt:0,} returns sandbox id \"43528414385075c5d230078aa4dea29b5170b177c266d62685b62fd7a64521f6\"" May 27 03:20:02.505383 containerd[1564]: time="2025-05-27T03:20:02.505335802Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\"" May 27 03:20:02.508965 containerd[1564]: time="2025-05-27T03:20:02.508330201Z" level=info msg="connecting to shim 8b5999229a480e416ccfc473a74bb25b465077eced59be54f5837fe7b69152a1" address="unix:///run/containerd/s/920982042a907e997fcbf87b47dd807d4f8767db6a5c8260a279bcdc66cd159d" namespace=k8s.io protocol=ttrpc version=3 May 27 03:20:02.539619 systemd[1]: Started cri-containerd-8b5999229a480e416ccfc473a74bb25b465077eced59be54f5837fe7b69152a1.scope - libcontainer container 8b5999229a480e416ccfc473a74bb25b465077eced59be54f5837fe7b69152a1. 
May 27 03:20:02.566904 containerd[1564]: time="2025-05-27T03:20:02.566857031Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kzpnz,Uid:381e90e6-b254-4672-9ef1-8abc6af35c34,Namespace:calico-system,Attempt:0,} returns sandbox id \"8b5999229a480e416ccfc473a74bb25b465077eced59be54f5837fe7b69152a1\"" May 27 03:20:04.629479 kubelet[2684]: E0527 03:20:04.629393 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-86zzn" podUID="e359ce44-55b5-46e1-a044-9a6f462b1bb9" May 27 03:20:04.838214 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1789042578.mount: Deactivated successfully. May 27 03:20:05.163852 containerd[1564]: time="2025-05-27T03:20:05.163787926Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:20:05.164683 containerd[1564]: time="2025-05-27T03:20:05.164628510Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.0: active requests=0, bytes read=35158669" May 27 03:20:05.165932 containerd[1564]: time="2025-05-27T03:20:05.165897733Z" level=info msg="ImageCreate event name:\"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:20:05.169174 containerd[1564]: time="2025-05-27T03:20:05.169129584Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:20:05.169728 containerd[1564]: time="2025-05-27T03:20:05.169689800Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.0\" with image id 
\"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\", size \"35158523\" in 2.664313501s" May 27 03:20:05.169771 containerd[1564]: time="2025-05-27T03:20:05.169729113Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\" returns image reference \"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\"" May 27 03:20:05.170847 containerd[1564]: time="2025-05-27T03:20:05.170777189Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\"" May 27 03:20:05.179762 containerd[1564]: time="2025-05-27T03:20:05.179700527Z" level=info msg="CreateContainer within sandbox \"43528414385075c5d230078aa4dea29b5170b177c266d62685b62fd7a64521f6\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 27 03:20:05.194364 containerd[1564]: time="2025-05-27T03:20:05.194299041Z" level=info msg="Container a21c19d9778fba644506543930257984a942dfb44f853df913f7c07c03b87390: CDI devices from CRI Config.CDIDevices: []" May 27 03:20:05.202234 containerd[1564]: time="2025-05-27T03:20:05.202183772Z" level=info msg="CreateContainer within sandbox \"43528414385075c5d230078aa4dea29b5170b177c266d62685b62fd7a64521f6\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"a21c19d9778fba644506543930257984a942dfb44f853df913f7c07c03b87390\"" May 27 03:20:05.202818 containerd[1564]: time="2025-05-27T03:20:05.202760630Z" level=info msg="StartContainer for \"a21c19d9778fba644506543930257984a942dfb44f853df913f7c07c03b87390\"" May 27 03:20:05.203836 containerd[1564]: time="2025-05-27T03:20:05.203788006Z" level=info msg="connecting to shim a21c19d9778fba644506543930257984a942dfb44f853df913f7c07c03b87390" address="unix:///run/containerd/s/1f909b33cfaa4c0acd092b9e7c5d5b306d38098506339c222d2823439c3566f0" protocol=ttrpc version=3 May 27 
03:20:05.227635 systemd[1]: Started cri-containerd-a21c19d9778fba644506543930257984a942dfb44f853df913f7c07c03b87390.scope - libcontainer container a21c19d9778fba644506543930257984a942dfb44f853df913f7c07c03b87390. May 27 03:20:05.283194 containerd[1564]: time="2025-05-27T03:20:05.283144141Z" level=info msg="StartContainer for \"a21c19d9778fba644506543930257984a942dfb44f853df913f7c07c03b87390\" returns successfully" May 27 03:20:05.703020 kubelet[2684]: I0527 03:20:05.702421 2684 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-657779c6d5-kdtjc" podStartSLOduration=2.036656348 podStartE2EDuration="4.702391769s" podCreationTimestamp="2025-05-27 03:20:01 +0000 UTC" firstStartedPulling="2025-05-27 03:20:02.504807496 +0000 UTC m=+17.962847723" lastFinishedPulling="2025-05-27 03:20:05.170542917 +0000 UTC m=+20.628583144" observedRunningTime="2025-05-27 03:20:05.702262466 +0000 UTC m=+21.160302713" watchObservedRunningTime="2025-05-27 03:20:05.702391769 +0000 UTC m=+21.160431996" May 27 03:20:05.755901 kubelet[2684]: E0527 03:20:05.755841 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.755901 kubelet[2684]: W0527 03:20:05.755874 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.755901 kubelet[2684]: E0527 03:20:05.755898 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:05.756162 kubelet[2684]: E0527 03:20:05.756148 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.756162 kubelet[2684]: W0527 03:20:05.756156 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.756162 kubelet[2684]: E0527 03:20:05.756164 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:05.756379 kubelet[2684]: E0527 03:20:05.756347 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.756379 kubelet[2684]: W0527 03:20:05.756359 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.756379 kubelet[2684]: E0527 03:20:05.756367 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:05.756652 kubelet[2684]: E0527 03:20:05.756621 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.756652 kubelet[2684]: W0527 03:20:05.756634 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.756652 kubelet[2684]: E0527 03:20:05.756643 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:05.756842 kubelet[2684]: E0527 03:20:05.756823 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.756842 kubelet[2684]: W0527 03:20:05.756834 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.756842 kubelet[2684]: E0527 03:20:05.756842 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:05.757018 kubelet[2684]: E0527 03:20:05.757001 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.757018 kubelet[2684]: W0527 03:20:05.757011 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.757018 kubelet[2684]: E0527 03:20:05.757019 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:05.757198 kubelet[2684]: E0527 03:20:05.757178 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.757198 kubelet[2684]: W0527 03:20:05.757189 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.757198 kubelet[2684]: E0527 03:20:05.757197 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:05.757389 kubelet[2684]: E0527 03:20:05.757371 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.757389 kubelet[2684]: W0527 03:20:05.757382 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.757389 kubelet[2684]: E0527 03:20:05.757390 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:05.757605 kubelet[2684]: E0527 03:20:05.757587 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.757605 kubelet[2684]: W0527 03:20:05.757598 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.757605 kubelet[2684]: E0527 03:20:05.757606 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:05.757785 kubelet[2684]: E0527 03:20:05.757768 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.757785 kubelet[2684]: W0527 03:20:05.757778 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.757785 kubelet[2684]: E0527 03:20:05.757786 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:05.757963 kubelet[2684]: E0527 03:20:05.757946 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.757963 kubelet[2684]: W0527 03:20:05.757957 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.758013 kubelet[2684]: E0527 03:20:05.757965 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:05.758150 kubelet[2684]: E0527 03:20:05.758133 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.758150 kubelet[2684]: W0527 03:20:05.758145 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.758201 kubelet[2684]: E0527 03:20:05.758153 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:05.758342 kubelet[2684]: E0527 03:20:05.758325 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.758342 kubelet[2684]: W0527 03:20:05.758336 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.758394 kubelet[2684]: E0527 03:20:05.758343 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:05.758562 kubelet[2684]: E0527 03:20:05.758543 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.758562 kubelet[2684]: W0527 03:20:05.758555 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.758562 kubelet[2684]: E0527 03:20:05.758570 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:05.758756 kubelet[2684]: E0527 03:20:05.758736 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.758756 kubelet[2684]: W0527 03:20:05.758747 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.758756 kubelet[2684]: E0527 03:20:05.758756 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:05.781342 kubelet[2684]: E0527 03:20:05.781307 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.781342 kubelet[2684]: W0527 03:20:05.781334 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.781454 kubelet[2684]: E0527 03:20:05.781359 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:05.781624 kubelet[2684]: E0527 03:20:05.781606 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.781624 kubelet[2684]: W0527 03:20:05.781619 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.781689 kubelet[2684]: E0527 03:20:05.781635 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:05.782088 kubelet[2684]: E0527 03:20:05.782061 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.782088 kubelet[2684]: W0527 03:20:05.782078 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.782158 kubelet[2684]: E0527 03:20:05.782095 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:05.782419 kubelet[2684]: E0527 03:20:05.782365 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.782419 kubelet[2684]: W0527 03:20:05.782397 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.782543 kubelet[2684]: E0527 03:20:05.782472 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:05.782815 kubelet[2684]: E0527 03:20:05.782758 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.782815 kubelet[2684]: W0527 03:20:05.782786 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.782815 kubelet[2684]: E0527 03:20:05.782805 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:05.783150 kubelet[2684]: E0527 03:20:05.783127 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.783150 kubelet[2684]: W0527 03:20:05.783146 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.783242 kubelet[2684]: E0527 03:20:05.783194 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:05.783487 kubelet[2684]: E0527 03:20:05.783451 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.783487 kubelet[2684]: W0527 03:20:05.783482 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.783590 kubelet[2684]: E0527 03:20:05.783530 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:05.783820 kubelet[2684]: E0527 03:20:05.783748 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.783820 kubelet[2684]: W0527 03:20:05.783763 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.783820 kubelet[2684]: E0527 03:20:05.783791 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:05.784045 kubelet[2684]: E0527 03:20:05.784027 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.784045 kubelet[2684]: W0527 03:20:05.784039 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.784089 kubelet[2684]: E0527 03:20:05.784054 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:05.784406 kubelet[2684]: E0527 03:20:05.784380 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.784406 kubelet[2684]: W0527 03:20:05.784402 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.784556 kubelet[2684]: E0527 03:20:05.784428 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:05.785010 kubelet[2684]: E0527 03:20:05.784908 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.785010 kubelet[2684]: W0527 03:20:05.784929 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.785010 kubelet[2684]: E0527 03:20:05.784953 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:05.785324 kubelet[2684]: E0527 03:20:05.785295 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.785324 kubelet[2684]: W0527 03:20:05.785308 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.785324 kubelet[2684]: E0527 03:20:05.785327 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:05.785627 kubelet[2684]: E0527 03:20:05.785547 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.785627 kubelet[2684]: W0527 03:20:05.785555 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.785627 kubelet[2684]: E0527 03:20:05.785598 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:05.785847 kubelet[2684]: E0527 03:20:05.785829 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.785847 kubelet[2684]: W0527 03:20:05.785843 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.785915 kubelet[2684]: E0527 03:20:05.785876 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:05.786205 kubelet[2684]: E0527 03:20:05.786176 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.786282 kubelet[2684]: W0527 03:20:05.786196 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.786339 kubelet[2684]: E0527 03:20:05.786310 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:05.786607 kubelet[2684]: E0527 03:20:05.786583 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.786607 kubelet[2684]: W0527 03:20:05.786598 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.786668 kubelet[2684]: E0527 03:20:05.786615 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:05.786849 kubelet[2684]: E0527 03:20:05.786827 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.786849 kubelet[2684]: W0527 03:20:05.786840 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.786849 kubelet[2684]: E0527 03:20:05.786849 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:20:05.787215 kubelet[2684]: E0527 03:20:05.787193 2684 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:20:05.787215 kubelet[2684]: W0527 03:20:05.787206 2684 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:20:05.787215 kubelet[2684]: E0527 03:20:05.787215 2684 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:20:06.407458 containerd[1564]: time="2025-05-27T03:20:06.407375124Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:20:06.408412 containerd[1564]: time="2025-05-27T03:20:06.408373175Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0: active requests=0, bytes read=4441619" May 27 03:20:06.409982 containerd[1564]: time="2025-05-27T03:20:06.409927644Z" level=info msg="ImageCreate event name:\"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:20:06.412487 containerd[1564]: time="2025-05-27T03:20:06.412452551Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:20:06.413420 containerd[1564]: time="2025-05-27T03:20:06.413387032Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" with image id \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\", size \"5934282\" in 1.242575818s" May 27 03:20:06.413582 containerd[1564]: time="2025-05-27T03:20:06.413423440Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" returns image reference \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\"" May 27 03:20:06.415957 containerd[1564]: time="2025-05-27T03:20:06.415660455Z" level=info msg="CreateContainer within sandbox \"8b5999229a480e416ccfc473a74bb25b465077eced59be54f5837fe7b69152a1\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 27 03:20:06.426902 containerd[1564]: time="2025-05-27T03:20:06.426825732Z" level=info msg="Container 87038f5275bfd7996fba99d52fab426382efcdd9b2a60e07cdaf5047192cb7e9: CDI devices from CRI Config.CDIDevices: []" May 27 03:20:06.437117 containerd[1564]: time="2025-05-27T03:20:06.437064965Z" level=info msg="CreateContainer within sandbox \"8b5999229a480e416ccfc473a74bb25b465077eced59be54f5837fe7b69152a1\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"87038f5275bfd7996fba99d52fab426382efcdd9b2a60e07cdaf5047192cb7e9\"" May 27 03:20:06.437733 containerd[1564]: time="2025-05-27T03:20:06.437710561Z" level=info msg="StartContainer for \"87038f5275bfd7996fba99d52fab426382efcdd9b2a60e07cdaf5047192cb7e9\"" May 27 03:20:06.439858 containerd[1564]: time="2025-05-27T03:20:06.439827249Z" level=info msg="connecting to shim 87038f5275bfd7996fba99d52fab426382efcdd9b2a60e07cdaf5047192cb7e9" address="unix:///run/containerd/s/920982042a907e997fcbf87b47dd807d4f8767db6a5c8260a279bcdc66cd159d" protocol=ttrpc version=3 May 27 03:20:06.471763 systemd[1]: Started cri-containerd-87038f5275bfd7996fba99d52fab426382efcdd9b2a60e07cdaf5047192cb7e9.scope - libcontainer container 87038f5275bfd7996fba99d52fab426382efcdd9b2a60e07cdaf5047192cb7e9. May 27 03:20:06.527227 containerd[1564]: time="2025-05-27T03:20:06.527180167Z" level=info msg="StartContainer for \"87038f5275bfd7996fba99d52fab426382efcdd9b2a60e07cdaf5047192cb7e9\" returns successfully" May 27 03:20:06.537597 systemd[1]: cri-containerd-87038f5275bfd7996fba99d52fab426382efcdd9b2a60e07cdaf5047192cb7e9.scope: Deactivated successfully. 
May 27 03:20:06.541090 containerd[1564]: time="2025-05-27T03:20:06.540998012Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87038f5275bfd7996fba99d52fab426382efcdd9b2a60e07cdaf5047192cb7e9\" id:\"87038f5275bfd7996fba99d52fab426382efcdd9b2a60e07cdaf5047192cb7e9\" pid:3378 exited_at:{seconds:1748316006 nanos:540206430}" May 27 03:20:06.541090 containerd[1564]: time="2025-05-27T03:20:06.541061281Z" level=info msg="received exit event container_id:\"87038f5275bfd7996fba99d52fab426382efcdd9b2a60e07cdaf5047192cb7e9\" id:\"87038f5275bfd7996fba99d52fab426382efcdd9b2a60e07cdaf5047192cb7e9\" pid:3378 exited_at:{seconds:1748316006 nanos:540206430}" May 27 03:20:06.568891 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-87038f5275bfd7996fba99d52fab426382efcdd9b2a60e07cdaf5047192cb7e9-rootfs.mount: Deactivated successfully. May 27 03:20:06.629707 kubelet[2684]: E0527 03:20:06.629637 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-86zzn" podUID="e359ce44-55b5-46e1-a044-9a6f462b1bb9" May 27 03:20:06.695600 kubelet[2684]: I0527 03:20:06.695405 2684 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:20:07.700136 containerd[1564]: time="2025-05-27T03:20:07.699848191Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\"" May 27 03:20:08.632099 kubelet[2684]: E0527 03:20:08.632050 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-86zzn" podUID="e359ce44-55b5-46e1-a044-9a6f462b1bb9" May 27 03:20:10.630280 kubelet[2684]: E0527 03:20:10.629748 2684 pod_workers.go:1301] 
"Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-86zzn" podUID="e359ce44-55b5-46e1-a044-9a6f462b1bb9" May 27 03:20:11.164062 containerd[1564]: time="2025-05-27T03:20:11.164004791Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:20:11.164866 containerd[1564]: time="2025-05-27T03:20:11.164812191Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.0: active requests=0, bytes read=70300568" May 27 03:20:11.165953 containerd[1564]: time="2025-05-27T03:20:11.165906340Z" level=info msg="ImageCreate event name:\"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:20:11.167603 containerd[1564]: time="2025-05-27T03:20:11.167568077Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:20:11.168461 containerd[1564]: time="2025-05-27T03:20:11.168403258Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.0\" with image id \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\", size \"71793271\" in 3.468508069s" May 27 03:20:11.168529 containerd[1564]: time="2025-05-27T03:20:11.168463151Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\" returns image reference \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\"" May 27 03:20:11.170651 containerd[1564]: time="2025-05-27T03:20:11.170605061Z" 
level=info msg="CreateContainer within sandbox \"8b5999229a480e416ccfc473a74bb25b465077eced59be54f5837fe7b69152a1\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 27 03:20:11.180645 containerd[1564]: time="2025-05-27T03:20:11.180570585Z" level=info msg="Container c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db: CDI devices from CRI Config.CDIDevices: []" May 27 03:20:11.189894 containerd[1564]: time="2025-05-27T03:20:11.189844216Z" level=info msg="CreateContainer within sandbox \"8b5999229a480e416ccfc473a74bb25b465077eced59be54f5837fe7b69152a1\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db\"" May 27 03:20:11.190380 containerd[1564]: time="2025-05-27T03:20:11.190335100Z" level=info msg="StartContainer for \"c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db\"" May 27 03:20:11.191765 containerd[1564]: time="2025-05-27T03:20:11.191727641Z" level=info msg="connecting to shim c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db" address="unix:///run/containerd/s/920982042a907e997fcbf87b47dd807d4f8767db6a5c8260a279bcdc66cd159d" protocol=ttrpc version=3 May 27 03:20:11.221579 systemd[1]: Started cri-containerd-c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db.scope - libcontainer container c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db. May 27 03:20:11.270070 containerd[1564]: time="2025-05-27T03:20:11.270022839Z" level=info msg="StartContainer for \"c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db\" returns successfully" May 27 03:20:12.426322 systemd[1]: cri-containerd-c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db.scope: Deactivated successfully. 
May 27 03:20:12.426796 systemd[1]: cri-containerd-c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db.scope: Consumed 566ms CPU time, 177.9M memory peak, 3.2M read from disk, 170.9M written to disk. May 27 03:20:12.427706 containerd[1564]: time="2025-05-27T03:20:12.427656114Z" level=info msg="received exit event container_id:\"c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db\" id:\"c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db\" pid:3437 exited_at:{seconds:1748316012 nanos:427406483}" May 27 03:20:12.428079 containerd[1564]: time="2025-05-27T03:20:12.427911033Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db\" id:\"c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db\" pid:3437 exited_at:{seconds:1748316012 nanos:427406483}" May 27 03:20:12.450938 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db-rootfs.mount: Deactivated successfully. May 27 03:20:12.469262 kubelet[2684]: I0527 03:20:12.469221 2684 kubelet_node_status.go:501] "Fast updating node status as it just became ready" May 27 03:20:12.508382 systemd[1]: Created slice kubepods-burstable-poda53c2898_7fdc_4905_9181_77e2226d0ebf.slice - libcontainer container kubepods-burstable-poda53c2898_7fdc_4905_9181_77e2226d0ebf.slice. May 27 03:20:12.516903 systemd[1]: Created slice kubepods-besteffort-pode0243602_5836_4db5_a6a6_65bbd87369a9.slice - libcontainer container kubepods-besteffort-pode0243602_5836_4db5_a6a6_65bbd87369a9.slice. May 27 03:20:12.523631 systemd[1]: Created slice kubepods-besteffort-podc801dbfb_f6db_41ad_af94_26a4c142cc32.slice - libcontainer container kubepods-besteffort-podc801dbfb_f6db_41ad_af94_26a4c142cc32.slice. 
May 27 03:20:12.528648 kubelet[2684]: I0527 03:20:12.528571 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vvgx\" (UniqueName: \"kubernetes.io/projected/d9e15705-91b0-4fa0-9d68-b56dc4cb8520-kube-api-access-2vvgx\") pod \"coredns-668d6bf9bc-44gpm\" (UID: \"d9e15705-91b0-4fa0-9d68-b56dc4cb8520\") " pod="kube-system/coredns-668d6bf9bc-44gpm" May 27 03:20:12.528648 kubelet[2684]: I0527 03:20:12.528620 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvbxf\" (UniqueName: \"kubernetes.io/projected/a53c2898-7fdc-4905-9181-77e2226d0ebf-kube-api-access-xvbxf\") pod \"coredns-668d6bf9bc-d7ml8\" (UID: \"a53c2898-7fdc-4905-9181-77e2226d0ebf\") " pod="kube-system/coredns-668d6bf9bc-d7ml8" May 27 03:20:12.528648 kubelet[2684]: I0527 03:20:12.528645 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9e15705-91b0-4fa0-9d68-b56dc4cb8520-config-volume\") pod \"coredns-668d6bf9bc-44gpm\" (UID: \"d9e15705-91b0-4fa0-9d68-b56dc4cb8520\") " pod="kube-system/coredns-668d6bf9bc-44gpm" May 27 03:20:12.528788 kubelet[2684]: I0527 03:20:12.528667 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a883863a-d79b-4a80-911d-ab857d7d891b-goldmane-ca-bundle\") pod \"goldmane-78d55f7ddc-nkjnr\" (UID: \"a883863a-d79b-4a80-911d-ab857d7d891b\") " pod="calico-system/goldmane-78d55f7ddc-nkjnr" May 27 03:20:12.528788 kubelet[2684]: I0527 03:20:12.528688 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a53c2898-7fdc-4905-9181-77e2226d0ebf-config-volume\") pod \"coredns-668d6bf9bc-d7ml8\" (UID: \"a53c2898-7fdc-4905-9181-77e2226d0ebf\") " 
pod="kube-system/coredns-668d6bf9bc-d7ml8" May 27 03:20:12.528788 kubelet[2684]: I0527 03:20:12.528709 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa-calico-apiserver-certs\") pod \"calico-apiserver-bfbf6b5b6-hk6ml\" (UID: \"6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa\") " pod="calico-apiserver/calico-apiserver-bfbf6b5b6-hk6ml" May 27 03:20:12.528788 kubelet[2684]: I0527 03:20:12.528731 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d851576d-45af-4a97-8273-c3a263d1e859-calico-apiserver-certs\") pod \"calico-apiserver-bfbf6b5b6-7ws6p\" (UID: \"d851576d-45af-4a97-8273-c3a263d1e859\") " pod="calico-apiserver/calico-apiserver-bfbf6b5b6-7ws6p" May 27 03:20:12.528788 kubelet[2684]: I0527 03:20:12.528755 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a883863a-d79b-4a80-911d-ab857d7d891b-config\") pod \"goldmane-78d55f7ddc-nkjnr\" (UID: \"a883863a-d79b-4a80-911d-ab857d7d891b\") " pod="calico-system/goldmane-78d55f7ddc-nkjnr" May 27 03:20:12.528908 kubelet[2684]: I0527 03:20:12.528788 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sf2l\" (UniqueName: \"kubernetes.io/projected/a883863a-d79b-4a80-911d-ab857d7d891b-kube-api-access-7sf2l\") pod \"goldmane-78d55f7ddc-nkjnr\" (UID: \"a883863a-d79b-4a80-911d-ab857d7d891b\") " pod="calico-system/goldmane-78d55f7ddc-nkjnr" May 27 03:20:12.528908 kubelet[2684]: I0527 03:20:12.528812 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c801dbfb-f6db-41ad-af94-26a4c142cc32-tigera-ca-bundle\") pod 
\"calico-kube-controllers-77c7f8d9d8-w26p7\" (UID: \"c801dbfb-f6db-41ad-af94-26a4c142cc32\") " pod="calico-system/calico-kube-controllers-77c7f8d9d8-w26p7" May 27 03:20:12.528908 kubelet[2684]: I0527 03:20:12.528835 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/a883863a-d79b-4a80-911d-ab857d7d891b-goldmane-key-pair\") pod \"goldmane-78d55f7ddc-nkjnr\" (UID: \"a883863a-d79b-4a80-911d-ab857d7d891b\") " pod="calico-system/goldmane-78d55f7ddc-nkjnr" May 27 03:20:12.528908 kubelet[2684]: I0527 03:20:12.528862 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4tfp\" (UniqueName: \"kubernetes.io/projected/c801dbfb-f6db-41ad-af94-26a4c142cc32-kube-api-access-s4tfp\") pod \"calico-kube-controllers-77c7f8d9d8-w26p7\" (UID: \"c801dbfb-f6db-41ad-af94-26a4c142cc32\") " pod="calico-system/calico-kube-controllers-77c7f8d9d8-w26p7" May 27 03:20:12.528908 kubelet[2684]: I0527 03:20:12.528886 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e0243602-5836-4db5-a6a6-65bbd87369a9-whisker-backend-key-pair\") pod \"whisker-755c6d7f4d-6s7ww\" (UID: \"e0243602-5836-4db5-a6a6-65bbd87369a9\") " pod="calico-system/whisker-755c6d7f4d-6s7ww" May 27 03:20:12.529027 kubelet[2684]: I0527 03:20:12.528920 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0243602-5836-4db5-a6a6-65bbd87369a9-whisker-ca-bundle\") pod \"whisker-755c6d7f4d-6s7ww\" (UID: \"e0243602-5836-4db5-a6a6-65bbd87369a9\") " pod="calico-system/whisker-755c6d7f4d-6s7ww" May 27 03:20:12.529027 kubelet[2684]: I0527 03:20:12.528941 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dc964\" (UniqueName: \"kubernetes.io/projected/e0243602-5836-4db5-a6a6-65bbd87369a9-kube-api-access-dc964\") pod \"whisker-755c6d7f4d-6s7ww\" (UID: \"e0243602-5836-4db5-a6a6-65bbd87369a9\") " pod="calico-system/whisker-755c6d7f4d-6s7ww" May 27 03:20:12.529027 kubelet[2684]: I0527 03:20:12.528964 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdwnj\" (UniqueName: \"kubernetes.io/projected/d851576d-45af-4a97-8273-c3a263d1e859-kube-api-access-kdwnj\") pod \"calico-apiserver-bfbf6b5b6-7ws6p\" (UID: \"d851576d-45af-4a97-8273-c3a263d1e859\") " pod="calico-apiserver/calico-apiserver-bfbf6b5b6-7ws6p" May 27 03:20:12.529027 kubelet[2684]: I0527 03:20:12.528996 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnpz9\" (UniqueName: \"kubernetes.io/projected/6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa-kube-api-access-mnpz9\") pod \"calico-apiserver-bfbf6b5b6-hk6ml\" (UID: \"6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa\") " pod="calico-apiserver/calico-apiserver-bfbf6b5b6-hk6ml" May 27 03:20:12.530533 systemd[1]: Created slice kubepods-besteffort-poda883863a_d79b_4a80_911d_ab857d7d891b.slice - libcontainer container kubepods-besteffort-poda883863a_d79b_4a80_911d_ab857d7d891b.slice. May 27 03:20:12.535573 systemd[1]: Created slice kubepods-burstable-podd9e15705_91b0_4fa0_9d68_b56dc4cb8520.slice - libcontainer container kubepods-burstable-podd9e15705_91b0_4fa0_9d68_b56dc4cb8520.slice. May 27 03:20:12.540425 systemd[1]: Created slice kubepods-besteffort-podd851576d_45af_4a97_8273_c3a263d1e859.slice - libcontainer container kubepods-besteffort-podd851576d_45af_4a97_8273_c3a263d1e859.slice. May 27 03:20:12.546652 systemd[1]: Created slice kubepods-besteffort-pod6faf2a5b_9ea6_460d_9973_8f32eb2ae9fa.slice - libcontainer container kubepods-besteffort-pod6faf2a5b_9ea6_460d_9973_8f32eb2ae9fa.slice. 
May 27 03:20:12.656868 systemd[1]: Created slice kubepods-besteffort-pode359ce44_55b5_46e1_a044_9a6f462b1bb9.slice - libcontainer container kubepods-besteffort-pode359ce44_55b5_46e1_a044_9a6f462b1bb9.slice. May 27 03:20:12.659226 containerd[1564]: time="2025-05-27T03:20:12.659174835Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-86zzn,Uid:e359ce44-55b5-46e1-a044-9a6f462b1bb9,Namespace:calico-system,Attempt:0,}" May 27 03:20:12.813039 containerd[1564]: time="2025-05-27T03:20:12.812894702Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-d7ml8,Uid:a53c2898-7fdc-4905-9181-77e2226d0ebf,Namespace:kube-system,Attempt:0,}" May 27 03:20:12.821047 containerd[1564]: time="2025-05-27T03:20:12.820987237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-755c6d7f4d-6s7ww,Uid:e0243602-5836-4db5-a6a6-65bbd87369a9,Namespace:calico-system,Attempt:0,}" May 27 03:20:12.827865 containerd[1564]: time="2025-05-27T03:20:12.827824382Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77c7f8d9d8-w26p7,Uid:c801dbfb-f6db-41ad-af94-26a4c142cc32,Namespace:calico-system,Attempt:0,}" May 27 03:20:12.833905 containerd[1564]: time="2025-05-27T03:20:12.833835651Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-nkjnr,Uid:a883863a-d79b-4a80-911d-ab857d7d891b,Namespace:calico-system,Attempt:0,}" May 27 03:20:12.838775 containerd[1564]: time="2025-05-27T03:20:12.838663264Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-44gpm,Uid:d9e15705-91b0-4fa0-9d68-b56dc4cb8520,Namespace:kube-system,Attempt:0,}" May 27 03:20:12.844372 containerd[1564]: time="2025-05-27T03:20:12.844320628Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bfbf6b5b6-7ws6p,Uid:d851576d-45af-4a97-8273-c3a263d1e859,Namespace:calico-apiserver,Attempt:0,}" May 27 03:20:12.851187 containerd[1564]: time="2025-05-27T03:20:12.851144387Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bfbf6b5b6-hk6ml,Uid:6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa,Namespace:calico-apiserver,Attempt:0,}" May 27 03:20:13.052982 kubelet[2684]: I0527 03:20:13.052929 2684 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:20:13.512352 containerd[1564]: time="2025-05-27T03:20:13.512293685Z" level=error msg="Failed to destroy network for sandbox \"f20655bb334015ed20058201da884a2b63fb0a1ac35240ffe55ae833968967ea\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.514823 containerd[1564]: time="2025-05-27T03:20:13.514755336Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-44gpm,Uid:d9e15705-91b0-4fa0-9d68-b56dc4cb8520,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f20655bb334015ed20058201da884a2b63fb0a1ac35240ffe55ae833968967ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.517005 systemd[1]: run-netns-cni\x2d529bf4d7\x2d018b\x2d1ba6\x2d617f\x2d54e45691525e.mount: Deactivated successfully. May 27 03:20:13.521235 containerd[1564]: time="2025-05-27T03:20:13.521142530Z" level=error msg="Failed to destroy network for sandbox \"60d8a9cec98080d39502719eac159fd31497702c5b118a751c0e50bc2fb14b5b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.525976 systemd[1]: run-netns-cni\x2dd89eb1cd\x2d45e1\x2d8c01\x2dda98\x2d46169a47af84.mount: Deactivated successfully. 
May 27 03:20:13.530915 containerd[1564]: time="2025-05-27T03:20:13.530825536Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-755c6d7f4d-6s7ww,Uid:e0243602-5836-4db5-a6a6-65bbd87369a9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"60d8a9cec98080d39502719eac159fd31497702c5b118a751c0e50bc2fb14b5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.533412 kubelet[2684]: E0527 03:20:13.533360 2684 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f20655bb334015ed20058201da884a2b63fb0a1ac35240ffe55ae833968967ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.533890 kubelet[2684]: E0527 03:20:13.533473 2684 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f20655bb334015ed20058201da884a2b63fb0a1ac35240ffe55ae833968967ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-44gpm" May 27 03:20:13.533890 kubelet[2684]: E0527 03:20:13.533497 2684 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f20655bb334015ed20058201da884a2b63fb0a1ac35240ffe55ae833968967ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-668d6bf9bc-44gpm" May 27 03:20:13.533890 kubelet[2684]: E0527 03:20:13.533547 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-44gpm_kube-system(d9e15705-91b0-4fa0-9d68-b56dc4cb8520)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-44gpm_kube-system(d9e15705-91b0-4fa0-9d68-b56dc4cb8520)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f20655bb334015ed20058201da884a2b63fb0a1ac35240ffe55ae833968967ea\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-44gpm" podUID="d9e15705-91b0-4fa0-9d68-b56dc4cb8520" May 27 03:20:13.534005 containerd[1564]: time="2025-05-27T03:20:13.533500647Z" level=error msg="Failed to destroy network for sandbox \"8d03d532e10f552fe3591014f1d755fe294a08e361a231ade28000fa94f0a482\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.534048 kubelet[2684]: E0527 03:20:13.533586 2684 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60d8a9cec98080d39502719eac159fd31497702c5b118a751c0e50bc2fb14b5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.534048 kubelet[2684]: E0527 03:20:13.533609 2684 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60d8a9cec98080d39502719eac159fd31497702c5b118a751c0e50bc2fb14b5b\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-755c6d7f4d-6s7ww" May 27 03:20:13.534048 kubelet[2684]: E0527 03:20:13.533622 2684 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60d8a9cec98080d39502719eac159fd31497702c5b118a751c0e50bc2fb14b5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-755c6d7f4d-6s7ww" May 27 03:20:13.534542 kubelet[2684]: E0527 03:20:13.533646 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-755c6d7f4d-6s7ww_calico-system(e0243602-5836-4db5-a6a6-65bbd87369a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-755c6d7f4d-6s7ww_calico-system(e0243602-5836-4db5-a6a6-65bbd87369a9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"60d8a9cec98080d39502719eac159fd31497702c5b118a751c0e50bc2fb14b5b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-755c6d7f4d-6s7ww" podUID="e0243602-5836-4db5-a6a6-65bbd87369a9" May 27 03:20:13.536098 containerd[1564]: time="2025-05-27T03:20:13.536064480Z" level=error msg="Failed to destroy network for sandbox \"eb953fa5077f89016a076a3f3fed772f2ab9c0836c7861c70b87f5ef2cdb3962\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.536709 systemd[1]: run-netns-cni\x2df192d7ba\x2d4f0d\x2d5796\x2d4e1e\x2d1703608768d1.mount: Deactivated 
successfully. May 27 03:20:13.539735 containerd[1564]: time="2025-05-27T03:20:13.539571497Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-d7ml8,Uid:a53c2898-7fdc-4905-9181-77e2226d0ebf,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d03d532e10f552fe3591014f1d755fe294a08e361a231ade28000fa94f0a482\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.541244 kubelet[2684]: E0527 03:20:13.540672 2684 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d03d532e10f552fe3591014f1d755fe294a08e361a231ade28000fa94f0a482\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.541244 kubelet[2684]: E0527 03:20:13.540734 2684 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d03d532e10f552fe3591014f1d755fe294a08e361a231ade28000fa94f0a482\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-d7ml8" May 27 03:20:13.541244 kubelet[2684]: E0527 03:20:13.540756 2684 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d03d532e10f552fe3591014f1d755fe294a08e361a231ade28000fa94f0a482\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-668d6bf9bc-d7ml8" May 27 03:20:13.541050 systemd[1]: run-netns-cni\x2da5ad7048\x2d7962\x2d173f\x2dc143\x2d710e0394a9a3.mount: Deactivated successfully. May 27 03:20:13.541410 kubelet[2684]: E0527 03:20:13.540795 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-d7ml8_kube-system(a53c2898-7fdc-4905-9181-77e2226d0ebf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-d7ml8_kube-system(a53c2898-7fdc-4905-9181-77e2226d0ebf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8d03d532e10f552fe3591014f1d755fe294a08e361a231ade28000fa94f0a482\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-d7ml8" podUID="a53c2898-7fdc-4905-9181-77e2226d0ebf" May 27 03:20:13.547337 containerd[1564]: time="2025-05-27T03:20:13.547188816Z" level=error msg="Failed to destroy network for sandbox \"6b9e5d0e6a0141f3f15bc97f92f0f39953d172316a287b3261401f6debf99dd3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.548491 containerd[1564]: time="2025-05-27T03:20:13.548158039Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bfbf6b5b6-7ws6p,Uid:d851576d-45af-4a97-8273-c3a263d1e859,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb953fa5077f89016a076a3f3fed772f2ab9c0836c7861c70b87f5ef2cdb3962\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.548828 kubelet[2684]: E0527 
03:20:13.548472 2684 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb953fa5077f89016a076a3f3fed772f2ab9c0836c7861c70b87f5ef2cdb3962\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.548828 kubelet[2684]: E0527 03:20:13.548523 2684 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb953fa5077f89016a076a3f3fed772f2ab9c0836c7861c70b87f5ef2cdb3962\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-bfbf6b5b6-7ws6p" May 27 03:20:13.548828 kubelet[2684]: E0527 03:20:13.548572 2684 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb953fa5077f89016a076a3f3fed772f2ab9c0836c7861c70b87f5ef2cdb3962\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-bfbf6b5b6-7ws6p" May 27 03:20:13.548925 kubelet[2684]: E0527 03:20:13.548633 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-bfbf6b5b6-7ws6p_calico-apiserver(d851576d-45af-4a97-8273-c3a263d1e859)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-bfbf6b5b6-7ws6p_calico-apiserver(d851576d-45af-4a97-8273-c3a263d1e859)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"eb953fa5077f89016a076a3f3fed772f2ab9c0836c7861c70b87f5ef2cdb3962\\\": plugin type=\\\"calico\\\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-bfbf6b5b6-7ws6p" podUID="d851576d-45af-4a97-8273-c3a263d1e859" May 27 03:20:13.551169 containerd[1564]: time="2025-05-27T03:20:13.551124119Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bfbf6b5b6-hk6ml,Uid:6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b9e5d0e6a0141f3f15bc97f92f0f39953d172316a287b3261401f6debf99dd3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.551379 kubelet[2684]: E0527 03:20:13.551307 2684 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b9e5d0e6a0141f3f15bc97f92f0f39953d172316a287b3261401f6debf99dd3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.551416 kubelet[2684]: E0527 03:20:13.551384 2684 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b9e5d0e6a0141f3f15bc97f92f0f39953d172316a287b3261401f6debf99dd3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-bfbf6b5b6-hk6ml" May 27 03:20:13.551416 kubelet[2684]: E0527 03:20:13.551401 2684 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"6b9e5d0e6a0141f3f15bc97f92f0f39953d172316a287b3261401f6debf99dd3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-bfbf6b5b6-hk6ml" May 27 03:20:13.551537 kubelet[2684]: E0527 03:20:13.551428 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-bfbf6b5b6-hk6ml_calico-apiserver(6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-bfbf6b5b6-hk6ml_calico-apiserver(6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6b9e5d0e6a0141f3f15bc97f92f0f39953d172316a287b3261401f6debf99dd3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-bfbf6b5b6-hk6ml" podUID="6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa" May 27 03:20:13.563564 containerd[1564]: time="2025-05-27T03:20:13.563491113Z" level=error msg="Failed to destroy network for sandbox \"734697b10d742e17ba7db2f1ba357e40d811ff5b7b95ee0e278c91fde119c964\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.563996 containerd[1564]: time="2025-05-27T03:20:13.563951939Z" level=error msg="Failed to destroy network for sandbox \"7003f8c8ac419e4332d95625f66274b88ebff60f33ae7f14ad99c482382c87bd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.564958 containerd[1564]: time="2025-05-27T03:20:13.564886988Z" 
level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77c7f8d9d8-w26p7,Uid:c801dbfb-f6db-41ad-af94-26a4c142cc32,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"734697b10d742e17ba7db2f1ba357e40d811ff5b7b95ee0e278c91fde119c964\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.565217 kubelet[2684]: E0527 03:20:13.565178 2684 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"734697b10d742e17ba7db2f1ba357e40d811ff5b7b95ee0e278c91fde119c964\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.565284 kubelet[2684]: E0527 03:20:13.565235 2684 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"734697b10d742e17ba7db2f1ba357e40d811ff5b7b95ee0e278c91fde119c964\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-77c7f8d9d8-w26p7" May 27 03:20:13.565284 kubelet[2684]: E0527 03:20:13.565257 2684 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"734697b10d742e17ba7db2f1ba357e40d811ff5b7b95ee0e278c91fde119c964\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-77c7f8d9d8-w26p7" May 27 03:20:13.565375 
kubelet[2684]: E0527 03:20:13.565310 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-77c7f8d9d8-w26p7_calico-system(c801dbfb-f6db-41ad-af94-26a4c142cc32)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-77c7f8d9d8-w26p7_calico-system(c801dbfb-f6db-41ad-af94-26a4c142cc32)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"734697b10d742e17ba7db2f1ba357e40d811ff5b7b95ee0e278c91fde119c964\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-77c7f8d9d8-w26p7" podUID="c801dbfb-f6db-41ad-af94-26a4c142cc32" May 27 03:20:13.565929 containerd[1564]: time="2025-05-27T03:20:13.565855650Z" level=error msg="Failed to destroy network for sandbox \"24140b79e4dc97f99179f2d10f90e8e1aaf5d888a69aa4e0eadf1d2a10ccfc3f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.565929 containerd[1564]: time="2025-05-27T03:20:13.565902619Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-86zzn,Uid:e359ce44-55b5-46e1-a044-9a6f462b1bb9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7003f8c8ac419e4332d95625f66274b88ebff60f33ae7f14ad99c482382c87bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.566234 kubelet[2684]: E0527 03:20:13.566154 2684 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"7003f8c8ac419e4332d95625f66274b88ebff60f33ae7f14ad99c482382c87bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.566281 kubelet[2684]: E0527 03:20:13.566243 2684 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7003f8c8ac419e4332d95625f66274b88ebff60f33ae7f14ad99c482382c87bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-86zzn" May 27 03:20:13.566281 kubelet[2684]: E0527 03:20:13.566267 2684 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7003f8c8ac419e4332d95625f66274b88ebff60f33ae7f14ad99c482382c87bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-86zzn" May 27 03:20:13.566351 kubelet[2684]: E0527 03:20:13.566317 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-86zzn_calico-system(e359ce44-55b5-46e1-a044-9a6f462b1bb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-86zzn_calico-system(e359ce44-55b5-46e1-a044-9a6f462b1bb9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7003f8c8ac419e4332d95625f66274b88ebff60f33ae7f14ad99c482382c87bd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-86zzn" 
podUID="e359ce44-55b5-46e1-a044-9a6f462b1bb9" May 27 03:20:13.567631 containerd[1564]: time="2025-05-27T03:20:13.567590063Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-nkjnr,Uid:a883863a-d79b-4a80-911d-ab857d7d891b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"24140b79e4dc97f99179f2d10f90e8e1aaf5d888a69aa4e0eadf1d2a10ccfc3f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.567799 kubelet[2684]: E0527 03:20:13.567771 2684 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24140b79e4dc97f99179f2d10f90e8e1aaf5d888a69aa4e0eadf1d2a10ccfc3f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:13.567862 kubelet[2684]: E0527 03:20:13.567834 2684 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24140b79e4dc97f99179f2d10f90e8e1aaf5d888a69aa4e0eadf1d2a10ccfc3f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-nkjnr" May 27 03:20:13.567902 kubelet[2684]: E0527 03:20:13.567859 2684 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24140b79e4dc97f99179f2d10f90e8e1aaf5d888a69aa4e0eadf1d2a10ccfc3f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-nkjnr" May 27 03:20:13.567935 kubelet[2684]: E0527 03:20:13.567919 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-78d55f7ddc-nkjnr_calico-system(a883863a-d79b-4a80-911d-ab857d7d891b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-78d55f7ddc-nkjnr_calico-system(a883863a-d79b-4a80-911d-ab857d7d891b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"24140b79e4dc97f99179f2d10f90e8e1aaf5d888a69aa4e0eadf1d2a10ccfc3f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:20:13.715456 containerd[1564]: time="2025-05-27T03:20:13.715401140Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\"" May 27 03:20:14.451394 systemd[1]: run-netns-cni\x2ddf99a35f\x2daacc\x2db4ca\x2d3b7b\x2ddf976709899d.mount: Deactivated successfully. May 27 03:20:14.451592 systemd[1]: run-netns-cni\x2d37849379\x2d150d\x2dbd3d\x2d9a9f\x2dace35e9c4d03.mount: Deactivated successfully. May 27 03:20:14.451699 systemd[1]: run-netns-cni\x2d02abed4b\x2d6e99\x2dbd83\x2db5f4\x2db77618ede155.mount: Deactivated successfully. May 27 03:20:14.451808 systemd[1]: run-netns-cni\x2d73add387\x2d5cab\x2d9c7d\x2dff67\x2d1dd618685e63.mount: Deactivated successfully. May 27 03:20:22.432998 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2774945239.mount: Deactivated successfully. 
May 27 03:20:25.755464 kubelet[2684]: E0527 03:20:25.755051 2684 kubelet.go:2573] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="3.125s" May 27 03:20:25.761069 containerd[1564]: time="2025-05-27T03:20:25.760896094Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77c7f8d9d8-w26p7,Uid:c801dbfb-f6db-41ad-af94-26a4c142cc32,Namespace:calico-system,Attempt:0,}" May 27 03:20:25.826615 containerd[1564]: time="2025-05-27T03:20:25.761872778Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-44gpm,Uid:d9e15705-91b0-4fa0-9d68-b56dc4cb8520,Namespace:kube-system,Attempt:0,}" May 27 03:20:25.826615 containerd[1564]: time="2025-05-27T03:20:25.762548537Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-86zzn,Uid:e359ce44-55b5-46e1-a044-9a6f462b1bb9,Namespace:calico-system,Attempt:0,}" May 27 03:20:25.826615 containerd[1564]: time="2025-05-27T03:20:25.762669715Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bfbf6b5b6-hk6ml,Uid:6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa,Namespace:calico-apiserver,Attempt:0,}" May 27 03:20:25.826615 containerd[1564]: time="2025-05-27T03:20:25.762842088Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-nkjnr,Uid:a883863a-d79b-4a80-911d-ab857d7d891b,Namespace:calico-system,Attempt:0,}" May 27 03:20:25.826615 containerd[1564]: time="2025-05-27T03:20:25.763491538Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bfbf6b5b6-7ws6p,Uid:d851576d-45af-4a97-8273-c3a263d1e859,Namespace:calico-apiserver,Attempt:0,}" May 27 03:20:26.115420 containerd[1564]: time="2025-05-27T03:20:26.099561979Z" level=error msg="Failed to destroy network for sandbox \"035a253ce71589f62a3900e78e3d20bd7f50003ecaa59c8efb2553f1cf003a81\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.115572 containerd[1564]: time="2025-05-27T03:20:26.111097503Z" level=error msg="Failed to destroy network for sandbox \"45e5730e17debd49a8171e61adb3a8fdb9d3f842cfee4ce29c62d21369161078\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.126960 containerd[1564]: time="2025-05-27T03:20:26.126868041Z" level=error msg="Failed to destroy network for sandbox \"0733e2f521095132109ca2274794ca423f421b3260b70b43007673d07d9da44d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.190005 containerd[1564]: time="2025-05-27T03:20:26.189945353Z" level=error msg="Failed to destroy network for sandbox \"dd8c980a5d4890651c506ca032721156b18912dd003277a39747c0a722849b7f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.299024 containerd[1564]: time="2025-05-27T03:20:26.298965225Z" level=error msg="Failed to destroy network for sandbox \"7a97bd508df4647afc3d577075ab062f1b9bd04991b400c593ee159a10d04483\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.422776 containerd[1564]: time="2025-05-27T03:20:26.422723439Z" level=error msg="Failed to destroy network for sandbox \"0cd57ddb3e336e98fdeb46e15a75255fa6a9b78cdacade30f71004684cc83ad2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" May 27 03:20:26.447183 containerd[1564]: time="2025-05-27T03:20:26.447129965Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:20:26.491277 containerd[1564]: time="2025-05-27T03:20:26.491149426Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-44gpm,Uid:d9e15705-91b0-4fa0-9d68-b56dc4cb8520,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"45e5730e17debd49a8171e61adb3a8fdb9d3f842cfee4ce29c62d21369161078\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.491760 kubelet[2684]: E0527 03:20:26.491470 2684 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"45e5730e17debd49a8171e61adb3a8fdb9d3f842cfee4ce29c62d21369161078\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.491760 kubelet[2684]: E0527 03:20:26.491545 2684 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"45e5730e17debd49a8171e61adb3a8fdb9d3f842cfee4ce29c62d21369161078\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-44gpm" May 27 03:20:26.491760 kubelet[2684]: E0527 03:20:26.491571 2684 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"45e5730e17debd49a8171e61adb3a8fdb9d3f842cfee4ce29c62d21369161078\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-44gpm" May 27 03:20:26.491877 kubelet[2684]: E0527 03:20:26.491628 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-44gpm_kube-system(d9e15705-91b0-4fa0-9d68-b56dc4cb8520)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-44gpm_kube-system(d9e15705-91b0-4fa0-9d68-b56dc4cb8520)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"45e5730e17debd49a8171e61adb3a8fdb9d3f842cfee4ce29c62d21369161078\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-44gpm" podUID="d9e15705-91b0-4fa0-9d68-b56dc4cb8520" May 27 03:20:26.511959 containerd[1564]: time="2025-05-27T03:20:26.511873587Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77c7f8d9d8-w26p7,Uid:c801dbfb-f6db-41ad-af94-26a4c142cc32,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"035a253ce71589f62a3900e78e3d20bd7f50003ecaa59c8efb2553f1cf003a81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.512203 kubelet[2684]: E0527 03:20:26.512154 2684 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"035a253ce71589f62a3900e78e3d20bd7f50003ecaa59c8efb2553f1cf003a81\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.512300 kubelet[2684]: E0527 03:20:26.512211 2684 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"035a253ce71589f62a3900e78e3d20bd7f50003ecaa59c8efb2553f1cf003a81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-77c7f8d9d8-w26p7" May 27 03:20:26.512300 kubelet[2684]: E0527 03:20:26.512231 2684 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"035a253ce71589f62a3900e78e3d20bd7f50003ecaa59c8efb2553f1cf003a81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-77c7f8d9d8-w26p7" May 27 03:20:26.512391 kubelet[2684]: E0527 03:20:26.512293 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-77c7f8d9d8-w26p7_calico-system(c801dbfb-f6db-41ad-af94-26a4c142cc32)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-77c7f8d9d8-w26p7_calico-system(c801dbfb-f6db-41ad-af94-26a4c142cc32)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"035a253ce71589f62a3900e78e3d20bd7f50003ecaa59c8efb2553f1cf003a81\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-77c7f8d9d8-w26p7" 
podUID="c801dbfb-f6db-41ad-af94-26a4c142cc32" May 27 03:20:26.516102 containerd[1564]: time="2025-05-27T03:20:26.516047525Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-86zzn,Uid:e359ce44-55b5-46e1-a044-9a6f462b1bb9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0733e2f521095132109ca2274794ca423f421b3260b70b43007673d07d9da44d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.516376 kubelet[2684]: E0527 03:20:26.516324 2684 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0733e2f521095132109ca2274794ca423f421b3260b70b43007673d07d9da44d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.516376 kubelet[2684]: E0527 03:20:26.516362 2684 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0733e2f521095132109ca2274794ca423f421b3260b70b43007673d07d9da44d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-86zzn" May 27 03:20:26.516509 kubelet[2684]: E0527 03:20:26.516380 2684 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0733e2f521095132109ca2274794ca423f421b3260b70b43007673d07d9da44d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/csi-node-driver-86zzn" May 27 03:20:26.516509 kubelet[2684]: E0527 03:20:26.516411 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-86zzn_calico-system(e359ce44-55b5-46e1-a044-9a6f462b1bb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-86zzn_calico-system(e359ce44-55b5-46e1-a044-9a6f462b1bb9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0733e2f521095132109ca2274794ca423f421b3260b70b43007673d07d9da44d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-86zzn" podUID="e359ce44-55b5-46e1-a044-9a6f462b1bb9" May 27 03:20:26.519766 containerd[1564]: time="2025-05-27T03:20:26.519701808Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bfbf6b5b6-hk6ml,Uid:6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd8c980a5d4890651c506ca032721156b18912dd003277a39747c0a722849b7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.520036 kubelet[2684]: E0527 03:20:26.519997 2684 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd8c980a5d4890651c506ca032721156b18912dd003277a39747c0a722849b7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.520218 kubelet[2684]: E0527 03:20:26.520050 2684 kuberuntime_sandbox.go:72] "Failed to create 
sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd8c980a5d4890651c506ca032721156b18912dd003277a39747c0a722849b7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-bfbf6b5b6-hk6ml" May 27 03:20:26.520218 kubelet[2684]: E0527 03:20:26.520077 2684 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd8c980a5d4890651c506ca032721156b18912dd003277a39747c0a722849b7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-bfbf6b5b6-hk6ml" May 27 03:20:26.520218 kubelet[2684]: E0527 03:20:26.520126 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-bfbf6b5b6-hk6ml_calico-apiserver(6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-bfbf6b5b6-hk6ml_calico-apiserver(6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dd8c980a5d4890651c506ca032721156b18912dd003277a39747c0a722849b7f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-bfbf6b5b6-hk6ml" podUID="6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa" May 27 03:20:26.521024 containerd[1564]: time="2025-05-27T03:20:26.520937339Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-nkjnr,Uid:a883863a-d79b-4a80-911d-ab857d7d891b,Namespace:calico-system,Attempt:0,} failed, error" 
error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a97bd508df4647afc3d577075ab062f1b9bd04991b400c593ee159a10d04483\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.521233 kubelet[2684]: E0527 03:20:26.521164 2684 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a97bd508df4647afc3d577075ab062f1b9bd04991b400c593ee159a10d04483\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.521304 kubelet[2684]: E0527 03:20:26.521274 2684 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a97bd508df4647afc3d577075ab062f1b9bd04991b400c593ee159a10d04483\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-nkjnr" May 27 03:20:26.521387 kubelet[2684]: E0527 03:20:26.521309 2684 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a97bd508df4647afc3d577075ab062f1b9bd04991b400c593ee159a10d04483\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-nkjnr" May 27 03:20:26.521473 kubelet[2684]: E0527 03:20:26.521377 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-78d55f7ddc-nkjnr_calico-system(a883863a-d79b-4a80-911d-ab857d7d891b)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-78d55f7ddc-nkjnr_calico-system(a883863a-d79b-4a80-911d-ab857d7d891b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7a97bd508df4647afc3d577075ab062f1b9bd04991b400c593ee159a10d04483\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:20:26.522108 containerd[1564]: time="2025-05-27T03:20:26.522043746Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bfbf6b5b6-7ws6p,Uid:d851576d-45af-4a97-8273-c3a263d1e859,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0cd57ddb3e336e98fdeb46e15a75255fa6a9b78cdacade30f71004684cc83ad2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.522361 kubelet[2684]: E0527 03:20:26.522324 2684 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0cd57ddb3e336e98fdeb46e15a75255fa6a9b78cdacade30f71004684cc83ad2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:20:26.522411 kubelet[2684]: E0527 03:20:26.522369 2684 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0cd57ddb3e336e98fdeb46e15a75255fa6a9b78cdacade30f71004684cc83ad2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-bfbf6b5b6-7ws6p" May 27 03:20:26.522466 kubelet[2684]: E0527 03:20:26.522426 2684 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0cd57ddb3e336e98fdeb46e15a75255fa6a9b78cdacade30f71004684cc83ad2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-bfbf6b5b6-7ws6p" May 27 03:20:26.522514 kubelet[2684]: E0527 03:20:26.522491 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-bfbf6b5b6-7ws6p_calico-apiserver(d851576d-45af-4a97-8273-c3a263d1e859)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-bfbf6b5b6-7ws6p_calico-apiserver(d851576d-45af-4a97-8273-c3a263d1e859)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0cd57ddb3e336e98fdeb46e15a75255fa6a9b78cdacade30f71004684cc83ad2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-bfbf6b5b6-7ws6p" podUID="d851576d-45af-4a97-8273-c3a263d1e859" May 27 03:20:26.523683 containerd[1564]: time="2025-05-27T03:20:26.523623844Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.0: active requests=0, bytes read=156396372" May 27 03:20:26.526045 containerd[1564]: time="2025-05-27T03:20:26.525990168Z" level=info msg="ImageCreate event name:\"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:20:26.530007 containerd[1564]: time="2025-05-27T03:20:26.529972006Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:20:26.530637 containerd[1564]: time="2025-05-27T03:20:26.530576932Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.0\" with image id \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\", size \"156396234\" in 12.814892399s" May 27 03:20:26.530637 containerd[1564]: time="2025-05-27T03:20:26.530616878Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\" returns image reference \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\"" May 27 03:20:26.547788 containerd[1564]: time="2025-05-27T03:20:26.547718815Z" level=info msg="CreateContainer within sandbox \"8b5999229a480e416ccfc473a74bb25b465077eced59be54f5837fe7b69152a1\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 27 03:20:26.560809 containerd[1564]: time="2025-05-27T03:20:26.560750570Z" level=info msg="Container 09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe: CDI devices from CRI Config.CDIDevices: []" May 27 03:20:26.577989 containerd[1564]: time="2025-05-27T03:20:26.577914123Z" level=info msg="CreateContainer within sandbox \"8b5999229a480e416ccfc473a74bb25b465077eced59be54f5837fe7b69152a1\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\"" May 27 03:20:26.589340 containerd[1564]: time="2025-05-27T03:20:26.589264059Z" level=info msg="StartContainer for \"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\"" May 27 03:20:26.591165 containerd[1564]: time="2025-05-27T03:20:26.591127728Z" level=info msg="connecting to shim 
09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe" address="unix:///run/containerd/s/920982042a907e997fcbf87b47dd807d4f8767db6a5c8260a279bcdc66cd159d" protocol=ttrpc version=3 May 27 03:20:26.624756 systemd[1]: Started cri-containerd-09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe.scope - libcontainer container 09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe. May 27 03:20:26.833471 containerd[1564]: time="2025-05-27T03:20:26.833333049Z" level=info msg="StartContainer for \"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\" returns successfully" May 27 03:20:26.844197 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 27 03:20:26.844803 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 27 03:20:26.991181 systemd[1]: run-netns-cni\x2dd954c481\x2d9ace\x2d31c4\x2daf79\x2de18ab62f80bf.mount: Deactivated successfully. May 27 03:20:26.991299 systemd[1]: run-netns-cni\x2dca4ed774\x2df96a\x2d2884\x2d1b7a\x2dd5dbd9454d5a.mount: Deactivated successfully. May 27 03:20:26.991371 systemd[1]: run-netns-cni\x2d90be6ee8\x2d17c8\x2de606\x2daa5d\x2d0eea09af5e8f.mount: Deactivated successfully. May 27 03:20:26.991455 systemd[1]: run-netns-cni\x2dda8732a3\x2d788d\x2d03f1\x2d0c1f\x2d6a71a9d7e789.mount: Deactivated successfully. May 27 03:20:26.991525 systemd[1]: run-netns-cni\x2d2495670a\x2d1433\x2da1f0\x2d56df\x2d892e9dd57638.mount: Deactivated successfully. May 27 03:20:26.991593 systemd[1]: run-netns-cni\x2d1098220d\x2d2754\x2db6b6\x2d600e\x2dac59066ad59a.mount: Deactivated successfully. 
May 27 03:20:27.263775 kubelet[2684]: I0527 03:20:27.263498 2684 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e0243602-5836-4db5-a6a6-65bbd87369a9-whisker-backend-key-pair\") pod \"e0243602-5836-4db5-a6a6-65bbd87369a9\" (UID: \"e0243602-5836-4db5-a6a6-65bbd87369a9\") " May 27 03:20:27.263775 kubelet[2684]: I0527 03:20:27.263551 2684 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0243602-5836-4db5-a6a6-65bbd87369a9-whisker-ca-bundle\") pod \"e0243602-5836-4db5-a6a6-65bbd87369a9\" (UID: \"e0243602-5836-4db5-a6a6-65bbd87369a9\") " May 27 03:20:27.263775 kubelet[2684]: I0527 03:20:27.263583 2684 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc964\" (UniqueName: \"kubernetes.io/projected/e0243602-5836-4db5-a6a6-65bbd87369a9-kube-api-access-dc964\") pod \"e0243602-5836-4db5-a6a6-65bbd87369a9\" (UID: \"e0243602-5836-4db5-a6a6-65bbd87369a9\") " May 27 03:20:27.266860 kubelet[2684]: I0527 03:20:27.266821 2684 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0243602-5836-4db5-a6a6-65bbd87369a9-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "e0243602-5836-4db5-a6a6-65bbd87369a9" (UID: "e0243602-5836-4db5-a6a6-65bbd87369a9"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 27 03:20:27.272300 kubelet[2684]: I0527 03:20:27.272247 2684 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0243602-5836-4db5-a6a6-65bbd87369a9-kube-api-access-dc964" (OuterVolumeSpecName: "kube-api-access-dc964") pod "e0243602-5836-4db5-a6a6-65bbd87369a9" (UID: "e0243602-5836-4db5-a6a6-65bbd87369a9"). InnerVolumeSpecName "kube-api-access-dc964". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" May 27 03:20:27.273552 systemd[1]: var-lib-kubelet-pods-e0243602\x2d5836\x2d4db5\x2da6a6\x2d65bbd87369a9-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ddc964.mount: Deactivated successfully. May 27 03:20:27.273682 systemd[1]: var-lib-kubelet-pods-e0243602\x2d5836\x2d4db5\x2da6a6\x2d65bbd87369a9-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. May 27 03:20:27.274690 kubelet[2684]: I0527 03:20:27.274633 2684 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0243602-5836-4db5-a6a6-65bbd87369a9-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "e0243602-5836-4db5-a6a6-65bbd87369a9" (UID: "e0243602-5836-4db5-a6a6-65bbd87369a9"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 27 03:20:27.364803 kubelet[2684]: I0527 03:20:27.364749 2684 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0243602-5836-4db5-a6a6-65bbd87369a9-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" May 27 03:20:27.364803 kubelet[2684]: I0527 03:20:27.364791 2684 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dc964\" (UniqueName: \"kubernetes.io/projected/e0243602-5836-4db5-a6a6-65bbd87369a9-kube-api-access-dc964\") on node \"localhost\" DevicePath \"\"" May 27 03:20:27.364803 kubelet[2684]: I0527 03:20:27.364805 2684 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e0243602-5836-4db5-a6a6-65bbd87369a9-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" May 27 03:20:27.631113 containerd[1564]: time="2025-05-27T03:20:27.630942180Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-d7ml8,Uid:a53c2898-7fdc-4905-9181-77e2226d0ebf,Namespace:kube-system,Attempt:0,}" May 27 03:20:27.825380 systemd[1]: Removed slice kubepods-besteffort-pode0243602_5836_4db5_a6a6_65bbd87369a9.slice - libcontainer container kubepods-besteffort-pode0243602_5836_4db5_a6a6_65bbd87369a9.slice. May 27 03:20:27.880209 systemd[1]: Started sshd@7-10.0.0.98:22-10.0.0.1:48410.service - OpenSSH per-connection server daemon (10.0.0.1:48410). May 27 03:20:27.888191 kubelet[2684]: I0527 03:20:27.888003 2684 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-kzpnz" podStartSLOduration=1.924498166 podStartE2EDuration="25.887986101s" podCreationTimestamp="2025-05-27 03:20:02 +0000 UTC" firstStartedPulling="2025-05-27 03:20:02.568052316 +0000 UTC m=+18.026092544" lastFinishedPulling="2025-05-27 03:20:26.531540252 +0000 UTC m=+41.989580479" observedRunningTime="2025-05-27 03:20:27.88661201 +0000 UTC m=+43.344652237" watchObservedRunningTime="2025-05-27 03:20:27.887986101 +0000 UTC m=+43.346026318" May 27 03:20:27.998638 systemd[1]: Created slice kubepods-besteffort-pod2c1ae830_ec2d_4322_b9d4_cc52915b6648.slice - libcontainer container kubepods-besteffort-pod2c1ae830_ec2d_4322_b9d4_cc52915b6648.slice. May 27 03:20:28.016543 sshd[4002]: Accepted publickey for core from 10.0.0.1 port 48410 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:20:28.018551 sshd-session[4002]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:20:28.026958 systemd-logind[1549]: New session 8 of user core. May 27 03:20:28.037644 systemd[1]: Started session-8.scope - Session 8 of User core. 
May 27 03:20:28.071471 kubelet[2684]: I0527 03:20:28.071298 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c1ae830-ec2d-4322-b9d4-cc52915b6648-whisker-ca-bundle\") pod \"whisker-556d57579d-8sdgb\" (UID: \"2c1ae830-ec2d-4322-b9d4-cc52915b6648\") " pod="calico-system/whisker-556d57579d-8sdgb" May 27 03:20:28.071471 kubelet[2684]: I0527 03:20:28.071368 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdkgp\" (UniqueName: \"kubernetes.io/projected/2c1ae830-ec2d-4322-b9d4-cc52915b6648-kube-api-access-kdkgp\") pod \"whisker-556d57579d-8sdgb\" (UID: \"2c1ae830-ec2d-4322-b9d4-cc52915b6648\") " pod="calico-system/whisker-556d57579d-8sdgb" May 27 03:20:28.071471 kubelet[2684]: I0527 03:20:28.071395 2684 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2c1ae830-ec2d-4322-b9d4-cc52915b6648-whisker-backend-key-pair\") pod \"whisker-556d57579d-8sdgb\" (UID: \"2c1ae830-ec2d-4322-b9d4-cc52915b6648\") " pod="calico-system/whisker-556d57579d-8sdgb" May 27 03:20:28.106665 containerd[1564]: time="2025-05-27T03:20:28.106598727Z" level=info msg="TaskExit event in podsandbox handler container_id:\"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\" id:\"0566254ce97d4b560062bdeeca37ace404eb35a6c221d7ba041e050c5e3df346\" pid:4030 exit_status:1 exited_at:{seconds:1748316028 nanos:105259963}" May 27 03:20:28.180547 systemd-networkd[1475]: cali80c72ad7e11: Link UP May 27 03:20:28.181089 systemd-networkd[1475]: cali80c72ad7e11: Gained carrier May 27 03:20:28.205455 containerd[1564]: 2025-05-27 03:20:28.021 [INFO][4009] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 03:20:28.205455 containerd[1564]: 2025-05-27 03:20:28.050 [INFO][4009] cni-plugin/plugin.go 340: 
Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--d7ml8-eth0 coredns-668d6bf9bc- kube-system a53c2898-7fdc-4905-9181-77e2226d0ebf 782 0 2025-05-27 03:19:49 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-d7ml8 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali80c72ad7e11 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" Namespace="kube-system" Pod="coredns-668d6bf9bc-d7ml8" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--d7ml8-" May 27 03:20:28.205455 containerd[1564]: 2025-05-27 03:20:28.050 [INFO][4009] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" Namespace="kube-system" Pod="coredns-668d6bf9bc-d7ml8" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--d7ml8-eth0" May 27 03:20:28.205455 containerd[1564]: 2025-05-27 03:20:28.126 [INFO][4051] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" HandleID="k8s-pod-network.2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" Workload="localhost-k8s-coredns--668d6bf9bc--d7ml8-eth0" May 27 03:20:28.205663 containerd[1564]: 2025-05-27 03:20:28.127 [INFO][4051] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" HandleID="k8s-pod-network.2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" Workload="localhost-k8s-coredns--668d6bf9bc--d7ml8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001a5510), Attrs:map[string]string{"namespace":"kube-system", 
"node":"localhost", "pod":"coredns-668d6bf9bc-d7ml8", "timestamp":"2025-05-27 03:20:28.126305983 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:20:28.205663 containerd[1564]: 2025-05-27 03:20:28.127 [INFO][4051] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:20:28.205663 containerd[1564]: 2025-05-27 03:20:28.127 [INFO][4051] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 03:20:28.205663 containerd[1564]: 2025-05-27 03:20:28.127 [INFO][4051] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 03:20:28.205663 containerd[1564]: 2025-05-27 03:20:28.135 [INFO][4051] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" host="localhost" May 27 03:20:28.205663 containerd[1564]: 2025-05-27 03:20:28.144 [INFO][4051] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 03:20:28.205663 containerd[1564]: 2025-05-27 03:20:28.149 [INFO][4051] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 03:20:28.205663 containerd[1564]: 2025-05-27 03:20:28.151 [INFO][4051] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 03:20:28.205663 containerd[1564]: 2025-05-27 03:20:28.153 [INFO][4051] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 27 03:20:28.205663 containerd[1564]: 2025-05-27 03:20:28.153 [INFO][4051] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" host="localhost" May 27 03:20:28.205978 containerd[1564]: 2025-05-27 03:20:28.155 [INFO][4051] 
ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a May 27 03:20:28.205978 containerd[1564]: 2025-05-27 03:20:28.158 [INFO][4051] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" host="localhost" May 27 03:20:28.205978 containerd[1564]: 2025-05-27 03:20:28.164 [INFO][4051] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" host="localhost" May 27 03:20:28.205978 containerd[1564]: 2025-05-27 03:20:28.164 [INFO][4051] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" host="localhost" May 27 03:20:28.205978 containerd[1564]: 2025-05-27 03:20:28.164 [INFO][4051] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 03:20:28.205978 containerd[1564]: 2025-05-27 03:20:28.164 [INFO][4051] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" HandleID="k8s-pod-network.2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" Workload="localhost-k8s-coredns--668d6bf9bc--d7ml8-eth0" May 27 03:20:28.206108 containerd[1564]: 2025-05-27 03:20:28.167 [INFO][4009] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" Namespace="kube-system" Pod="coredns-668d6bf9bc-d7ml8" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--d7ml8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--d7ml8-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"a53c2898-7fdc-4905-9181-77e2226d0ebf", ResourceVersion:"782", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 19, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-d7ml8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali80c72ad7e11", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:20:28.206189 containerd[1564]: 2025-05-27 03:20:28.167 [INFO][4009] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" Namespace="kube-system" Pod="coredns-668d6bf9bc-d7ml8" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--d7ml8-eth0" May 27 03:20:28.206189 containerd[1564]: 2025-05-27 03:20:28.167 [INFO][4009] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali80c72ad7e11 ContainerID="2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" Namespace="kube-system" Pod="coredns-668d6bf9bc-d7ml8" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--d7ml8-eth0" May 27 03:20:28.206189 containerd[1564]: 2025-05-27 03:20:28.182 [INFO][4009] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" Namespace="kube-system" Pod="coredns-668d6bf9bc-d7ml8" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--d7ml8-eth0" May 27 03:20:28.206270 containerd[1564]: 2025-05-27 03:20:28.183 [INFO][4009] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" Namespace="kube-system" Pod="coredns-668d6bf9bc-d7ml8" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--d7ml8-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--d7ml8-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"a53c2898-7fdc-4905-9181-77e2226d0ebf", ResourceVersion:"782", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 19, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a", Pod:"coredns-668d6bf9bc-d7ml8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali80c72ad7e11", MAC:"62:c5:17:a2:d9:7f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:20:28.206270 containerd[1564]: 2025-05-27 03:20:28.198 [INFO][4009] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" Namespace="kube-system" Pod="coredns-668d6bf9bc-d7ml8" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--d7ml8-eth0" May 27 03:20:28.215456 sshd[4041]: Connection closed by 10.0.0.1 port 48410 May 27 03:20:28.215864 sshd-session[4002]: pam_unix(sshd:session): session closed for user core May 27 03:20:28.220717 systemd[1]: sshd@7-10.0.0.98:22-10.0.0.1:48410.service: Deactivated successfully. May 27 03:20:28.223333 systemd[1]: session-8.scope: Deactivated successfully. May 27 03:20:28.225717 systemd-logind[1549]: Session 8 logged out. Waiting for processes to exit. May 27 03:20:28.230242 systemd-logind[1549]: Removed session 8. May 27 03:20:28.296068 containerd[1564]: time="2025-05-27T03:20:28.295996160Z" level=info msg="connecting to shim 2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a" address="unix:///run/containerd/s/e00b2bfe2b532f18fc04435d552093588d7dc76d73230479a03456943adf83d2" namespace=k8s.io protocol=ttrpc version=3 May 27 03:20:28.305321 containerd[1564]: time="2025-05-27T03:20:28.305248243Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-556d57579d-8sdgb,Uid:2c1ae830-ec2d-4322-b9d4-cc52915b6648,Namespace:calico-system,Attempt:0,}" May 27 03:20:28.322632 systemd[1]: Started cri-containerd-2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a.scope - libcontainer container 2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a. 
May 27 03:20:28.344281 systemd-resolved[1413]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 03:20:28.387158 containerd[1564]: time="2025-05-27T03:20:28.387104579Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-d7ml8,Uid:a53c2898-7fdc-4905-9181-77e2226d0ebf,Namespace:kube-system,Attempt:0,} returns sandbox id \"2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a\"" May 27 03:20:28.390987 containerd[1564]: time="2025-05-27T03:20:28.390887292Z" level=info msg="CreateContainer within sandbox \"2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 03:20:28.412786 systemd-networkd[1475]: cali9d36f9ded70: Link UP May 27 03:20:28.413282 systemd-networkd[1475]: cali9d36f9ded70: Gained carrier May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.328 [INFO][4115] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.340 [INFO][4115] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--556d57579d--8sdgb-eth0 whisker-556d57579d- calico-system 2c1ae830-ec2d-4322-b9d4-cc52915b6648 903 0 2025-05-27 03:20:27 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:556d57579d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-556d57579d-8sdgb eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali9d36f9ded70 [] [] }} ContainerID="1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" Namespace="calico-system" Pod="whisker-556d57579d-8sdgb" WorkloadEndpoint="localhost-k8s-whisker--556d57579d--8sdgb-" May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.340 [INFO][4115] cni-plugin/k8s.go 74: 
Extracted identifiers for CmdAddK8s ContainerID="1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" Namespace="calico-system" Pod="whisker-556d57579d-8sdgb" WorkloadEndpoint="localhost-k8s-whisker--556d57579d--8sdgb-eth0" May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.371 [INFO][4142] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" HandleID="k8s-pod-network.1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" Workload="localhost-k8s-whisker--556d57579d--8sdgb-eth0" May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.371 [INFO][4142] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" HandleID="k8s-pod-network.1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" Workload="localhost-k8s-whisker--556d57579d--8sdgb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ad320), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-556d57579d-8sdgb", "timestamp":"2025-05-27 03:20:28.371520676 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.371 [INFO][4142] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.371 [INFO][4142] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.371 [INFO][4142] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.377 [INFO][4142] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" host="localhost" May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.383 [INFO][4142] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.388 [INFO][4142] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.391 [INFO][4142] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.395 [INFO][4142] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.395 [INFO][4142] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" host="localhost" May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.396 [INFO][4142] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801 May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.401 [INFO][4142] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" host="localhost" May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.406 [INFO][4142] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" host="localhost" May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.406 [INFO][4142] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" host="localhost" May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.406 [INFO][4142] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 03:20:28.427180 containerd[1564]: 2025-05-27 03:20:28.406 [INFO][4142] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" HandleID="k8s-pod-network.1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" Workload="localhost-k8s-whisker--556d57579d--8sdgb-eth0" May 27 03:20:28.427830 containerd[1564]: 2025-05-27 03:20:28.410 [INFO][4115] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" Namespace="calico-system" Pod="whisker-556d57579d-8sdgb" WorkloadEndpoint="localhost-k8s-whisker--556d57579d--8sdgb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--556d57579d--8sdgb-eth0", GenerateName:"whisker-556d57579d-", Namespace:"calico-system", SelfLink:"", UID:"2c1ae830-ec2d-4322-b9d4-cc52915b6648", ResourceVersion:"903", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 20, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"556d57579d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-556d57579d-8sdgb", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9d36f9ded70", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:20:28.427830 containerd[1564]: 2025-05-27 03:20:28.410 [INFO][4115] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" Namespace="calico-system" Pod="whisker-556d57579d-8sdgb" WorkloadEndpoint="localhost-k8s-whisker--556d57579d--8sdgb-eth0" May 27 03:20:28.427830 containerd[1564]: 2025-05-27 03:20:28.410 [INFO][4115] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9d36f9ded70 ContainerID="1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" Namespace="calico-system" Pod="whisker-556d57579d-8sdgb" WorkloadEndpoint="localhost-k8s-whisker--556d57579d--8sdgb-eth0" May 27 03:20:28.427830 containerd[1564]: 2025-05-27 03:20:28.413 [INFO][4115] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" Namespace="calico-system" Pod="whisker-556d57579d-8sdgb" WorkloadEndpoint="localhost-k8s-whisker--556d57579d--8sdgb-eth0" May 27 03:20:28.427830 containerd[1564]: 2025-05-27 03:20:28.413 [INFO][4115] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" Namespace="calico-system" Pod="whisker-556d57579d-8sdgb" 
WorkloadEndpoint="localhost-k8s-whisker--556d57579d--8sdgb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--556d57579d--8sdgb-eth0", GenerateName:"whisker-556d57579d-", Namespace:"calico-system", SelfLink:"", UID:"2c1ae830-ec2d-4322-b9d4-cc52915b6648", ResourceVersion:"903", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 20, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"556d57579d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801", Pod:"whisker-556d57579d-8sdgb", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9d36f9ded70", MAC:"0a:85:7d:a4:97:62", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:20:28.427830 containerd[1564]: 2025-05-27 03:20:28.423 [INFO][4115] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" Namespace="calico-system" Pod="whisker-556d57579d-8sdgb" WorkloadEndpoint="localhost-k8s-whisker--556d57579d--8sdgb-eth0" May 27 03:20:28.430621 containerd[1564]: time="2025-05-27T03:20:28.430574074Z" level=info msg="Container 
ad3561719d3cd09926ecfb8c1f5c161da0d65c575061c7bb6b1440edf5675e25: CDI devices from CRI Config.CDIDevices: []" May 27 03:20:28.438790 containerd[1564]: time="2025-05-27T03:20:28.438677231Z" level=info msg="CreateContainer within sandbox \"2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ad3561719d3cd09926ecfb8c1f5c161da0d65c575061c7bb6b1440edf5675e25\"" May 27 03:20:28.439267 containerd[1564]: time="2025-05-27T03:20:28.439245578Z" level=info msg="StartContainer for \"ad3561719d3cd09926ecfb8c1f5c161da0d65c575061c7bb6b1440edf5675e25\"" May 27 03:20:28.440391 containerd[1564]: time="2025-05-27T03:20:28.440319193Z" level=info msg="connecting to shim ad3561719d3cd09926ecfb8c1f5c161da0d65c575061c7bb6b1440edf5675e25" address="unix:///run/containerd/s/e00b2bfe2b532f18fc04435d552093588d7dc76d73230479a03456943adf83d2" protocol=ttrpc version=3 May 27 03:20:28.473698 systemd[1]: Started cri-containerd-ad3561719d3cd09926ecfb8c1f5c161da0d65c575061c7bb6b1440edf5675e25.scope - libcontainer container ad3561719d3cd09926ecfb8c1f5c161da0d65c575061c7bb6b1440edf5675e25. May 27 03:20:28.477147 containerd[1564]: time="2025-05-27T03:20:28.477021191Z" level=info msg="connecting to shim 1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801" address="unix:///run/containerd/s/bc30f2077a89406e6ddb5b7b79a0da56dc527ad81a3ad9b6097abd7ddd1c4b17" namespace=k8s.io protocol=ttrpc version=3 May 27 03:20:28.506607 systemd[1]: Started cri-containerd-1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801.scope - libcontainer container 1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801. 
May 27 03:20:28.516486 containerd[1564]: time="2025-05-27T03:20:28.516422818Z" level=info msg="StartContainer for \"ad3561719d3cd09926ecfb8c1f5c161da0d65c575061c7bb6b1440edf5675e25\" returns successfully" May 27 03:20:28.521496 systemd-resolved[1413]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 03:20:28.558853 containerd[1564]: time="2025-05-27T03:20:28.558784152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-556d57579d-8sdgb,Uid:2c1ae830-ec2d-4322-b9d4-cc52915b6648,Namespace:calico-system,Attempt:0,} returns sandbox id \"1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801\"" May 27 03:20:28.562375 containerd[1564]: time="2025-05-27T03:20:28.562322867Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 03:20:28.632704 kubelet[2684]: I0527 03:20:28.632641 2684 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0243602-5836-4db5-a6a6-65bbd87369a9" path="/var/lib/kubelet/pods/e0243602-5836-4db5-a6a6-65bbd87369a9/volumes" May 27 03:20:28.800535 containerd[1564]: time="2025-05-27T03:20:28.800367547Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:20:28.801557 containerd[1564]: time="2025-05-27T03:20:28.801498000Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:20:28.801625 containerd[1564]: 
time="2025-05-27T03:20:28.801566148Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 03:20:28.801900 kubelet[2684]: E0527 03:20:28.801846 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:20:28.801996 kubelet[2684]: E0527 03:20:28.801922 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:20:28.808263 kubelet[2684]: E0527 03:20:28.808166 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:767452f2f0924610a3259c1e7259b93b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kdkgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-556d57579d-8sdgb_calico-system(2c1ae830-ec2d-4322-b9d4-cc52915b6648): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:20:28.811286 containerd[1564]: 
time="2025-05-27T03:20:28.810680812Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 03:20:28.836193 kubelet[2684]: I0527 03:20:28.835840 2684 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-d7ml8" podStartSLOduration=39.835820399 podStartE2EDuration="39.835820399s" podCreationTimestamp="2025-05-27 03:19:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:20:28.835219179 +0000 UTC m=+44.293259426" watchObservedRunningTime="2025-05-27 03:20:28.835820399 +0000 UTC m=+44.293860626" May 27 03:20:28.918128 containerd[1564]: time="2025-05-27T03:20:28.918069172Z" level=info msg="TaskExit event in podsandbox handler container_id:\"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\" id:\"a7a2c22757e2e1cf02d6b8e5ae0a19a2a5c201cffc1cf27bbbfe1a28474ee1b8\" pid:4259 exit_status:1 exited_at:{seconds:1748316028 nanos:917751705}" May 27 03:20:29.037297 containerd[1564]: time="2025-05-27T03:20:29.037197291Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:20:29.077580 containerd[1564]: time="2025-05-27T03:20:29.077400247Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 03:20:29.077580 containerd[1564]: time="2025-05-27T03:20:29.077463296Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: 
unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:20:29.077802 kubelet[2684]: E0527 03:20:29.077715 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:20:29.077802 kubelet[2684]: E0527 03:20:29.077770 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:20:29.077943 kubelet[2684]: E0527 03:20:29.077890 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kdkgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-556d57579d-8sdgb_calico-system(2c1ae830-ec2d-4322-b9d4-cc52915b6648): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": 
failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:20:29.079113 kubelet[2684]: E0527 03:20:29.079071 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:20:29.516639 systemd-networkd[1475]: cali9d36f9ded70: Gained IPv6LL May 27 03:20:29.587649 systemd-networkd[1475]: vxlan.calico: Link UP May 27 03:20:29.587659 systemd-networkd[1475]: vxlan.calico: Gained carrier May 27 03:20:29.827694 kubelet[2684]: E0527 03:20:29.827068 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to 
resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:20:29.900644 systemd-networkd[1475]: cali80c72ad7e11: Gained IPv6LL May 27 03:20:31.628631 systemd-networkd[1475]: vxlan.calico: Gained IPv6LL May 27 03:20:33.231180 systemd[1]: Started sshd@8-10.0.0.98:22-10.0.0.1:48422.service - OpenSSH per-connection server daemon (10.0.0.1:48422). May 27 03:20:33.297700 sshd[4485]: Accepted publickey for core from 10.0.0.1 port 48422 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:20:33.300075 sshd-session[4485]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:20:33.305739 systemd-logind[1549]: New session 9 of user core. May 27 03:20:33.314761 systemd[1]: Started session-9.scope - Session 9 of User core. May 27 03:20:33.462342 sshd[4487]: Connection closed by 10.0.0.1 port 48422 May 27 03:20:33.462711 sshd-session[4485]: pam_unix(sshd:session): session closed for user core May 27 03:20:33.466810 systemd[1]: sshd@8-10.0.0.98:22-10.0.0.1:48422.service: Deactivated successfully. May 27 03:20:33.468809 systemd[1]: session-9.scope: Deactivated successfully. 
May 27 03:20:33.469701 systemd-logind[1549]: Session 9 logged out. Waiting for processes to exit. May 27 03:20:33.470960 systemd-logind[1549]: Removed session 9. May 27 03:20:36.630593 containerd[1564]: time="2025-05-27T03:20:36.630507610Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bfbf6b5b6-7ws6p,Uid:d851576d-45af-4a97-8273-c3a263d1e859,Namespace:calico-apiserver,Attempt:0,}" May 27 03:20:37.002851 systemd-networkd[1475]: cali5ba59023dbd: Link UP May 27 03:20:37.003882 systemd-networkd[1475]: cali5ba59023dbd: Gained carrier May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.845 [INFO][4509] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--bfbf6b5b6--7ws6p-eth0 calico-apiserver-bfbf6b5b6- calico-apiserver d851576d-45af-4a97-8273-c3a263d1e859 795 0 2025-05-27 03:19:59 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:bfbf6b5b6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-bfbf6b5b6-7ws6p eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5ba59023dbd [] [] }} ContainerID="287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" Namespace="calico-apiserver" Pod="calico-apiserver-bfbf6b5b6-7ws6p" WorkloadEndpoint="localhost-k8s-calico--apiserver--bfbf6b5b6--7ws6p-" May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.845 [INFO][4509] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" Namespace="calico-apiserver" Pod="calico-apiserver-bfbf6b5b6-7ws6p" WorkloadEndpoint="localhost-k8s-calico--apiserver--bfbf6b5b6--7ws6p-eth0" May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.868 [INFO][4525] 
ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" HandleID="k8s-pod-network.287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" Workload="localhost-k8s-calico--apiserver--bfbf6b5b6--7ws6p-eth0" May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.868 [INFO][4525] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" HandleID="k8s-pod-network.287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" Workload="localhost-k8s-calico--apiserver--bfbf6b5b6--7ws6p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00023d230), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-bfbf6b5b6-7ws6p", "timestamp":"2025-05-27 03:20:36.868227986 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.868 [INFO][4525] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.868 [INFO][4525] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.868 [INFO][4525] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.875 [INFO][4525] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" host="localhost" May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.879 [INFO][4525] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.882 [INFO][4525] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.883 [INFO][4525] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.885 [INFO][4525] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.885 [INFO][4525] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" host="localhost" May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.886 [INFO][4525] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.919 [INFO][4525] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" host="localhost" May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.997 [INFO][4525] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" host="localhost" May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.997 [INFO][4525] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" host="localhost" May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.997 [INFO][4525] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 03:20:37.064228 containerd[1564]: 2025-05-27 03:20:36.997 [INFO][4525] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" HandleID="k8s-pod-network.287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" Workload="localhost-k8s-calico--apiserver--bfbf6b5b6--7ws6p-eth0" May 27 03:20:37.064886 containerd[1564]: 2025-05-27 03:20:37.000 [INFO][4509] cni-plugin/k8s.go 418: Populated endpoint ContainerID="287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" Namespace="calico-apiserver" Pod="calico-apiserver-bfbf6b5b6-7ws6p" WorkloadEndpoint="localhost-k8s-calico--apiserver--bfbf6b5b6--7ws6p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--bfbf6b5b6--7ws6p-eth0", GenerateName:"calico-apiserver-bfbf6b5b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"d851576d-45af-4a97-8273-c3a263d1e859", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 19, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"bfbf6b5b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-bfbf6b5b6-7ws6p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5ba59023dbd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:20:37.064886 containerd[1564]: 2025-05-27 03:20:37.000 [INFO][4509] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" Namespace="calico-apiserver" Pod="calico-apiserver-bfbf6b5b6-7ws6p" WorkloadEndpoint="localhost-k8s-calico--apiserver--bfbf6b5b6--7ws6p-eth0" May 27 03:20:37.064886 containerd[1564]: 2025-05-27 03:20:37.000 [INFO][4509] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5ba59023dbd ContainerID="287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" Namespace="calico-apiserver" Pod="calico-apiserver-bfbf6b5b6-7ws6p" WorkloadEndpoint="localhost-k8s-calico--apiserver--bfbf6b5b6--7ws6p-eth0" May 27 03:20:37.064886 containerd[1564]: 2025-05-27 03:20:37.004 [INFO][4509] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" Namespace="calico-apiserver" Pod="calico-apiserver-bfbf6b5b6-7ws6p" WorkloadEndpoint="localhost-k8s-calico--apiserver--bfbf6b5b6--7ws6p-eth0" May 27 03:20:37.064886 containerd[1564]: 2025-05-27 03:20:37.004 [INFO][4509] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" Namespace="calico-apiserver" Pod="calico-apiserver-bfbf6b5b6-7ws6p" WorkloadEndpoint="localhost-k8s-calico--apiserver--bfbf6b5b6--7ws6p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--bfbf6b5b6--7ws6p-eth0", GenerateName:"calico-apiserver-bfbf6b5b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"d851576d-45af-4a97-8273-c3a263d1e859", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 19, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"bfbf6b5b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea", Pod:"calico-apiserver-bfbf6b5b6-7ws6p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5ba59023dbd", MAC:"9a:f0:0b:f9:c0:2f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:20:37.064886 containerd[1564]: 2025-05-27 03:20:37.061 [INFO][4509] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" Namespace="calico-apiserver" Pod="calico-apiserver-bfbf6b5b6-7ws6p" WorkloadEndpoint="localhost-k8s-calico--apiserver--bfbf6b5b6--7ws6p-eth0" May 27 03:20:37.098143 containerd[1564]: time="2025-05-27T03:20:37.097482253Z" level=info msg="connecting to shim 287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea" address="unix:///run/containerd/s/1bba10e10a4caee62fca7b421b3b91a8fa58ba9c684558091aa6f3c19e57d860" namespace=k8s.io protocol=ttrpc version=3 May 27 03:20:37.169704 systemd[1]: Started cri-containerd-287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea.scope - libcontainer container 287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea. May 27 03:20:37.185339 systemd-resolved[1413]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 03:20:37.275761 containerd[1564]: time="2025-05-27T03:20:37.275602741Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bfbf6b5b6-7ws6p,Uid:d851576d-45af-4a97-8273-c3a263d1e859,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea\"" May 27 03:20:37.277767 containerd[1564]: time="2025-05-27T03:20:37.277693645Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 03:20:37.630799 containerd[1564]: time="2025-05-27T03:20:37.630645275Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-nkjnr,Uid:a883863a-d79b-4a80-911d-ab857d7d891b,Namespace:calico-system,Attempt:0,}" May 27 03:20:38.133267 systemd-networkd[1475]: califad99b8ac90: Link UP May 27 03:20:38.133502 systemd-networkd[1475]: califad99b8ac90: Gained carrier May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:37.978 [INFO][4590] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{localhost-k8s-goldmane--78d55f7ddc--nkjnr-eth0 goldmane-78d55f7ddc- calico-system a883863a-d79b-4a80-911d-ab857d7d891b 790 0 2025-05-27 03:20:01 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:78d55f7ddc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-78d55f7ddc-nkjnr eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] califad99b8ac90 [] [] }} ContainerID="3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" Namespace="calico-system" Pod="goldmane-78d55f7ddc-nkjnr" WorkloadEndpoint="localhost-k8s-goldmane--78d55f7ddc--nkjnr-" May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:37.978 [INFO][4590] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" Namespace="calico-system" Pod="goldmane-78d55f7ddc-nkjnr" WorkloadEndpoint="localhost-k8s-goldmane--78d55f7ddc--nkjnr-eth0" May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.006 [INFO][4604] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" HandleID="k8s-pod-network.3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" Workload="localhost-k8s-goldmane--78d55f7ddc--nkjnr-eth0" May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.007 [INFO][4604] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" HandleID="k8s-pod-network.3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" Workload="localhost-k8s-goldmane--78d55f7ddc--nkjnr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001357d0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-78d55f7ddc-nkjnr", "timestamp":"2025-05-27 
03:20:38.006922795 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.007 [INFO][4604] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.007 [INFO][4604] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.007 [INFO][4604] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.014 [INFO][4604] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" host="localhost" May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.019 [INFO][4604] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.024 [INFO][4604] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.025 [INFO][4604] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.028 [INFO][4604] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.028 [INFO][4604] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" host="localhost" May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.029 [INFO][4604] ipam/ipam.go 1764: Creating new handle: 
k8s-pod-network.3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4 May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.041 [INFO][4604] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" host="localhost" May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.127 [INFO][4604] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" host="localhost" May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.127 [INFO][4604] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" host="localhost" May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.127 [INFO][4604] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 03:20:38.202576 containerd[1564]: 2025-05-27 03:20:38.127 [INFO][4604] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" HandleID="k8s-pod-network.3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" Workload="localhost-k8s-goldmane--78d55f7ddc--nkjnr-eth0" May 27 03:20:38.203254 containerd[1564]: 2025-05-27 03:20:38.130 [INFO][4590] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" Namespace="calico-system" Pod="goldmane-78d55f7ddc-nkjnr" WorkloadEndpoint="localhost-k8s-goldmane--78d55f7ddc--nkjnr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--78d55f7ddc--nkjnr-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"a883863a-d79b-4a80-911d-ab857d7d891b", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 20, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-78d55f7ddc-nkjnr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califad99b8ac90", MAC:"", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:20:38.203254 containerd[1564]: 2025-05-27 03:20:38.130 [INFO][4590] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" Namespace="calico-system" Pod="goldmane-78d55f7ddc-nkjnr" WorkloadEndpoint="localhost-k8s-goldmane--78d55f7ddc--nkjnr-eth0" May 27 03:20:38.203254 containerd[1564]: 2025-05-27 03:20:38.130 [INFO][4590] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califad99b8ac90 ContainerID="3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" Namespace="calico-system" Pod="goldmane-78d55f7ddc-nkjnr" WorkloadEndpoint="localhost-k8s-goldmane--78d55f7ddc--nkjnr-eth0" May 27 03:20:38.203254 containerd[1564]: 2025-05-27 03:20:38.134 [INFO][4590] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" Namespace="calico-system" Pod="goldmane-78d55f7ddc-nkjnr" WorkloadEndpoint="localhost-k8s-goldmane--78d55f7ddc--nkjnr-eth0" May 27 03:20:38.203254 containerd[1564]: 2025-05-27 03:20:38.134 [INFO][4590] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" Namespace="calico-system" Pod="goldmane-78d55f7ddc-nkjnr" WorkloadEndpoint="localhost-k8s-goldmane--78d55f7ddc--nkjnr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--78d55f7ddc--nkjnr-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"a883863a-d79b-4a80-911d-ab857d7d891b", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 20, 1, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4", Pod:"goldmane-78d55f7ddc-nkjnr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califad99b8ac90", MAC:"26:fd:f7:e3:86:d1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:20:38.203254 containerd[1564]: 2025-05-27 03:20:38.199 [INFO][4590] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" Namespace="calico-system" Pod="goldmane-78d55f7ddc-nkjnr" WorkloadEndpoint="localhost-k8s-goldmane--78d55f7ddc--nkjnr-eth0" May 27 03:20:38.485963 systemd[1]: Started sshd@9-10.0.0.98:22-10.0.0.1:38536.service - OpenSSH per-connection server daemon (10.0.0.1:38536). May 27 03:20:38.576650 sshd[4625]: Accepted publickey for core from 10.0.0.1 port 38536 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:20:38.578635 sshd-session[4625]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:20:38.583484 systemd-logind[1549]: New session 10 of user core. May 27 03:20:38.593585 systemd[1]: Started session-10.scope - Session 10 of User core. 
May 27 03:20:38.630691 containerd[1564]: time="2025-05-27T03:20:38.630634899Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bfbf6b5b6-hk6ml,Uid:6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa,Namespace:calico-apiserver,Attempt:0,}" May 27 03:20:38.631210 containerd[1564]: time="2025-05-27T03:20:38.631014702Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-86zzn,Uid:e359ce44-55b5-46e1-a044-9a6f462b1bb9,Namespace:calico-system,Attempt:0,}" May 27 03:20:38.727168 containerd[1564]: time="2025-05-27T03:20:38.726828704Z" level=info msg="connecting to shim 3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4" address="unix:///run/containerd/s/884d002401c2f19bb8db1ad2322e7421a6631357913fb2755b334c08790adc09" namespace=k8s.io protocol=ttrpc version=3 May 27 03:20:38.746657 sshd[4627]: Connection closed by 10.0.0.1 port 38536 May 27 03:20:38.747467 sshd-session[4625]: pam_unix(sshd:session): session closed for user core May 27 03:20:38.753545 systemd[1]: sshd@9-10.0.0.98:22-10.0.0.1:38536.service: Deactivated successfully. May 27 03:20:38.758591 systemd[1]: session-10.scope: Deactivated successfully. May 27 03:20:38.762175 systemd-logind[1549]: Session 10 logged out. Waiting for processes to exit. May 27 03:20:38.767332 systemd-logind[1549]: Removed session 10. May 27 03:20:38.772731 systemd[1]: Started cri-containerd-3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4.scope - libcontainer container 3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4. 
May 27 03:20:38.792723 systemd-resolved[1413]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 03:20:38.828193 systemd-networkd[1475]: cali4d16f520a3d: Link UP May 27 03:20:38.828413 systemd-networkd[1475]: cali4d16f520a3d: Gained carrier May 27 03:20:38.839838 containerd[1564]: time="2025-05-27T03:20:38.839794176Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-nkjnr,Uid:a883863a-d79b-4a80-911d-ab857d7d891b,Namespace:calico-system,Attempt:0,} returns sandbox id \"3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4\"" May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.746 [INFO][4639] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--86zzn-eth0 csi-node-driver- calico-system e359ce44-55b5-46e1-a044-9a6f462b1bb9 682 0 2025-05-27 03:20:02 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:78f6f74485 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-86zzn eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali4d16f520a3d [] [] }} ContainerID="6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" Namespace="calico-system" Pod="csi-node-driver-86zzn" WorkloadEndpoint="localhost-k8s-csi--node--driver--86zzn-" May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.746 [INFO][4639] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" Namespace="calico-system" Pod="csi-node-driver-86zzn" WorkloadEndpoint="localhost-k8s-csi--node--driver--86zzn-eth0" May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.778 [INFO][4699] 
ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" HandleID="k8s-pod-network.6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" Workload="localhost-k8s-csi--node--driver--86zzn-eth0" May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.778 [INFO][4699] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" HandleID="k8s-pod-network.6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" Workload="localhost-k8s-csi--node--driver--86zzn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e4c0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-86zzn", "timestamp":"2025-05-27 03:20:38.778344287 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.778 [INFO][4699] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.778 [INFO][4699] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.778 [INFO][4699] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.787 [INFO][4699] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" host="localhost" May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.793 [INFO][4699] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.799 [INFO][4699] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.802 [INFO][4699] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.804 [INFO][4699] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.804 [INFO][4699] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" host="localhost" May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.806 [INFO][4699] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.812 [INFO][4699] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" host="localhost" May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.819 [INFO][4699] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" host="localhost" May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.819 [INFO][4699] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" host="localhost" May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.820 [INFO][4699] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 03:20:38.844764 containerd[1564]: 2025-05-27 03:20:38.820 [INFO][4699] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" HandleID="k8s-pod-network.6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" Workload="localhost-k8s-csi--node--driver--86zzn-eth0" May 27 03:20:38.846708 containerd[1564]: 2025-05-27 03:20:38.823 [INFO][4639] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" Namespace="calico-system" Pod="csi-node-driver-86zzn" WorkloadEndpoint="localhost-k8s-csi--node--driver--86zzn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--86zzn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e359ce44-55b5-46e1-a044-9a6f462b1bb9", ResourceVersion:"682", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 20, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-86zzn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4d16f520a3d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:20:38.846708 containerd[1564]: 2025-05-27 03:20:38.823 [INFO][4639] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" Namespace="calico-system" Pod="csi-node-driver-86zzn" WorkloadEndpoint="localhost-k8s-csi--node--driver--86zzn-eth0" May 27 03:20:38.846708 containerd[1564]: 2025-05-27 03:20:38.823 [INFO][4639] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4d16f520a3d ContainerID="6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" Namespace="calico-system" Pod="csi-node-driver-86zzn" WorkloadEndpoint="localhost-k8s-csi--node--driver--86zzn-eth0" May 27 03:20:38.846708 containerd[1564]: 2025-05-27 03:20:38.829 [INFO][4639] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" Namespace="calico-system" Pod="csi-node-driver-86zzn" WorkloadEndpoint="localhost-k8s-csi--node--driver--86zzn-eth0" May 27 03:20:38.846708 containerd[1564]: 2025-05-27 03:20:38.829 [INFO][4639] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" 
Namespace="calico-system" Pod="csi-node-driver-86zzn" WorkloadEndpoint="localhost-k8s-csi--node--driver--86zzn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--86zzn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e359ce44-55b5-46e1-a044-9a6f462b1bb9", ResourceVersion:"682", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 20, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d", Pod:"csi-node-driver-86zzn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4d16f520a3d", MAC:"32:1f:f4:98:45:20", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:20:38.846708 containerd[1564]: 2025-05-27 03:20:38.840 [INFO][4639] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" Namespace="calico-system" Pod="csi-node-driver-86zzn" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--86zzn-eth0" May 27 03:20:38.868146 containerd[1564]: time="2025-05-27T03:20:38.868080594Z" level=info msg="connecting to shim 6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d" address="unix:///run/containerd/s/d4d2e2a293fed8138f7f9bce2bea6f22cfe69980be39d2f8c955d2bf452fa8c0" namespace=k8s.io protocol=ttrpc version=3 May 27 03:20:38.893671 systemd[1]: Started cri-containerd-6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d.scope - libcontainer container 6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d. May 27 03:20:38.909819 systemd-resolved[1413]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 03:20:38.926376 systemd-networkd[1475]: cali5ba59023dbd: Gained IPv6LL May 27 03:20:38.927879 systemd-networkd[1475]: cali18c76d71279: Link UP May 27 03:20:38.928694 systemd-networkd[1475]: cali18c76d71279: Gained carrier May 27 03:20:39.003233 containerd[1564]: time="2025-05-27T03:20:39.003113310Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-86zzn,Uid:e359ce44-55b5-46e1-a044-9a6f462b1bb9,Namespace:calico-system,Attempt:0,} returns sandbox id \"6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d\"" May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.757 [INFO][4637] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--bfbf6b5b6--hk6ml-eth0 calico-apiserver-bfbf6b5b6- calico-apiserver 6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa 786 0 2025-05-27 03:19:59 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:bfbf6b5b6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-bfbf6b5b6-hk6ml eth0 calico-apiserver [] [] 
[kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali18c76d71279 [] [] }} ContainerID="4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" Namespace="calico-apiserver" Pod="calico-apiserver-bfbf6b5b6-hk6ml" WorkloadEndpoint="localhost-k8s-calico--apiserver--bfbf6b5b6--hk6ml-" May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.757 [INFO][4637] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" Namespace="calico-apiserver" Pod="calico-apiserver-bfbf6b5b6-hk6ml" WorkloadEndpoint="localhost-k8s-calico--apiserver--bfbf6b5b6--hk6ml-eth0" May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.798 [INFO][4710] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" HandleID="k8s-pod-network.4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" Workload="localhost-k8s-calico--apiserver--bfbf6b5b6--hk6ml-eth0" May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.798 [INFO][4710] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" HandleID="k8s-pod-network.4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" Workload="localhost-k8s-calico--apiserver--bfbf6b5b6--hk6ml-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ad010), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-bfbf6b5b6-hk6ml", "timestamp":"2025-05-27 03:20:38.798000067 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.798 [INFO][4710] ipam/ipam_plugin.go 353: About to acquire 
host-wide IPAM lock. May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.820 [INFO][4710] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.820 [INFO][4710] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.888 [INFO][4710] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" host="localhost" May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.895 [INFO][4710] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.900 [INFO][4710] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.901 [INFO][4710] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.903 [INFO][4710] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.903 [INFO][4710] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" host="localhost" May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.904 [INFO][4710] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.909 [INFO][4710] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" host="localhost" May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.917 
[INFO][4710] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" host="localhost" May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.917 [INFO][4710] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" host="localhost" May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.917 [INFO][4710] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 03:20:39.050155 containerd[1564]: 2025-05-27 03:20:38.917 [INFO][4710] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" HandleID="k8s-pod-network.4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" Workload="localhost-k8s-calico--apiserver--bfbf6b5b6--hk6ml-eth0" May 27 03:20:39.050764 containerd[1564]: 2025-05-27 03:20:38.922 [INFO][4637] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" Namespace="calico-apiserver" Pod="calico-apiserver-bfbf6b5b6-hk6ml" WorkloadEndpoint="localhost-k8s-calico--apiserver--bfbf6b5b6--hk6ml-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--bfbf6b5b6--hk6ml-eth0", GenerateName:"calico-apiserver-bfbf6b5b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa", ResourceVersion:"786", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 19, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"bfbf6b5b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-bfbf6b5b6-hk6ml", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali18c76d71279", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:20:39.050764 containerd[1564]: 2025-05-27 03:20:38.922 [INFO][4637] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" Namespace="calico-apiserver" Pod="calico-apiserver-bfbf6b5b6-hk6ml" WorkloadEndpoint="localhost-k8s-calico--apiserver--bfbf6b5b6--hk6ml-eth0" May 27 03:20:39.050764 containerd[1564]: 2025-05-27 03:20:38.922 [INFO][4637] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali18c76d71279 ContainerID="4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" Namespace="calico-apiserver" Pod="calico-apiserver-bfbf6b5b6-hk6ml" WorkloadEndpoint="localhost-k8s-calico--apiserver--bfbf6b5b6--hk6ml-eth0" May 27 03:20:39.050764 containerd[1564]: 2025-05-27 03:20:38.929 [INFO][4637] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" Namespace="calico-apiserver" Pod="calico-apiserver-bfbf6b5b6-hk6ml" WorkloadEndpoint="localhost-k8s-calico--apiserver--bfbf6b5b6--hk6ml-eth0" May 27 03:20:39.050764 
containerd[1564]: 2025-05-27 03:20:38.930 [INFO][4637] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" Namespace="calico-apiserver" Pod="calico-apiserver-bfbf6b5b6-hk6ml" WorkloadEndpoint="localhost-k8s-calico--apiserver--bfbf6b5b6--hk6ml-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--bfbf6b5b6--hk6ml-eth0", GenerateName:"calico-apiserver-bfbf6b5b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa", ResourceVersion:"786", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 19, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"bfbf6b5b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c", Pod:"calico-apiserver-bfbf6b5b6-hk6ml", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali18c76d71279", MAC:"7e:8e:03:9f:c6:73", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:20:39.050764 containerd[1564]: 2025-05-27 03:20:39.046 
[INFO][4637] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" Namespace="calico-apiserver" Pod="calico-apiserver-bfbf6b5b6-hk6ml" WorkloadEndpoint="localhost-k8s-calico--apiserver--bfbf6b5b6--hk6ml-eth0" May 27 03:20:39.255741 containerd[1564]: time="2025-05-27T03:20:39.255601841Z" level=info msg="connecting to shim 4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c" address="unix:///run/containerd/s/21c7c00840131ba847ccf3135fc8d21567bb21f617bc2f7ffd571744b7e72a10" namespace=k8s.io protocol=ttrpc version=3 May 27 03:20:39.284605 systemd[1]: Started cri-containerd-4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c.scope - libcontainer container 4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c. May 27 03:20:39.296980 systemd-resolved[1413]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 03:20:39.329396 containerd[1564]: time="2025-05-27T03:20:39.329336280Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bfbf6b5b6-hk6ml,Uid:6faf2a5b-9ea6-460d-9973-8f32eb2ae9fa,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c\"" May 27 03:20:39.630500 containerd[1564]: time="2025-05-27T03:20:39.630345390Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77c7f8d9d8-w26p7,Uid:c801dbfb-f6db-41ad-af94-26a4c142cc32,Namespace:calico-system,Attempt:0,}" May 27 03:20:39.884016 systemd-networkd[1475]: calicb7e7d2e434: Link UP May 27 03:20:39.885137 systemd-networkd[1475]: calicb7e7d2e434: Gained carrier May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.749 [INFO][4847] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--77c7f8d9d8--w26p7-eth0 
calico-kube-controllers-77c7f8d9d8- calico-system c801dbfb-f6db-41ad-af94-26a4c142cc32 789 0 2025-05-27 03:20:02 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:77c7f8d9d8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-77c7f8d9d8-w26p7 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calicb7e7d2e434 [] [] }} ContainerID="169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" Namespace="calico-system" Pod="calico-kube-controllers-77c7f8d9d8-w26p7" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--77c7f8d9d8--w26p7-" May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.749 [INFO][4847] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" Namespace="calico-system" Pod="calico-kube-controllers-77c7f8d9d8-w26p7" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--77c7f8d9d8--w26p7-eth0" May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.775 [INFO][4861] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" HandleID="k8s-pod-network.169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" Workload="localhost-k8s-calico--kube--controllers--77c7f8d9d8--w26p7-eth0" May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.775 [INFO][4861] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" HandleID="k8s-pod-network.169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" Workload="localhost-k8s-calico--kube--controllers--77c7f8d9d8--w26p7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0xc0002ad6f0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-77c7f8d9d8-w26p7", "timestamp":"2025-05-27 03:20:39.775094042 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.775 [INFO][4861] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.775 [INFO][4861] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.775 [INFO][4861] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.782 [INFO][4861] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" host="localhost" May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.787 [INFO][4861] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.791 [INFO][4861] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.793 [INFO][4861] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.796 [INFO][4861] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.796 [INFO][4861] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 
handle="k8s-pod-network.169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" host="localhost" May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.798 [INFO][4861] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.807 [INFO][4861] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" host="localhost" May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.877 [INFO][4861] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" host="localhost" May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.878 [INFO][4861] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" host="localhost" May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.878 [INFO][4861] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 03:20:39.922583 containerd[1564]: 2025-05-27 03:20:39.878 [INFO][4861] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" HandleID="k8s-pod-network.169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" Workload="localhost-k8s-calico--kube--controllers--77c7f8d9d8--w26p7-eth0" May 27 03:20:39.923504 containerd[1564]: 2025-05-27 03:20:39.881 [INFO][4847] cni-plugin/k8s.go 418: Populated endpoint ContainerID="169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" Namespace="calico-system" Pod="calico-kube-controllers-77c7f8d9d8-w26p7" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--77c7f8d9d8--w26p7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--77c7f8d9d8--w26p7-eth0", GenerateName:"calico-kube-controllers-77c7f8d9d8-", Namespace:"calico-system", SelfLink:"", UID:"c801dbfb-f6db-41ad-af94-26a4c142cc32", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 20, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"77c7f8d9d8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-77c7f8d9d8-w26p7", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calicb7e7d2e434", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:20:39.923504 containerd[1564]: 2025-05-27 03:20:39.881 [INFO][4847] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" Namespace="calico-system" Pod="calico-kube-controllers-77c7f8d9d8-w26p7" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--77c7f8d9d8--w26p7-eth0" May 27 03:20:39.923504 containerd[1564]: 2025-05-27 03:20:39.881 [INFO][4847] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicb7e7d2e434 ContainerID="169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" Namespace="calico-system" Pod="calico-kube-controllers-77c7f8d9d8-w26p7" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--77c7f8d9d8--w26p7-eth0" May 27 03:20:39.923504 containerd[1564]: 2025-05-27 03:20:39.885 [INFO][4847] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" Namespace="calico-system" Pod="calico-kube-controllers-77c7f8d9d8-w26p7" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--77c7f8d9d8--w26p7-eth0" May 27 03:20:39.923504 containerd[1564]: 2025-05-27 03:20:39.885 [INFO][4847] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" Namespace="calico-system" Pod="calico-kube-controllers-77c7f8d9d8-w26p7" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--77c7f8d9d8--w26p7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--77c7f8d9d8--w26p7-eth0", GenerateName:"calico-kube-controllers-77c7f8d9d8-", Namespace:"calico-system", SelfLink:"", UID:"c801dbfb-f6db-41ad-af94-26a4c142cc32", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 20, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"77c7f8d9d8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa", Pod:"calico-kube-controllers-77c7f8d9d8-w26p7", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calicb7e7d2e434", MAC:"d6:e4:66:ee:2e:f4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:20:39.923504 containerd[1564]: 2025-05-27 03:20:39.916 [INFO][4847] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" Namespace="calico-system" Pod="calico-kube-controllers-77c7f8d9d8-w26p7" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--77c7f8d9d8--w26p7-eth0" May 27 03:20:40.012613 systemd-networkd[1475]: califad99b8ac90: Gained IPv6LL May 27 03:20:40.140643 systemd-networkd[1475]: cali4d16f520a3d: 
Gained IPv6LL May 27 03:20:40.383474 containerd[1564]: time="2025-05-27T03:20:40.383395039Z" level=info msg="connecting to shim 169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa" address="unix:///run/containerd/s/9864c1165f121da6df7ca6a39a76375b9fd59771386017159669b9e794e1c645" namespace=k8s.io protocol=ttrpc version=3 May 27 03:20:40.419638 systemd[1]: Started cri-containerd-169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa.scope - libcontainer container 169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa. May 27 03:20:40.435416 systemd-resolved[1413]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 03:20:40.524665 systemd-networkd[1475]: cali18c76d71279: Gained IPv6LL May 27 03:20:40.589466 containerd[1564]: time="2025-05-27T03:20:40.589382927Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77c7f8d9d8-w26p7,Uid:c801dbfb-f6db-41ad-af94-26a4c142cc32,Namespace:calico-system,Attempt:0,} returns sandbox id \"169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa\"" May 27 03:20:40.635665 containerd[1564]: time="2025-05-27T03:20:40.635028658Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-44gpm,Uid:d9e15705-91b0-4fa0-9d68-b56dc4cb8520,Namespace:kube-system,Attempt:0,}" May 27 03:20:40.916979 systemd-networkd[1475]: cali786c5bc3d4c: Link UP May 27 03:20:40.918524 systemd-networkd[1475]: cali786c5bc3d4c: Gained carrier May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.720 [INFO][4935] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--44gpm-eth0 coredns-668d6bf9bc- kube-system d9e15705-91b0-4fa0-9d68-b56dc4cb8520 792 0 2025-05-27 03:19:49 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-44gpm eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali786c5bc3d4c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" Namespace="kube-system" Pod="coredns-668d6bf9bc-44gpm" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--44gpm-" May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.721 [INFO][4935] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" Namespace="kube-system" Pod="coredns-668d6bf9bc-44gpm" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--44gpm-eth0" May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.777 [INFO][4950] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" HandleID="k8s-pod-network.9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" Workload="localhost-k8s-coredns--668d6bf9bc--44gpm-eth0" May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.777 [INFO][4950] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" HandleID="k8s-pod-network.9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" Workload="localhost-k8s-coredns--668d6bf9bc--44gpm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f660), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-44gpm", "timestamp":"2025-05-27 03:20:40.777421792 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:20:40.958310 containerd[1564]: 
2025-05-27 03:20:40.777 [INFO][4950] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.777 [INFO][4950] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.777 [INFO][4950] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.790 [INFO][4950] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" host="localhost" May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.817 [INFO][4950] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.826 [INFO][4950] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.829 [INFO][4950] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.832 [INFO][4950] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.833 [INFO][4950] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" host="localhost" May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.835 [INFO][4950] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187 May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.845 [INFO][4950] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" 
host="localhost" May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.908 [INFO][4950] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" host="localhost" May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.908 [INFO][4950] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" host="localhost" May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.908 [INFO][4950] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 03:20:40.958310 containerd[1564]: 2025-05-27 03:20:40.908 [INFO][4950] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" HandleID="k8s-pod-network.9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" Workload="localhost-k8s-coredns--668d6bf9bc--44gpm-eth0" May 27 03:20:40.959183 containerd[1564]: 2025-05-27 03:20:40.912 [INFO][4935] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" Namespace="kube-system" Pod="coredns-668d6bf9bc-44gpm" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--44gpm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--44gpm-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"d9e15705-91b0-4fa0-9d68-b56dc4cb8520", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 19, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", 
"projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-44gpm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali786c5bc3d4c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:20:40.959183 containerd[1564]: 2025-05-27 03:20:40.912 [INFO][4935] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" Namespace="kube-system" Pod="coredns-668d6bf9bc-44gpm" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--44gpm-eth0" May 27 03:20:40.959183 containerd[1564]: 2025-05-27 03:20:40.912 [INFO][4935] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali786c5bc3d4c ContainerID="9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" Namespace="kube-system" Pod="coredns-668d6bf9bc-44gpm" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--44gpm-eth0" May 27 03:20:40.959183 containerd[1564]: 2025-05-27 03:20:40.919 [INFO][4935] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" Namespace="kube-system" Pod="coredns-668d6bf9bc-44gpm" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--44gpm-eth0" May 27 03:20:40.959183 containerd[1564]: 2025-05-27 03:20:40.920 [INFO][4935] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" Namespace="kube-system" Pod="coredns-668d6bf9bc-44gpm" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--44gpm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--44gpm-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"d9e15705-91b0-4fa0-9d68-b56dc4cb8520", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 19, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187", Pod:"coredns-668d6bf9bc-44gpm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali786c5bc3d4c", MAC:"fa:4f:83:4e:24:12", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", 
Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:20:40.959183 containerd[1564]: 2025-05-27 03:20:40.952 [INFO][4935] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" Namespace="kube-system" Pod="coredns-668d6bf9bc-44gpm" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--44gpm-eth0" May 27 03:20:41.484624 systemd-networkd[1475]: calicb7e7d2e434: Gained IPv6LL May 27 03:20:41.508723 containerd[1564]: time="2025-05-27T03:20:41.508656888Z" level=info msg="connecting to shim 9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187" address="unix:///run/containerd/s/59404f782f6844cd10658e36bc1879bf40447999b5c2fb9286db236686fd69bf" namespace=k8s.io protocol=ttrpc version=3 May 27 03:20:41.541592 systemd[1]: Started cri-containerd-9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187.scope - libcontainer container 9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187. 
May 27 03:20:41.555109 systemd-resolved[1413]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
May 27 03:20:41.559994 containerd[1564]: time="2025-05-27T03:20:41.559922734Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:20:41.597541 containerd[1564]: time="2025-05-27T03:20:41.597466338Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=47252431"
May 27 03:20:41.621609 containerd[1564]: time="2025-05-27T03:20:41.621573277Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-44gpm,Uid:d9e15705-91b0-4fa0-9d68-b56dc4cb8520,Namespace:kube-system,Attempt:0,} returns sandbox id \"9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187\""
May 27 03:20:41.657197 containerd[1564]: time="2025-05-27T03:20:41.657149038Z" level=info msg="CreateContainer within sandbox \"9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
May 27 03:20:41.671257 containerd[1564]: time="2025-05-27T03:20:41.671173707Z" level=info msg="ImageCreate event name:\"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:20:41.744474 containerd[1564]: time="2025-05-27T03:20:41.744271202Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:20:41.745247 containerd[1564]: time="2025-05-27T03:20:41.745192330Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 4.467430006s"
May 27 03:20:41.745247 containerd[1564]: time="2025-05-27T03:20:41.745242455Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\""
May 27 03:20:41.746390 containerd[1564]: time="2025-05-27T03:20:41.746359410Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\""
May 27 03:20:41.748120 containerd[1564]: time="2025-05-27T03:20:41.748081582Z" level=info msg="CreateContainer within sandbox \"287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 27 03:20:41.788887 containerd[1564]: time="2025-05-27T03:20:41.788826845Z" level=info msg="Container 449c1641f591f4fde480b93f25e35129ed69ef5ba1eef082aea7467d053c17e0: CDI devices from CRI Config.CDIDevices: []"
May 27 03:20:41.801447 containerd[1564]: time="2025-05-27T03:20:41.801392435Z" level=info msg="Container c22177655efe1c6453969e1f4931db51c9c9d2fd33237ddfc65e24e040bba099: CDI devices from CRI Config.CDIDevices: []"
May 27 03:20:41.805149 containerd[1564]: time="2025-05-27T03:20:41.805108338Z" level=info msg="CreateContainer within sandbox \"9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"449c1641f591f4fde480b93f25e35129ed69ef5ba1eef082aea7467d053c17e0\""
May 27 03:20:41.806035 containerd[1564]: time="2025-05-27T03:20:41.805884595Z" level=info msg="StartContainer for \"449c1641f591f4fde480b93f25e35129ed69ef5ba1eef082aea7467d053c17e0\""
May 27 03:20:41.806948 containerd[1564]: time="2025-05-27T03:20:41.806920279Z" level=info msg="connecting to shim 449c1641f591f4fde480b93f25e35129ed69ef5ba1eef082aea7467d053c17e0" address="unix:///run/containerd/s/59404f782f6844cd10658e36bc1879bf40447999b5c2fb9286db236686fd69bf" protocol=ttrpc version=3
May 27 03:20:41.809555 containerd[1564]: time="2025-05-27T03:20:41.809418857Z" level=info msg="CreateContainer within sandbox \"287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c22177655efe1c6453969e1f4931db51c9c9d2fd33237ddfc65e24e040bba099\""
May 27 03:20:41.810537 containerd[1564]: time="2025-05-27T03:20:41.810502391Z" level=info msg="StartContainer for \"c22177655efe1c6453969e1f4931db51c9c9d2fd33237ddfc65e24e040bba099\""
May 27 03:20:41.811497 containerd[1564]: time="2025-05-27T03:20:41.811404294Z" level=info msg="connecting to shim c22177655efe1c6453969e1f4931db51c9c9d2fd33237ddfc65e24e040bba099" address="unix:///run/containerd/s/1bba10e10a4caee62fca7b421b3b91a8fa58ba9c684558091aa6f3c19e57d860" protocol=ttrpc version=3
May 27 03:20:41.830680 systemd[1]: Started cri-containerd-449c1641f591f4fde480b93f25e35129ed69ef5ba1eef082aea7467d053c17e0.scope - libcontainer container 449c1641f591f4fde480b93f25e35129ed69ef5ba1eef082aea7467d053c17e0.
May 27 03:20:41.836163 systemd[1]: Started cri-containerd-c22177655efe1c6453969e1f4931db51c9c9d2fd33237ddfc65e24e040bba099.scope - libcontainer container c22177655efe1c6453969e1f4931db51c9c9d2fd33237ddfc65e24e040bba099.
May 27 03:20:41.875613 containerd[1564]: time="2025-05-27T03:20:41.875561170Z" level=info msg="StartContainer for \"449c1641f591f4fde480b93f25e35129ed69ef5ba1eef082aea7467d053c17e0\" returns successfully"
May 27 03:20:41.902638 containerd[1564]: time="2025-05-27T03:20:41.902509492Z" level=info msg="StartContainer for \"c22177655efe1c6453969e1f4931db51c9c9d2fd33237ddfc65e24e040bba099\" returns successfully"
May 27 03:20:41.991893 containerd[1564]: time="2025-05-27T03:20:41.991816575Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 03:20:42.006470 containerd[1564]: time="2025-05-27T03:20:42.006336152Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden"
May 27 03:20:42.006470 containerd[1564]: time="2025-05-27T03:20:42.006418436Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86"
May 27 03:20:42.006774 kubelet[2684]: E0527 03:20:42.006718 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 03:20:42.007350 kubelet[2684]: E0527 03:20:42.006787 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 03:20:42.007660 containerd[1564]: time="2025-05-27T03:20:42.007124431Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\""
May 27 03:20:42.008864 kubelet[2684]: E0527 03:20:42.008794 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7sf2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-nkjnr_calico-system(a883863a-d79b-4a80-911d-ab857d7d891b): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 03:20:42.010081 kubelet[2684]: E0527 03:20:42.010011 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b"
May 27 03:20:42.828886 systemd-networkd[1475]: cali786c5bc3d4c: Gained IPv6LL
May 27 03:20:42.879756 kubelet[2684]: E0527 03:20:42.879582 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b"
May 27 03:20:42.889027 kubelet[2684]: I0527 03:20:42.888940 2684 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-bfbf6b5b6-7ws6p" podStartSLOduration=39.420057734 podStartE2EDuration="43.888915509s" podCreationTimestamp="2025-05-27 03:19:59 +0000 UTC" firstStartedPulling="2025-05-27 03:20:37.277356242 +0000 UTC m=+52.735396469" lastFinishedPulling="2025-05-27 03:20:41.746214017 +0000 UTC m=+57.204254244" observedRunningTime="2025-05-27 03:20:42.887593067 +0000 UTC m=+58.345633304" watchObservedRunningTime="2025-05-27 03:20:42.888915509 +0000 UTC m=+58.346955736"
May 27 03:20:42.911587 kubelet[2684]: I0527 03:20:42.911497 2684 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-44gpm" podStartSLOduration=53.911468371 podStartE2EDuration="53.911468371s" podCreationTimestamp="2025-05-27 03:19:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:20:42.909808115 +0000 UTC m=+58.367848343" watchObservedRunningTime="2025-05-27 03:20:42.911468371 +0000 UTC m=+58.369508588"
May 27 03:20:43.762234 systemd[1]: Started sshd@10-10.0.0.98:22-10.0.0.1:39752.service - OpenSSH per-connection server daemon (10.0.0.1:39752).
May 27 03:20:43.832133 sshd[5099]: Accepted publickey for core from 10.0.0.1 port 39752 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:20:43.834021 sshd-session[5099]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:20:43.838913 systemd-logind[1549]: New session 11 of user core.
May 27 03:20:43.848594 systemd[1]: Started session-11.scope - Session 11 of User core.
May 27 03:20:43.880127 kubelet[2684]: I0527 03:20:43.880085 2684 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 27 03:20:44.029083 sshd[5101]: Connection closed by 10.0.0.1 port 39752
May 27 03:20:44.029387 sshd-session[5099]: pam_unix(sshd:session): session closed for user core
May 27 03:20:44.042556 systemd[1]: sshd@10-10.0.0.98:22-10.0.0.1:39752.service: Deactivated successfully.
May 27 03:20:44.044667 systemd[1]: session-11.scope: Deactivated successfully.
May 27 03:20:44.045480 systemd-logind[1549]: Session 11 logged out. Waiting for processes to exit.
May 27 03:20:44.049013 systemd[1]: Started sshd@11-10.0.0.98:22-10.0.0.1:39768.service - OpenSSH per-connection server daemon (10.0.0.1:39768).
May 27 03:20:44.049660 systemd-logind[1549]: Removed session 11.
May 27 03:20:44.110499 sshd[5115]: Accepted publickey for core from 10.0.0.1 port 39768 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:20:44.112013 sshd-session[5115]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:20:44.116830 systemd-logind[1549]: New session 12 of user core.
May 27 03:20:44.127584 systemd[1]: Started session-12.scope - Session 12 of User core.
May 27 03:20:44.341140 sshd[5117]: Connection closed by 10.0.0.1 port 39768
May 27 03:20:44.357697 systemd[1]: Started sshd@12-10.0.0.98:22-10.0.0.1:39780.service - OpenSSH per-connection server daemon (10.0.0.1:39780).
May 27 03:20:44.408165 sshd[5129]: Accepted publickey for core from 10.0.0.1 port 39780 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:20:44.424025 sshd-session[5115]: pam_unix(sshd:session): session closed for user core
May 27 03:20:44.424286 sshd-session[5129]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:20:44.429738 systemd[1]: sshd@11-10.0.0.98:22-10.0.0.1:39768.service: Deactivated successfully.
May 27 03:20:44.433131 systemd[1]: session-12.scope: Deactivated successfully.
May 27 03:20:44.436247 containerd[1564]: time="2025-05-27T03:20:44.436201992Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:20:44.437177 systemd-logind[1549]: New session 13 of user core.
May 27 03:20:44.438286 containerd[1564]: time="2025-05-27T03:20:44.437502733Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.0: active requests=0, bytes read=8758390"
May 27 03:20:44.440638 containerd[1564]: time="2025-05-27T03:20:44.440516548Z" level=info msg="ImageCreate event name:\"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:20:44.444618 containerd[1564]: time="2025-05-27T03:20:44.444567940Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:20:44.445874 containerd[1564]: time="2025-05-27T03:20:44.445828636Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.0\" with image id \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\", size \"10251093\" in 2.438678647s"
May 27 03:20:44.445874 containerd[1564]: time="2025-05-27T03:20:44.445860586Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\" returns image reference \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\""
May 27 03:20:44.447014 systemd[1]: Started session-13.scope - Session 13 of User core.
May 27 03:20:44.448848 containerd[1564]: time="2025-05-27T03:20:44.448803327Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\""
May 27 03:20:44.451932 containerd[1564]: time="2025-05-27T03:20:44.451859843Z" level=info msg="CreateContainer within sandbox \"6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
May 27 03:20:44.452279 systemd-logind[1549]: Session 12 logged out. Waiting for processes to exit.
May 27 03:20:44.455108 systemd-logind[1549]: Removed session 12.
May 27 03:20:44.471641 containerd[1564]: time="2025-05-27T03:20:44.471578122Z" level=info msg="Container fdd168b94199d75c781031878488439ee72249626b661bb8dc3dc56a4ff4071a: CDI devices from CRI Config.CDIDevices: []"
May 27 03:20:44.481694 containerd[1564]: time="2025-05-27T03:20:44.481623873Z" level=info msg="CreateContainer within sandbox \"6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"fdd168b94199d75c781031878488439ee72249626b661bb8dc3dc56a4ff4071a\""
May 27 03:20:44.482273 containerd[1564]: time="2025-05-27T03:20:44.482240570Z" level=info msg="StartContainer for \"fdd168b94199d75c781031878488439ee72249626b661bb8dc3dc56a4ff4071a\""
May 27 03:20:44.489171 containerd[1564]: time="2025-05-27T03:20:44.489120089Z" level=info msg="connecting to shim fdd168b94199d75c781031878488439ee72249626b661bb8dc3dc56a4ff4071a" address="unix:///run/containerd/s/d4d2e2a293fed8138f7f9bce2bea6f22cfe69980be39d2f8c955d2bf452fa8c0" protocol=ttrpc version=3
May 27 03:20:44.513683 systemd[1]: Started cri-containerd-fdd168b94199d75c781031878488439ee72249626b661bb8dc3dc56a4ff4071a.scope - libcontainer container fdd168b94199d75c781031878488439ee72249626b661bb8dc3dc56a4ff4071a.
May 27 03:20:44.587158 sshd[5134]: Connection closed by 10.0.0.1 port 39780
May 27 03:20:44.605861 sshd-session[5129]: pam_unix(sshd:session): session closed for user core
May 27 03:20:44.610096 containerd[1564]: time="2025-05-27T03:20:44.610061448Z" level=info msg="StartContainer for \"fdd168b94199d75c781031878488439ee72249626b661bb8dc3dc56a4ff4071a\" returns successfully"
May 27 03:20:44.611019 systemd[1]: sshd@12-10.0.0.98:22-10.0.0.1:39780.service: Deactivated successfully.
May 27 03:20:44.614068 systemd[1]: session-13.scope: Deactivated successfully.
May 27 03:20:44.615856 systemd-logind[1549]: Session 13 logged out. Waiting for processes to exit.
May 27 03:20:44.617274 systemd-logind[1549]: Removed session 13.
May 27 03:20:44.839134 containerd[1564]: time="2025-05-27T03:20:44.839053078Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:20:44.839840 containerd[1564]: time="2025-05-27T03:20:44.839791213Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=77"
May 27 03:20:44.841493 containerd[1564]: time="2025-05-27T03:20:44.841458331Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 392.59408ms"
May 27 03:20:44.841493 containerd[1564]: time="2025-05-27T03:20:44.841492445Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\""
May 27 03:20:44.842765 containerd[1564]: time="2025-05-27T03:20:44.842738203Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\""
May 27 03:20:44.844896 containerd[1564]: time="2025-05-27T03:20:44.844094228Z" level=info msg="CreateContainer within sandbox \"4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 27 03:20:44.853619 containerd[1564]: time="2025-05-27T03:20:44.853563917Z" level=info msg="Container 9fa73b31e1aa64ce6e2bf19d30bea49b1f3e81d0f75f4f8c27826e0ad7615306: CDI devices from CRI Config.CDIDevices: []"
May 27 03:20:44.863267 containerd[1564]: time="2025-05-27T03:20:44.862827069Z" level=info msg="CreateContainer within sandbox \"4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"9fa73b31e1aa64ce6e2bf19d30bea49b1f3e81d0f75f4f8c27826e0ad7615306\""
May 27 03:20:44.863581 containerd[1564]: time="2025-05-27T03:20:44.863488580Z" level=info msg="StartContainer for \"9fa73b31e1aa64ce6e2bf19d30bea49b1f3e81d0f75f4f8c27826e0ad7615306\""
May 27 03:20:44.870334 containerd[1564]: time="2025-05-27T03:20:44.870279382Z" level=info msg="connecting to shim 9fa73b31e1aa64ce6e2bf19d30bea49b1f3e81d0f75f4f8c27826e0ad7615306" address="unix:///run/containerd/s/21c7c00840131ba847ccf3135fc8d21567bb21f617bc2f7ffd571744b7e72a10" protocol=ttrpc version=3
May 27 03:20:44.895577 systemd[1]: Started cri-containerd-9fa73b31e1aa64ce6e2bf19d30bea49b1f3e81d0f75f4f8c27826e0ad7615306.scope - libcontainer container 9fa73b31e1aa64ce6e2bf19d30bea49b1f3e81d0f75f4f8c27826e0ad7615306.
May 27 03:20:44.942200 containerd[1564]: time="2025-05-27T03:20:44.941890797Z" level=info msg="StartContainer for \"9fa73b31e1aa64ce6e2bf19d30bea49b1f3e81d0f75f4f8c27826e0ad7615306\" returns successfully"
May 27 03:20:45.924982 kubelet[2684]: I0527 03:20:45.924895 2684 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-bfbf6b5b6-hk6ml" podStartSLOduration=41.412816405 podStartE2EDuration="46.92486606s" podCreationTimestamp="2025-05-27 03:19:59 +0000 UTC" firstStartedPulling="2025-05-27 03:20:39.33049789 +0000 UTC m=+54.788538117" lastFinishedPulling="2025-05-27 03:20:44.842547545 +0000 UTC m=+60.300587772" observedRunningTime="2025-05-27 03:20:45.924326578 +0000 UTC m=+61.382366805" watchObservedRunningTime="2025-05-27 03:20:45.92486606 +0000 UTC m=+61.382906287"
May 27 03:20:46.893246 kubelet[2684]: I0527 03:20:46.893201 2684 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 27 03:20:48.165644 containerd[1564]: time="2025-05-27T03:20:48.165573864Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:20:48.167112 containerd[1564]: time="2025-05-27T03:20:48.167080559Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.0: active requests=0, bytes read=51178512"
May 27 03:20:48.168450 containerd[1564]: time="2025-05-27T03:20:48.168377631Z" level=info msg="ImageCreate event name:\"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:20:48.170523 containerd[1564]: time="2025-05-27T03:20:48.170472796Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:20:48.171244 containerd[1564]: time="2025-05-27T03:20:48.171199603Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" with image id \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\", size \"52671183\" in 3.32842958s"
May 27 03:20:48.171244 containerd[1564]: time="2025-05-27T03:20:48.171242526Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" returns image reference \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\""
May 27 03:20:48.173495 containerd[1564]: time="2025-05-27T03:20:48.173450327Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\""
May 27 03:20:48.187908 containerd[1564]: time="2025-05-27T03:20:48.187844681Z" level=info msg="CreateContainer within sandbox \"169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
May 27 03:20:48.197177 containerd[1564]: time="2025-05-27T03:20:48.197112517Z" level=info msg="Container b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956: CDI devices from CRI Config.CDIDevices: []"
May 27 03:20:48.205994 containerd[1564]: time="2025-05-27T03:20:48.205936339Z" level=info msg="CreateContainer within sandbox \"169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\""
May 27 03:20:48.206771 containerd[1564]: time="2025-05-27T03:20:48.206732920Z" level=info msg="StartContainer for \"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\""
May 27 03:20:48.207917 containerd[1564]: time="2025-05-27T03:20:48.207890454Z" level=info msg="connecting to shim b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956" address="unix:///run/containerd/s/9864c1165f121da6df7ca6a39a76375b9fd59771386017159669b9e794e1c645" protocol=ttrpc version=3
May 27 03:20:48.234579 systemd[1]: Started cri-containerd-b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956.scope - libcontainer container b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956.
May 27 03:20:48.354149 containerd[1564]: time="2025-05-27T03:20:48.354088671Z" level=info msg="StartContainer for \"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" returns successfully"
May 27 03:20:48.424797 containerd[1564]: time="2025-05-27T03:20:48.424641618Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 03:20:48.426129 containerd[1564]: time="2025-05-27T03:20:48.426079130Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden"
May 27 03:20:48.426254 containerd[1564]: time="2025-05-27T03:20:48.426174202Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86"
May 27 03:20:48.426465 kubelet[2684]: E0527 03:20:48.426390 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0"
May 27 03:20:48.426904 kubelet[2684]: E0527 03:20:48.426486 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0"
May 27 03:20:48.426904 kubelet[2684]: E0527 03:20:48.426721 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:767452f2f0924610a3259c1e7259b93b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kdkgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-556d57579d-8sdgb_calico-system(2c1ae830-ec2d-4322-b9d4-cc52915b6648): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 03:20:48.427008 containerd[1564]: time="2025-05-27T03:20:48.426892412Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\""
May 27 03:20:48.910808 kubelet[2684]: I0527 03:20:48.910572 2684 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-77c7f8d9d8-w26p7" podStartSLOduration=39.328400328 podStartE2EDuration="46.91055123s" podCreationTimestamp="2025-05-27 03:20:02 +0000 UTC" firstStartedPulling="2025-05-27 03:20:40.591056909 +0000 UTC m=+56.049097136" lastFinishedPulling="2025-05-27 03:20:48.173207821 +0000 UTC m=+63.631248038" observedRunningTime="2025-05-27 03:20:48.90982202 +0000 UTC m=+64.367862247" watchObservedRunningTime="2025-05-27 03:20:48.91055123 +0000 UTC m=+64.368591457"
May 27 03:20:48.951846 containerd[1564]: time="2025-05-27T03:20:48.951790911Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"0bfdc3863a16464440077c6a205a42e8a5401c8b52f7c97419d6b8de98b77d15\" pid:5284 exited_at:{seconds:1748316048 nanos:951140781}"
May 27 03:20:49.602995 systemd[1]: Started sshd@13-10.0.0.98:22-10.0.0.1:39784.service - OpenSSH per-connection server daemon (10.0.0.1:39784).
May 27 03:20:49.665135 sshd[5297]: Accepted publickey for core from 10.0.0.1 port 39784 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:20:49.666908 sshd-session[5297]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:20:49.671479 systemd-logind[1549]: New session 14 of user core.
May 27 03:20:49.678645 systemd[1]: Started session-14.scope - Session 14 of User core.
May 27 03:20:49.814303 sshd[5299]: Connection closed by 10.0.0.1 port 39784
May 27 03:20:49.814682 sshd-session[5297]: pam_unix(sshd:session): session closed for user core
May 27 03:20:49.819307 systemd[1]: sshd@13-10.0.0.98:22-10.0.0.1:39784.service: Deactivated successfully.
May 27 03:20:49.821669 systemd[1]: session-14.scope: Deactivated successfully.
May 27 03:20:49.822693 systemd-logind[1549]: Session 14 logged out. Waiting for processes to exit.
May 27 03:20:49.824672 systemd-logind[1549]: Removed session 14.
May 27 03:20:50.484738 containerd[1564]: time="2025-05-27T03:20:50.484660254Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:20:50.485392 containerd[1564]: time="2025-05-27T03:20:50.485328587Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0: active requests=0, bytes read=14705639"
May 27 03:20:50.486662 containerd[1564]: time="2025-05-27T03:20:50.486603453Z" level=info msg="ImageCreate event name:\"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:20:50.488539 containerd[1564]: time="2025-05-27T03:20:50.488468190Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:20:50.489083 containerd[1564]: time="2025-05-27T03:20:50.489037994Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" with image id \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\", size \"16198294\" in 2.062107688s"
May 27 03:20:50.489083 containerd[1564]: time="2025-05-27T03:20:50.489070957Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" returns image reference \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\""
May 27 03:20:50.492283 containerd[1564]: time="2025-05-27T03:20:50.492243605Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\""
May 27 03:20:50.505314 containerd[1564]: time="2025-05-27T03:20:50.505247996Z" level=info msg="CreateContainer within sandbox \"6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
May 27 03:20:50.516521 containerd[1564]: time="2025-05-27T03:20:50.516462022Z" level=info msg="Container 664c5053951c221cd53bdef5e2d8c39fb64803028bc6d45dec2906eb0662caa0: CDI devices from CRI Config.CDIDevices: []"
May 27 03:20:50.527247 containerd[1564]: time="2025-05-27T03:20:50.527192412Z" level=info msg="CreateContainer within sandbox \"6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"664c5053951c221cd53bdef5e2d8c39fb64803028bc6d45dec2906eb0662caa0\""
May 27 03:20:50.528009 containerd[1564]: time="2025-05-27T03:20:50.527737106Z" level=info msg="StartContainer for \"664c5053951c221cd53bdef5e2d8c39fb64803028bc6d45dec2906eb0662caa0\""
May 27 03:20:50.529522 containerd[1564]: time="2025-05-27T03:20:50.529493306Z" level=info msg="connecting to shim 664c5053951c221cd53bdef5e2d8c39fb64803028bc6d45dec2906eb0662caa0" address="unix:///run/containerd/s/d4d2e2a293fed8138f7f9bce2bea6f22cfe69980be39d2f8c955d2bf452fa8c0" protocol=ttrpc version=3
May 27 03:20:50.557680 systemd[1]: Started cri-containerd-664c5053951c221cd53bdef5e2d8c39fb64803028bc6d45dec2906eb0662caa0.scope - libcontainer container 664c5053951c221cd53bdef5e2d8c39fb64803028bc6d45dec2906eb0662caa0.
May 27 03:20:50.603830 containerd[1564]: time="2025-05-27T03:20:50.603781344Z" level=info msg="StartContainer for \"664c5053951c221cd53bdef5e2d8c39fb64803028bc6d45dec2906eb0662caa0\" returns successfully"
May 27 03:20:50.720803 containerd[1564]: time="2025-05-27T03:20:50.720710438Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 03:20:50.722033 containerd[1564]: time="2025-05-27T03:20:50.721971869Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden"
May 27 03:20:50.722246 containerd[1564]: time="2025-05-27T03:20:50.722081929Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86"
May 27 03:20:50.722326 kubelet[2684]: E0527 03:20:50.722260 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0"
May 27 03:20:50.722804 kubelet[2684]: E0527 03:20:50.722339 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0"
May 27 03:20:50.724758 kubelet[2684]: E0527 03:20:50.724687 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kdkgp,ReadOnly:true,MountPath:/var/run/sec
rets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-556d57579d-8sdgb_calico-system(2c1ae830-ec2d-4322-b9d4-cc52915b6648): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:20:50.726568 kubelet[2684]: E0527 03:20:50.726505 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:20:50.769664 kubelet[2684]: I0527 03:20:50.769490 2684 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 27 03:20:50.769664 kubelet[2684]: I0527 03:20:50.769532 2684 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 27 03:20:51.114487 kubelet[2684]: I0527 03:20:51.114178 2684 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-86zzn" podStartSLOduration=37.627061881 podStartE2EDuration="49.11415853s" podCreationTimestamp="2025-05-27 03:20:02 +0000 UTC" firstStartedPulling="2025-05-27 03:20:39.004933346 +0000 UTC m=+54.462973573" lastFinishedPulling="2025-05-27 03:20:50.492029995 +0000 UTC m=+65.950070222" observedRunningTime="2025-05-27 03:20:51.113586714 +0000 UTC m=+66.571626961" watchObservedRunningTime="2025-05-27 03:20:51.11415853 +0000 UTC m=+66.572198757" May 27 03:20:52.759545 kubelet[2684]: I0527 03:20:52.759497 2684 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:20:54.828398 systemd[1]: Started sshd@14-10.0.0.98:22-10.0.0.1:44510.service - OpenSSH per-connection server daemon (10.0.0.1:44510). 
May 27 03:20:54.900841 sshd[5362]: Accepted publickey for core from 10.0.0.1 port 44510 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:20:54.902326 sshd-session[5362]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:20:54.906789 systemd-logind[1549]: New session 15 of user core. May 27 03:20:54.915582 systemd[1]: Started session-15.scope - Session 15 of User core. May 27 03:20:55.042312 sshd[5364]: Connection closed by 10.0.0.1 port 44510 May 27 03:20:55.042702 sshd-session[5362]: pam_unix(sshd:session): session closed for user core May 27 03:20:55.047753 systemd[1]: sshd@14-10.0.0.98:22-10.0.0.1:44510.service: Deactivated successfully. May 27 03:20:55.050171 systemd[1]: session-15.scope: Deactivated successfully. May 27 03:20:55.051129 systemd-logind[1549]: Session 15 logged out. Waiting for processes to exit. May 27 03:20:55.053115 systemd-logind[1549]: Removed session 15. May 27 03:20:55.630802 containerd[1564]: time="2025-05-27T03:20:55.630741609Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 03:20:55.874415 containerd[1564]: time="2025-05-27T03:20:55.874360689Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:20:55.882570 containerd[1564]: time="2025-05-27T03:20:55.882448411Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 
03:20:55.882570 containerd[1564]: time="2025-05-27T03:20:55.882541579Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 03:20:55.882728 kubelet[2684]: E0527 03:20:55.882673 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:20:55.882728 kubelet[2684]: E0527 03:20:55.882723 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:20:55.883132 kubelet[2684]: E0527 03:20:55.882870 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7sf2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-nkjnr_calico-system(a883863a-d79b-4a80-911d-ab857d7d891b): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:20:55.884139 kubelet[2684]: E0527 03:20:55.884074 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:20:58.912714 containerd[1564]: 
time="2025-05-27T03:20:58.912658200Z" level=info msg="TaskExit event in podsandbox handler container_id:\"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\" id:\"e91d759f4904b7997adf7a8c50808c1fc8ef1f9e6e09c611a8f7c0857df9da97\" pid:5390 exited_at:{seconds:1748316058 nanos:912222528}" May 27 03:21:00.062663 systemd[1]: Started sshd@15-10.0.0.98:22-10.0.0.1:44526.service - OpenSSH per-connection server daemon (10.0.0.1:44526). May 27 03:21:00.124817 sshd[5403]: Accepted publickey for core from 10.0.0.1 port 44526 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:21:00.126577 sshd-session[5403]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:21:00.131278 systemd-logind[1549]: New session 16 of user core. May 27 03:21:00.138590 systemd[1]: Started session-16.scope - Session 16 of User core. May 27 03:21:00.265783 sshd[5405]: Connection closed by 10.0.0.1 port 44526 May 27 03:21:00.266767 sshd-session[5403]: pam_unix(sshd:session): session closed for user core May 27 03:21:00.272693 systemd-logind[1549]: Session 16 logged out. Waiting for processes to exit. May 27 03:21:00.273509 systemd[1]: sshd@15-10.0.0.98:22-10.0.0.1:44526.service: Deactivated successfully. May 27 03:21:00.275777 systemd[1]: session-16.scope: Deactivated successfully. May 27 03:21:00.277800 systemd-logind[1549]: Removed session 16. 
May 27 03:21:01.631998 kubelet[2684]: E0527 03:21:01.631938 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:21:05.280375 systemd[1]: Started sshd@16-10.0.0.98:22-10.0.0.1:35814.service - OpenSSH per-connection server daemon (10.0.0.1:35814). May 27 03:21:05.346426 sshd[5421]: Accepted publickey for core from 10.0.0.1 port 35814 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:21:05.348087 sshd-session[5421]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:21:05.352431 systemd-logind[1549]: New session 17 of user core. May 27 03:21:05.362595 systemd[1]: Started session-17.scope - Session 17 of User core. 
May 27 03:21:05.496247 sshd[5423]: Connection closed by 10.0.0.1 port 35814 May 27 03:21:05.496601 sshd-session[5421]: pam_unix(sshd:session): session closed for user core May 27 03:21:05.500582 systemd[1]: sshd@16-10.0.0.98:22-10.0.0.1:35814.service: Deactivated successfully. May 27 03:21:05.502752 systemd[1]: session-17.scope: Deactivated successfully. May 27 03:21:05.503630 systemd-logind[1549]: Session 17 logged out. Waiting for processes to exit. May 27 03:21:05.505043 systemd-logind[1549]: Removed session 17. May 27 03:21:05.581181 kubelet[2684]: I0527 03:21:05.581047 2684 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:21:09.631539 kubelet[2684]: E0527 03:21:09.631426 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:21:10.517540 systemd[1]: Started sshd@17-10.0.0.98:22-10.0.0.1:35818.service - OpenSSH per-connection server daemon (10.0.0.1:35818). May 27 03:21:10.578764 sshd[5446]: Accepted publickey for core from 10.0.0.1 port 35818 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:21:10.580293 sshd-session[5446]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:21:10.585046 systemd-logind[1549]: New session 18 of user core. May 27 03:21:10.594567 systemd[1]: Started session-18.scope - Session 18 of User core. 
May 27 03:21:10.707666 sshd[5448]: Connection closed by 10.0.0.1 port 35818 May 27 03:21:10.708067 sshd-session[5446]: pam_unix(sshd:session): session closed for user core May 27 03:21:10.713384 systemd[1]: sshd@17-10.0.0.98:22-10.0.0.1:35818.service: Deactivated successfully. May 27 03:21:10.716271 systemd[1]: session-18.scope: Deactivated successfully. May 27 03:21:10.717230 systemd-logind[1549]: Session 18 logged out. Waiting for processes to exit. May 27 03:21:10.718895 systemd-logind[1549]: Removed session 18. May 27 03:21:15.630869 containerd[1564]: time="2025-05-27T03:21:15.630797650Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 03:21:15.732564 systemd[1]: Started sshd@18-10.0.0.98:22-10.0.0.1:44732.service - OpenSSH per-connection server daemon (10.0.0.1:44732). May 27 03:21:15.782158 sshd[5461]: Accepted publickey for core from 10.0.0.1 port 44732 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:21:15.783709 sshd-session[5461]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:21:15.788070 systemd-logind[1549]: New session 19 of user core. May 27 03:21:15.799582 systemd[1]: Started session-19.scope - Session 19 of User core. 
May 27 03:21:15.922958 containerd[1564]: time="2025-05-27T03:21:15.922507946Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:21:15.923719 containerd[1564]: time="2025-05-27T03:21:15.923680592Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:21:15.923806 containerd[1564]: time="2025-05-27T03:21:15.923758259Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 03:21:15.924033 kubelet[2684]: E0527 03:21:15.923946 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:21:15.924033 kubelet[2684]: E0527 03:21:15.924032 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:21:15.924743 kubelet[2684]: E0527 03:21:15.924155 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:767452f2f0924610a3259c1e7259b93b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kdkgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-556d57579d-8sdgb_calico-system(2c1ae830-ec2d-4322-b9d4-cc52915b6648): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:21:15.927144 containerd[1564]: time="2025-05-27T03:21:15.927097076Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 03:21:15.929546 sshd[5463]: Connection closed by 10.0.0.1 port 44732 May 27 03:21:15.930277 sshd-session[5461]: pam_unix(sshd:session): session closed for user core May 27 03:21:15.935947 systemd-logind[1549]: Session 19 logged out. Waiting for processes to exit. May 27 03:21:15.936135 systemd[1]: sshd@18-10.0.0.98:22-10.0.0.1:44732.service: Deactivated successfully. May 27 03:21:15.938647 systemd[1]: session-19.scope: Deactivated successfully. May 27 03:21:15.940524 systemd-logind[1549]: Removed session 19. May 27 03:21:16.170398 containerd[1564]: time="2025-05-27T03:21:16.170338360Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:21:16.171684 containerd[1564]: time="2025-05-27T03:21:16.171652604Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:21:16.171775 containerd[1564]: time="2025-05-27T03:21:16.171719060Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes 
read=86" May 27 03:21:16.171930 kubelet[2684]: E0527 03:21:16.171877 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:21:16.172022 kubelet[2684]: E0527 03:21:16.171939 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:21:16.172159 kubelet[2684]: E0527 03:21:16.172117 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kdkgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-556d57579d-8sdgb_calico-system(2c1ae830-ec2d-4322-b9d4-cc52915b6648): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": 
failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:21:16.173452 kubelet[2684]: E0527 03:21:16.173323 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:21:18.951591 containerd[1564]: time="2025-05-27T03:21:18.951530312Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"43c0587bcbc315a32695642477887b252ad6db3cfa1ad6dcbb89bf15b0e8a9fd\" pid:5487 exited_at:{seconds:1748316078 nanos:951094965}" May 27 03:21:20.942849 systemd[1]: Started sshd@19-10.0.0.98:22-10.0.0.1:44744.service - OpenSSH per-connection server daemon (10.0.0.1:44744). 
May 27 03:21:21.009563 sshd[5501]: Accepted publickey for core from 10.0.0.1 port 44744 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:21:21.011295 sshd-session[5501]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:21:21.017473 systemd-logind[1549]: New session 20 of user core. May 27 03:21:21.024563 systemd[1]: Started session-20.scope - Session 20 of User core. May 27 03:21:21.153896 sshd[5505]: Connection closed by 10.0.0.1 port 44744 May 27 03:21:21.154213 sshd-session[5501]: pam_unix(sshd:session): session closed for user core May 27 03:21:21.159010 systemd[1]: sshd@19-10.0.0.98:22-10.0.0.1:44744.service: Deactivated successfully. May 27 03:21:21.161012 systemd[1]: session-20.scope: Deactivated successfully. May 27 03:21:21.161786 systemd-logind[1549]: Session 20 logged out. Waiting for processes to exit. May 27 03:21:21.162984 systemd-logind[1549]: Removed session 20. May 27 03:21:23.630969 containerd[1564]: time="2025-05-27T03:21:23.630878583Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 03:21:23.918373 containerd[1564]: time="2025-05-27T03:21:23.918322796Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:21:23.948496 containerd[1564]: time="2025-05-27T03:21:23.948416295Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 
03:21:23.948647 containerd[1564]: time="2025-05-27T03:21:23.948530071Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 03:21:23.948851 kubelet[2684]: E0527 03:21:23.948772 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:21:23.949239 kubelet[2684]: E0527 03:21:23.948859 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:21:23.949239 kubelet[2684]: E0527 03:21:23.949021 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7sf2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-nkjnr_calico-system(a883863a-d79b-4a80-911d-ab857d7d891b): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:21:23.950460 kubelet[2684]: E0527 03:21:23.950368 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:21:25.814650 containerd[1564]: 
time="2025-05-27T03:21:25.814598682Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"a32784c804cdee2418ebf7d4ffde63f53ba05ffd9f29daa14192a97e03e53f6c\" pid:5531 exited_at:{seconds:1748316085 nanos:814395538}" May 27 03:21:26.166326 systemd[1]: Started sshd@20-10.0.0.98:22-10.0.0.1:47654.service - OpenSSH per-connection server daemon (10.0.0.1:47654). May 27 03:21:26.235143 sshd[5542]: Accepted publickey for core from 10.0.0.1 port 47654 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:21:26.236979 sshd-session[5542]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:21:26.245905 systemd-logind[1549]: New session 21 of user core. May 27 03:21:26.254638 systemd[1]: Started session-21.scope - Session 21 of User core. May 27 03:21:26.422592 sshd[5544]: Connection closed by 10.0.0.1 port 47654 May 27 03:21:26.423188 sshd-session[5542]: pam_unix(sshd:session): session closed for user core May 27 03:21:26.428083 systemd[1]: sshd@20-10.0.0.98:22-10.0.0.1:47654.service: Deactivated successfully. May 27 03:21:26.430688 systemd[1]: session-21.scope: Deactivated successfully. May 27 03:21:26.431735 systemd-logind[1549]: Session 21 logged out. Waiting for processes to exit. May 27 03:21:26.433624 systemd-logind[1549]: Removed session 21. May 27 03:21:28.927458 containerd[1564]: time="2025-05-27T03:21:28.927397395Z" level=info msg="TaskExit event in podsandbox handler container_id:\"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\" id:\"db2f9b07fa3c827f31560c50012ad49cb1f60b9d3c4d352dcd458b3cddf0a01d\" pid:5570 exited_at:{seconds:1748316088 nanos:927051981}" May 27 03:21:31.439162 systemd[1]: Started sshd@21-10.0.0.98:22-10.0.0.1:47662.service - OpenSSH per-connection server daemon (10.0.0.1:47662). 
May 27 03:21:31.521373 sshd[5583]: Accepted publickey for core from 10.0.0.1 port 47662 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:21:31.521978 sshd-session[5583]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:21:31.526530 systemd-logind[1549]: New session 22 of user core. May 27 03:21:31.535580 systemd[1]: Started session-22.scope - Session 22 of User core. May 27 03:21:31.633248 kubelet[2684]: E0527 03:21:31.633137 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:21:31.691657 sshd[5585]: Connection closed by 10.0.0.1 port 47662 May 27 03:21:31.692377 sshd-session[5583]: pam_unix(sshd:session): session closed for user core May 27 03:21:31.698396 systemd[1]: sshd@21-10.0.0.98:22-10.0.0.1:47662.service: Deactivated successfully. 
May 27 03:21:31.701532 systemd[1]: session-22.scope: Deactivated successfully. May 27 03:21:31.702369 systemd-logind[1549]: Session 22 logged out. Waiting for processes to exit. May 27 03:21:31.703768 systemd-logind[1549]: Removed session 22. May 27 03:21:35.630750 kubelet[2684]: E0527 03:21:35.630573 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:21:36.703490 systemd[1]: Started sshd@22-10.0.0.98:22-10.0.0.1:46820.service - OpenSSH per-connection server daemon (10.0.0.1:46820). May 27 03:21:36.767463 sshd[5599]: Accepted publickey for core from 10.0.0.1 port 46820 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:21:36.770895 sshd-session[5599]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:21:36.782171 systemd-logind[1549]: New session 23 of user core. May 27 03:21:36.786585 systemd[1]: Started session-23.scope - Session 23 of User core. May 27 03:21:36.948725 sshd[5601]: Connection closed by 10.0.0.1 port 46820 May 27 03:21:36.949070 sshd-session[5599]: pam_unix(sshd:session): session closed for user core May 27 03:21:36.954383 systemd[1]: sshd@22-10.0.0.98:22-10.0.0.1:46820.service: Deactivated successfully. May 27 03:21:36.956751 systemd[1]: session-23.scope: Deactivated successfully. May 27 03:21:36.957668 systemd-logind[1549]: Session 23 logged out. 
Waiting for processes to exit. May 27 03:21:36.959428 systemd-logind[1549]: Removed session 23. May 27 03:21:41.966150 systemd[1]: Started sshd@23-10.0.0.98:22-10.0.0.1:46822.service - OpenSSH per-connection server daemon (10.0.0.1:46822). May 27 03:21:42.019710 sshd[5617]: Accepted publickey for core from 10.0.0.1 port 46822 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:21:42.021725 sshd-session[5617]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:21:42.026551 systemd-logind[1549]: New session 24 of user core. May 27 03:21:42.032861 systemd[1]: Started session-24.scope - Session 24 of User core. May 27 03:21:42.158483 sshd[5619]: Connection closed by 10.0.0.1 port 46822 May 27 03:21:42.158928 sshd-session[5617]: pam_unix(sshd:session): session closed for user core May 27 03:21:42.163814 systemd-logind[1549]: Session 24 logged out. Waiting for processes to exit. May 27 03:21:42.164028 systemd[1]: sshd@23-10.0.0.98:22-10.0.0.1:46822.service: Deactivated successfully. May 27 03:21:42.166254 systemd[1]: session-24.scope: Deactivated successfully. May 27 03:21:42.168305 systemd-logind[1549]: Removed session 24. 
May 27 03:21:46.638154 kubelet[2684]: E0527 03:21:46.637943 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:21:47.175824 systemd[1]: Started sshd@24-10.0.0.98:22-10.0.0.1:35724.service - OpenSSH per-connection server daemon (10.0.0.1:35724). May 27 03:21:47.229153 sshd[5636]: Accepted publickey for core from 10.0.0.1 port 35724 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:21:47.231099 sshd-session[5636]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:21:47.236159 systemd-logind[1549]: New session 25 of user core. May 27 03:21:47.246603 systemd[1]: Started session-25.scope - Session 25 of User core. 
May 27 03:21:47.378267 sshd[5638]: Connection closed by 10.0.0.1 port 35724 May 27 03:21:47.378753 sshd-session[5636]: pam_unix(sshd:session): session closed for user core May 27 03:21:47.384106 systemd[1]: sshd@24-10.0.0.98:22-10.0.0.1:35724.service: Deactivated successfully. May 27 03:21:47.386399 systemd[1]: session-25.scope: Deactivated successfully. May 27 03:21:47.387195 systemd-logind[1549]: Session 25 logged out. Waiting for processes to exit. May 27 03:21:47.388554 systemd-logind[1549]: Removed session 25. May 27 03:21:48.630927 kubelet[2684]: E0527 03:21:48.630846 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:21:48.991049 containerd[1564]: time="2025-05-27T03:21:48.990973329Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"c2585b2914ee34c83369f617f62549d9f528807a59d072b1ecfa2431a701f716\" pid:5663 exited_at:{seconds:1748316108 nanos:990131470}" May 27 03:21:52.395571 systemd[1]: Started sshd@25-10.0.0.98:22-10.0.0.1:35740.service - OpenSSH per-connection server daemon (10.0.0.1:35740). 
May 27 03:21:52.475589 sshd[5683]: Accepted publickey for core from 10.0.0.1 port 35740 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:21:52.477729 sshd-session[5683]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:21:52.487789 systemd-logind[1549]: New session 26 of user core. May 27 03:21:52.494803 systemd[1]: Started session-26.scope - Session 26 of User core. May 27 03:21:52.742339 sshd[5685]: Connection closed by 10.0.0.1 port 35740 May 27 03:21:52.742815 sshd-session[5683]: pam_unix(sshd:session): session closed for user core May 27 03:21:52.749151 systemd-logind[1549]: Session 26 logged out. Waiting for processes to exit. May 27 03:21:52.749571 systemd[1]: sshd@25-10.0.0.98:22-10.0.0.1:35740.service: Deactivated successfully. May 27 03:21:52.752267 systemd[1]: session-26.scope: Deactivated successfully. May 27 03:21:52.754508 systemd-logind[1549]: Removed session 26. May 27 03:21:57.757957 systemd[1]: Started sshd@26-10.0.0.98:22-10.0.0.1:34864.service - OpenSSH per-connection server daemon (10.0.0.1:34864). May 27 03:21:57.815669 sshd[5699]: Accepted publickey for core from 10.0.0.1 port 34864 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:21:57.817542 sshd-session[5699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:21:57.822518 systemd-logind[1549]: New session 27 of user core. May 27 03:21:57.838714 systemd[1]: Started session-27.scope - Session 27 of User core. May 27 03:21:57.961933 sshd[5701]: Connection closed by 10.0.0.1 port 34864 May 27 03:21:57.962253 sshd-session[5699]: pam_unix(sshd:session): session closed for user core May 27 03:21:57.967108 systemd[1]: sshd@26-10.0.0.98:22-10.0.0.1:34864.service: Deactivated successfully. May 27 03:21:57.969571 systemd[1]: session-27.scope: Deactivated successfully. May 27 03:21:57.970534 systemd-logind[1549]: Session 27 logged out. Waiting for processes to exit. 
May 27 03:21:57.972034 systemd-logind[1549]: Removed session 27. May 27 03:21:58.914966 containerd[1564]: time="2025-05-27T03:21:58.914894865Z" level=info msg="TaskExit event in podsandbox handler container_id:\"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\" id:\"f05dd26558b3399049a01dd95b8be2d35a1c1ceba8df57c5488b97b940e57cc7\" pid:5726 exited_at:{seconds:1748316118 nanos:914501814}" May 27 03:22:00.632108 containerd[1564]: time="2025-05-27T03:22:00.632035126Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 03:22:00.867622 containerd[1564]: time="2025-05-27T03:22:00.867555404Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:22:00.868695 containerd[1564]: time="2025-05-27T03:22:00.868660176Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:22:00.868801 containerd[1564]: time="2025-05-27T03:22:00.868743203Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 03:22:00.869021 kubelet[2684]: E0527 03:22:00.868953 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET 
request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:22:00.869531 kubelet[2684]: E0527 03:22:00.869036 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:22:00.869531 kubelet[2684]: E0527 03:22:00.869183 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:767452f2f0924610a3259c1e7259b93b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kdkgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:Run
timeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-556d57579d-8sdgb_calico-system(2c1ae830-ec2d-4322-b9d4-cc52915b6648): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:22:00.871217 containerd[1564]: time="2025-05-27T03:22:00.871190296Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 03:22:01.122800 containerd[1564]: time="2025-05-27T03:22:01.122719082Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:22:01.124051 containerd[1564]: time="2025-05-27T03:22:01.123994655Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:22:01.124156 containerd[1564]: time="2025-05-27T03:22:01.124101627Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes 
read=86" May 27 03:22:01.125664 kubelet[2684]: E0527 03:22:01.125611 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:22:01.126012 kubelet[2684]: E0527 03:22:01.125674 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:22:01.126012 kubelet[2684]: E0527 03:22:01.125800 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kdkgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-556d57579d-8sdgb_calico-system(2c1ae830-ec2d-4322-b9d4-cc52915b6648): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": 
failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:22:01.127786 kubelet[2684]: E0527 03:22:01.127688 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:22:02.630787 kubelet[2684]: E0527 03:22:02.630718 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:22:02.976080 systemd[1]: Started sshd@27-10.0.0.98:22-10.0.0.1:34866.service - OpenSSH per-connection server daemon (10.0.0.1:34866). May 27 03:22:03.030386 sshd[5762]: Accepted publickey for core from 10.0.0.1 port 34866 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:22:03.032621 sshd-session[5762]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:22:03.038061 systemd-logind[1549]: New session 28 of user core. May 27 03:22:03.044072 systemd[1]: Started session-28.scope - Session 28 of User core. May 27 03:22:03.176506 sshd[5765]: Connection closed by 10.0.0.1 port 34866 May 27 03:22:03.176989 sshd-session[5762]: pam_unix(sshd:session): session closed for user core May 27 03:22:03.182599 systemd[1]: sshd@27-10.0.0.98:22-10.0.0.1:34866.service: Deactivated successfully. May 27 03:22:03.184835 systemd[1]: session-28.scope: Deactivated successfully. May 27 03:22:03.185683 systemd-logind[1549]: Session 28 logged out. Waiting for processes to exit. May 27 03:22:03.187278 systemd-logind[1549]: Removed session 28. May 27 03:22:08.197329 systemd[1]: Started sshd@28-10.0.0.98:22-10.0.0.1:47332.service - OpenSSH per-connection server daemon (10.0.0.1:47332). May 27 03:22:08.260932 sshd[5779]: Accepted publickey for core from 10.0.0.1 port 47332 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:22:08.262512 sshd-session[5779]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:22:08.268509 systemd-logind[1549]: New session 29 of user core. May 27 03:22:08.276592 systemd[1]: Started session-29.scope - Session 29 of User core. 
May 27 03:22:08.397096 sshd[5783]: Connection closed by 10.0.0.1 port 47332 May 27 03:22:08.397427 sshd-session[5779]: pam_unix(sshd:session): session closed for user core May 27 03:22:08.401880 systemd[1]: sshd@28-10.0.0.98:22-10.0.0.1:47332.service: Deactivated successfully. May 27 03:22:08.404102 systemd[1]: session-29.scope: Deactivated successfully. May 27 03:22:08.404924 systemd-logind[1549]: Session 29 logged out. Waiting for processes to exit. May 27 03:22:08.406256 systemd-logind[1549]: Removed session 29. May 27 03:22:13.410041 systemd[1]: Started sshd@29-10.0.0.98:22-10.0.0.1:47334.service - OpenSSH per-connection server daemon (10.0.0.1:47334). May 27 03:22:13.474287 sshd[5796]: Accepted publickey for core from 10.0.0.1 port 47334 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:22:13.475883 sshd-session[5796]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:22:13.480875 systemd-logind[1549]: New session 30 of user core. May 27 03:22:13.496569 systemd[1]: Started session-30.scope - Session 30 of User core. May 27 03:22:13.608489 sshd[5798]: Connection closed by 10.0.0.1 port 47334 May 27 03:22:13.608831 sshd-session[5796]: pam_unix(sshd:session): session closed for user core May 27 03:22:13.612270 systemd[1]: sshd@29-10.0.0.98:22-10.0.0.1:47334.service: Deactivated successfully. May 27 03:22:13.614427 systemd[1]: session-30.scope: Deactivated successfully. May 27 03:22:13.616734 systemd-logind[1549]: Session 30 logged out. Waiting for processes to exit. May 27 03:22:13.617794 systemd-logind[1549]: Removed session 30. 
May 27 03:22:13.631257 containerd[1564]: time="2025-05-27T03:22:13.631205883Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 03:22:13.931014 containerd[1564]: time="2025-05-27T03:22:13.930961480Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:22:13.932128 containerd[1564]: time="2025-05-27T03:22:13.932090937Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:22:13.932223 containerd[1564]: time="2025-05-27T03:22:13.932140401Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 03:22:13.932427 kubelet[2684]: E0527 03:22:13.932362 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:22:13.932824 kubelet[2684]: E0527 03:22:13.932459 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference 
\"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:22:13.932824 kubelet[2684]: E0527 03:22:13.932624 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7sf2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-nkjnr_calico-system(a883863a-d79b-4a80-911d-ab857d7d891b): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:22:13.933871 kubelet[2684]: E0527 03:22:13.933820 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET 
request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:22:14.632948 kubelet[2684]: E0527 03:22:14.632628 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:22:18.622102 systemd[1]: Started sshd@30-10.0.0.98:22-10.0.0.1:44708.service - OpenSSH per-connection server daemon (10.0.0.1:44708). May 27 03:22:18.674057 sshd[5812]: Accepted publickey for core from 10.0.0.1 port 44708 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:22:18.675632 sshd-session[5812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:22:18.680502 systemd-logind[1549]: New session 31 of user core. 
May 27 03:22:18.687596 systemd[1]: Started session-31.scope - Session 31 of User core. May 27 03:22:18.801143 sshd[5814]: Connection closed by 10.0.0.1 port 44708 May 27 03:22:18.801500 sshd-session[5812]: pam_unix(sshd:session): session closed for user core May 27 03:22:18.805900 systemd[1]: sshd@30-10.0.0.98:22-10.0.0.1:44708.service: Deactivated successfully. May 27 03:22:18.807981 systemd[1]: session-31.scope: Deactivated successfully. May 27 03:22:18.808706 systemd-logind[1549]: Session 31 logged out. Waiting for processes to exit. May 27 03:22:18.809972 systemd-logind[1549]: Removed session 31. May 27 03:22:18.940065 containerd[1564]: time="2025-05-27T03:22:18.940015126Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"824d8f8ee9cb92a7a4bf7f2164638fc8923b6978c4f0c5d52ce21cafe10bd8bd\" pid:5839 exited_at:{seconds:1748316138 nanos:939798658}" May 27 03:22:23.826854 systemd[1]: Started sshd@31-10.0.0.98:22-10.0.0.1:54850.service - OpenSSH per-connection server daemon (10.0.0.1:54850). May 27 03:22:23.879464 sshd[5852]: Accepted publickey for core from 10.0.0.1 port 54850 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:22:23.881195 sshd-session[5852]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:22:23.885665 systemd-logind[1549]: New session 32 of user core. May 27 03:22:23.895563 systemd[1]: Started session-32.scope - Session 32 of User core. May 27 03:22:24.004904 sshd[5854]: Connection closed by 10.0.0.1 port 54850 May 27 03:22:24.005213 sshd-session[5852]: pam_unix(sshd:session): session closed for user core May 27 03:22:24.009344 systemd[1]: sshd@31-10.0.0.98:22-10.0.0.1:54850.service: Deactivated successfully. May 27 03:22:24.011281 systemd[1]: session-32.scope: Deactivated successfully. May 27 03:22:24.012061 systemd-logind[1549]: Session 32 logged out. Waiting for processes to exit. 
May 27 03:22:24.013337 systemd-logind[1549]: Removed session 32. May 27 03:22:25.809302 containerd[1564]: time="2025-05-27T03:22:25.809250511Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"2163a7707b0f8d1596a19e46496e6352eea680f365a4c67e39c3091512c9eca9\" pid:5878 exited_at:{seconds:1748316145 nanos:809041979}" May 27 03:22:28.630613 kubelet[2684]: E0527 03:22:28.630513 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:22:28.904081 containerd[1564]: time="2025-05-27T03:22:28.903892475Z" level=info msg="TaskExit event in podsandbox handler container_id:\"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\" id:\"9e773d4078db9aebc6ceda3d95a72118001954e28ca3768b8df055830233fdb5\" pid:5899 exited_at:{seconds:1748316148 nanos:903306102}" May 27 03:22:29.021595 systemd[1]: Started sshd@32-10.0.0.98:22-10.0.0.1:54866.service - OpenSSH per-connection server daemon (10.0.0.1:54866). May 27 03:22:29.062654 sshd[5912]: Accepted publickey for core from 10.0.0.1 port 54866 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:22:29.064139 sshd-session[5912]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:22:29.068491 systemd-logind[1549]: New session 33 of user core. 
May 27 03:22:29.078601 systemd[1]: Started session-33.scope - Session 33 of User core. May 27 03:22:29.189339 sshd[5914]: Connection closed by 10.0.0.1 port 54866 May 27 03:22:29.189594 sshd-session[5912]: pam_unix(sshd:session): session closed for user core May 27 03:22:29.194521 systemd[1]: sshd@32-10.0.0.98:22-10.0.0.1:54866.service: Deactivated successfully. May 27 03:22:29.196876 systemd[1]: session-33.scope: Deactivated successfully. May 27 03:22:29.197885 systemd-logind[1549]: Session 33 logged out. Waiting for processes to exit. May 27 03:22:29.199524 systemd-logind[1549]: Removed session 33. May 27 03:22:29.631722 kubelet[2684]: E0527 03:22:29.631414 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:22:34.206612 systemd[1]: Started sshd@33-10.0.0.98:22-10.0.0.1:55702.service - OpenSSH per-connection 
server daemon (10.0.0.1:55702). May 27 03:22:34.248195 sshd[5927]: Accepted publickey for core from 10.0.0.1 port 55702 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:22:34.249712 sshd-session[5927]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:22:34.254230 systemd-logind[1549]: New session 34 of user core. May 27 03:22:34.264582 systemd[1]: Started session-34.scope - Session 34 of User core. May 27 03:22:34.377728 sshd[5929]: Connection closed by 10.0.0.1 port 55702 May 27 03:22:34.378070 sshd-session[5927]: pam_unix(sshd:session): session closed for user core May 27 03:22:34.382595 systemd[1]: sshd@33-10.0.0.98:22-10.0.0.1:55702.service: Deactivated successfully. May 27 03:22:34.384840 systemd[1]: session-34.scope: Deactivated successfully. May 27 03:22:34.385689 systemd-logind[1549]: Session 34 logged out. Waiting for processes to exit. May 27 03:22:34.387055 systemd-logind[1549]: Removed session 34. May 27 03:22:39.391545 systemd[1]: Started sshd@34-10.0.0.98:22-10.0.0.1:55716.service - OpenSSH per-connection server daemon (10.0.0.1:55716). May 27 03:22:39.446084 sshd[5942]: Accepted publickey for core from 10.0.0.1 port 55716 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:22:39.448011 sshd-session[5942]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:22:39.452462 systemd-logind[1549]: New session 35 of user core. May 27 03:22:39.460572 systemd[1]: Started session-35.scope - Session 35 of User core. May 27 03:22:39.573318 sshd[5944]: Connection closed by 10.0.0.1 port 55716 May 27 03:22:39.573733 sshd-session[5942]: pam_unix(sshd:session): session closed for user core May 27 03:22:39.578346 systemd[1]: sshd@34-10.0.0.98:22-10.0.0.1:55716.service: Deactivated successfully. May 27 03:22:39.580607 systemd[1]: session-35.scope: Deactivated successfully. May 27 03:22:39.581474 systemd-logind[1549]: Session 35 logged out. 
Waiting for processes to exit. May 27 03:22:39.582908 systemd-logind[1549]: Removed session 35. May 27 03:22:40.630801 kubelet[2684]: E0527 03:22:40.630715 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:22:42.630754 kubelet[2684]: E0527 03:22:42.630551 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 
Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:22:44.591582 systemd[1]: Started sshd@35-10.0.0.98:22-10.0.0.1:60250.service - OpenSSH per-connection server daemon (10.0.0.1:60250). May 27 03:22:44.647006 sshd[5958]: Accepted publickey for core from 10.0.0.1 port 60250 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:22:44.648802 sshd-session[5958]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:22:44.653798 systemd-logind[1549]: New session 36 of user core. May 27 03:22:44.662580 systemd[1]: Started session-36.scope - Session 36 of User core. May 27 03:22:44.775162 sshd[5962]: Connection closed by 10.0.0.1 port 60250 May 27 03:22:44.775566 sshd-session[5958]: pam_unix(sshd:session): session closed for user core May 27 03:22:44.780231 systemd[1]: sshd@35-10.0.0.98:22-10.0.0.1:60250.service: Deactivated successfully. May 27 03:22:44.782599 systemd[1]: session-36.scope: Deactivated successfully. May 27 03:22:44.783653 systemd-logind[1549]: Session 36 logged out. Waiting for processes to exit. May 27 03:22:44.785106 systemd-logind[1549]: Removed session 36. May 27 03:22:48.937811 containerd[1564]: time="2025-05-27T03:22:48.937761139Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"16b4b373a0f2cc52d92161cba3476fc6fb29db5004d10b86a9f41121872099a3\" pid:5986 exited_at:{seconds:1748316168 nanos:937538519}" May 27 03:22:49.793509 systemd[1]: Started sshd@36-10.0.0.98:22-10.0.0.1:60256.service - OpenSSH per-connection server daemon (10.0.0.1:60256). 
May 27 03:22:49.850648 sshd[5999]: Accepted publickey for core from 10.0.0.1 port 60256 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:22:49.852045 sshd-session[5999]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:22:49.856676 systemd-logind[1549]: New session 37 of user core. May 27 03:22:49.868555 systemd[1]: Started session-37.scope - Session 37 of User core. May 27 03:22:49.976022 sshd[6001]: Connection closed by 10.0.0.1 port 60256 May 27 03:22:49.976340 sshd-session[5999]: pam_unix(sshd:session): session closed for user core May 27 03:22:49.981094 systemd[1]: sshd@36-10.0.0.98:22-10.0.0.1:60256.service: Deactivated successfully. May 27 03:22:49.983150 systemd[1]: session-37.scope: Deactivated successfully. May 27 03:22:49.984035 systemd-logind[1549]: Session 37 logged out. Waiting for processes to exit. May 27 03:22:49.985354 systemd-logind[1549]: Removed session 37. May 27 03:22:52.635955 kubelet[2684]: E0527 03:22:52.635767 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status 
from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:22:53.631083 kubelet[2684]: E0527 03:22:53.630970 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:22:54.995099 systemd[1]: Started sshd@37-10.0.0.98:22-10.0.0.1:50796.service - OpenSSH per-connection server daemon (10.0.0.1:50796). May 27 03:22:55.045911 sshd[6014]: Accepted publickey for core from 10.0.0.1 port 50796 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:22:55.047694 sshd-session[6014]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:22:55.052396 systemd-logind[1549]: New session 38 of user core. May 27 03:22:55.059579 systemd[1]: Started session-38.scope - Session 38 of User core. May 27 03:22:55.173693 sshd[6016]: Connection closed by 10.0.0.1 port 50796 May 27 03:22:55.174037 sshd-session[6014]: pam_unix(sshd:session): session closed for user core May 27 03:22:55.178141 systemd[1]: sshd@37-10.0.0.98:22-10.0.0.1:50796.service: Deactivated successfully. May 27 03:22:55.180221 systemd[1]: session-38.scope: Deactivated successfully. May 27 03:22:55.181198 systemd-logind[1549]: Session 38 logged out. Waiting for processes to exit. 
May 27 03:22:55.182531 systemd-logind[1549]: Removed session 38. May 27 03:22:58.915225 containerd[1564]: time="2025-05-27T03:22:58.915136271Z" level=info msg="TaskExit event in podsandbox handler container_id:\"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\" id:\"0cb360745d60fcf3eb2b66461ba891b693161dd3d8c8454cf3c0beb25447aa4a\" pid:6039 exited_at:{seconds:1748316178 nanos:914733069}" May 27 03:23:00.186769 systemd[1]: Started sshd@38-10.0.0.98:22-10.0.0.1:50798.service - OpenSSH per-connection server daemon (10.0.0.1:50798). May 27 03:23:00.245930 sshd[6053]: Accepted publickey for core from 10.0.0.1 port 50798 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:23:00.247652 sshd-session[6053]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:23:00.252401 systemd-logind[1549]: New session 39 of user core. May 27 03:23:00.266596 systemd[1]: Started session-39.scope - Session 39 of User core. May 27 03:23:00.381369 sshd[6055]: Connection closed by 10.0.0.1 port 50798 May 27 03:23:00.381699 sshd-session[6053]: pam_unix(sshd:session): session closed for user core May 27 03:23:00.386661 systemd[1]: sshd@38-10.0.0.98:22-10.0.0.1:50798.service: Deactivated successfully. May 27 03:23:00.388904 systemd[1]: session-39.scope: Deactivated successfully. May 27 03:23:00.389687 systemd-logind[1549]: Session 39 logged out. Waiting for processes to exit. May 27 03:23:00.391002 systemd-logind[1549]: Removed session 39. May 27 03:23:05.395632 systemd[1]: Started sshd@39-10.0.0.98:22-10.0.0.1:36944.service - OpenSSH per-connection server daemon (10.0.0.1:36944). May 27 03:23:05.450330 sshd[6069]: Accepted publickey for core from 10.0.0.1 port 36944 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:23:05.451854 sshd-session[6069]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:23:05.456384 systemd-logind[1549]: New session 40 of user core. 
May 27 03:23:05.470583 systemd[1]: Started session-40.scope - Session 40 of User core. May 27 03:23:05.587691 sshd[6071]: Connection closed by 10.0.0.1 port 36944 May 27 03:23:05.588028 sshd-session[6069]: pam_unix(sshd:session): session closed for user core May 27 03:23:05.592914 systemd[1]: sshd@39-10.0.0.98:22-10.0.0.1:36944.service: Deactivated successfully. May 27 03:23:05.596588 systemd[1]: session-40.scope: Deactivated successfully. May 27 03:23:05.597668 systemd-logind[1549]: Session 40 logged out. Waiting for processes to exit. May 27 03:23:05.599033 systemd-logind[1549]: Removed session 40. May 27 03:23:06.631396 kubelet[2684]: E0527 03:23:06.631273 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:23:08.630053 kubelet[2684]: E0527 03:23:08.629990 2684 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:23:10.605859 systemd[1]: Started sshd@40-10.0.0.98:22-10.0.0.1:36948.service - OpenSSH per-connection server daemon (10.0.0.1:36948). May 27 03:23:10.655928 sshd[6090]: Accepted publickey for core from 10.0.0.1 port 36948 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:23:10.657425 sshd-session[6090]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:23:10.662058 systemd-logind[1549]: New session 41 of user core. May 27 03:23:10.670592 systemd[1]: Started session-41.scope - Session 41 of User core. May 27 03:23:10.782457 sshd[6092]: Connection closed by 10.0.0.1 port 36948 May 27 03:23:10.782816 sshd-session[6090]: pam_unix(sshd:session): session closed for user core May 27 03:23:10.786177 systemd[1]: sshd@40-10.0.0.98:22-10.0.0.1:36948.service: Deactivated successfully. May 27 03:23:10.788428 systemd[1]: session-41.scope: Deactivated successfully. May 27 03:23:10.789320 systemd-logind[1549]: Session 41 logged out. Waiting for processes to exit. May 27 03:23:10.791570 systemd-logind[1549]: Removed session 41. May 27 03:23:15.799745 systemd[1]: Started sshd@41-10.0.0.98:22-10.0.0.1:45496.service - OpenSSH per-connection server daemon (10.0.0.1:45496). 
May 27 03:23:15.849328 sshd[6105]: Accepted publickey for core from 10.0.0.1 port 45496 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:23:15.850691 sshd-session[6105]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:23:15.855245 systemd-logind[1549]: New session 42 of user core. May 27 03:23:15.865567 systemd[1]: Started session-42.scope - Session 42 of User core. May 27 03:23:15.974527 sshd[6107]: Connection closed by 10.0.0.1 port 45496 May 27 03:23:15.974840 sshd-session[6105]: pam_unix(sshd:session): session closed for user core May 27 03:23:15.979214 systemd[1]: sshd@41-10.0.0.98:22-10.0.0.1:45496.service: Deactivated successfully. May 27 03:23:15.981558 systemd[1]: session-42.scope: Deactivated successfully. May 27 03:23:15.982317 systemd-logind[1549]: Session 42 logged out. Waiting for processes to exit. May 27 03:23:15.983668 systemd-logind[1549]: Removed session 42. May 27 03:23:18.957649 containerd[1564]: time="2025-05-27T03:23:18.957565706Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"c2d551de160f3d14a708562101a956205efe2bef0e98b7b4996a5e329366506b\" pid:6131 exited_at:{seconds:1748316198 nanos:957189168}" May 27 03:23:19.631393 kubelet[2684]: E0527 03:23:19.631342 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:23:20.630823 kubelet[2684]: E0527 03:23:20.630759 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:23:20.991333 systemd[1]: Started sshd@42-10.0.0.98:22-10.0.0.1:45498.service - OpenSSH per-connection server daemon (10.0.0.1:45498). May 27 03:23:21.050087 sshd[6145]: Accepted publickey for core from 10.0.0.1 port 45498 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:23:21.051677 sshd-session[6145]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:23:21.056387 systemd-logind[1549]: New session 43 of user core. May 27 03:23:21.064596 systemd[1]: Started session-43.scope - Session 43 of User core. 
May 27 03:23:21.177950 sshd[6149]: Connection closed by 10.0.0.1 port 45498 May 27 03:23:21.178380 sshd-session[6145]: pam_unix(sshd:session): session closed for user core May 27 03:23:21.183326 systemd[1]: sshd@42-10.0.0.98:22-10.0.0.1:45498.service: Deactivated successfully. May 27 03:23:21.185737 systemd[1]: session-43.scope: Deactivated successfully. May 27 03:23:21.186824 systemd-logind[1549]: Session 43 logged out. Waiting for processes to exit. May 27 03:23:21.188254 systemd-logind[1549]: Removed session 43. May 27 03:23:25.805394 containerd[1564]: time="2025-05-27T03:23:25.805348974Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"82f7517ee3236a94a52588d210c32d7ea3b49073d140f061f92e4f584478e358\" pid:6174 exited_at:{seconds:1748316205 nanos:805167178}" May 27 03:23:26.194150 systemd[1]: Started sshd@43-10.0.0.98:22-10.0.0.1:58176.service - OpenSSH per-connection server daemon (10.0.0.1:58176). May 27 03:23:26.240818 sshd[6185]: Accepted publickey for core from 10.0.0.1 port 58176 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:23:26.242377 sshd-session[6185]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:23:26.246769 systemd-logind[1549]: New session 44 of user core. May 27 03:23:26.257611 systemd[1]: Started session-44.scope - Session 44 of User core. May 27 03:23:26.374222 sshd[6187]: Connection closed by 10.0.0.1 port 58176 May 27 03:23:26.374576 sshd-session[6185]: pam_unix(sshd:session): session closed for user core May 27 03:23:26.378929 systemd[1]: sshd@43-10.0.0.98:22-10.0.0.1:58176.service: Deactivated successfully. May 27 03:23:26.381087 systemd[1]: session-44.scope: Deactivated successfully. May 27 03:23:26.381925 systemd-logind[1549]: Session 44 logged out. Waiting for processes to exit. May 27 03:23:26.383163 systemd-logind[1549]: Removed session 44. 
May 27 03:23:28.908514 containerd[1564]: time="2025-05-27T03:23:28.908414656Z" level=info msg="TaskExit event in podsandbox handler container_id:\"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\" id:\"705f60e40c45b9b3d7a267161c2814f79c37294657511a80c3ef8244cd2bdcf1\" pid:6211 exited_at:{seconds:1748316208 nanos:908056244}" May 27 03:23:31.394902 systemd[1]: Started sshd@44-10.0.0.98:22-10.0.0.1:58192.service - OpenSSH per-connection server daemon (10.0.0.1:58192). May 27 03:23:31.452312 sshd[6224]: Accepted publickey for core from 10.0.0.1 port 58192 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:23:31.453871 sshd-session[6224]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:23:31.458355 systemd-logind[1549]: New session 45 of user core. May 27 03:23:31.466599 systemd[1]: Started session-45.scope - Session 45 of User core. May 27 03:23:31.578415 sshd[6226]: Connection closed by 10.0.0.1 port 58192 May 27 03:23:31.578805 sshd-session[6224]: pam_unix(sshd:session): session closed for user core May 27 03:23:31.583602 systemd[1]: sshd@44-10.0.0.98:22-10.0.0.1:58192.service: Deactivated successfully. May 27 03:23:31.585658 systemd[1]: session-45.scope: Deactivated successfully. May 27 03:23:31.586595 systemd-logind[1549]: Session 45 logged out. Waiting for processes to exit. May 27 03:23:31.587972 systemd-logind[1549]: Removed session 45. 
May 27 03:23:31.631192 containerd[1564]: time="2025-05-27T03:23:31.631135244Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 03:23:31.883392 containerd[1564]: time="2025-05-27T03:23:31.883321371Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:23:31.884650 containerd[1564]: time="2025-05-27T03:23:31.884573633Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:23:31.884650 containerd[1564]: time="2025-05-27T03:23:31.884617498Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 03:23:31.884932 kubelet[2684]: E0527 03:23:31.884859 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:23:31.885411 kubelet[2684]: E0527 03:23:31.884943 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:23:31.885411 kubelet[2684]: E0527 03:23:31.885102 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:767452f2f0924610a3259c1e7259b93b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kdkgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-556d57579d-8sdgb_calico-system(2c1ae830-ec2d-4322-b9d4-cc52915b6648): ErrImagePull: failed to pull 
and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:23:31.887193 containerd[1564]: time="2025-05-27T03:23:31.887147563Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 03:23:32.138515 containerd[1564]: time="2025-05-27T03:23:32.138294052Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:23:32.139637 containerd[1564]: time="2025-05-27T03:23:32.139574509Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:23:32.139736 containerd[1564]: time="2025-05-27T03:23:32.139636387Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 03:23:32.139898 kubelet[2684]: E0527 03:23:32.139836 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: 
unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:23:32.139998 kubelet[2684]: E0527 03:23:32.139900 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:23:32.140060 kubelet[2684]: E0527 03:23:32.140016 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kdkgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Liv
enessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-556d57579d-8sdgb_calico-system(2c1ae830-ec2d-4322-b9d4-cc52915b6648): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:23:32.141240 kubelet[2684]: E0527 03:23:32.141182 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: 
failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:23:34.633197 containerd[1564]: time="2025-05-27T03:23:34.633133787Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 03:23:34.865115 containerd[1564]: time="2025-05-27T03:23:34.865057298Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:23:34.866104 containerd[1564]: time="2025-05-27T03:23:34.866071719Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:23:34.866187 containerd[1564]: time="2025-05-27T03:23:34.866139427Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 03:23:34.866344 kubelet[2684]: E0527 03:23:34.866283 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" 
image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:23:34.866344 kubelet[2684]: E0527 03:23:34.866334 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:23:34.866955 kubelet[2684]: E0527 03:23:34.866501 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7sf2l,ReadOnly:t
rue,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-nkjnr_calico-system(a883863a-d79b-4a80-911d-ab857d7d891b): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:23:34.867772 kubelet[2684]: E0527 03:23:34.867702 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:23:36.599646 systemd[1]: Started sshd@45-10.0.0.98:22-10.0.0.1:41600.service - OpenSSH per-connection server daemon (10.0.0.1:41600). May 27 03:23:36.649108 sshd[6246]: Accepted publickey for core from 10.0.0.1 port 41600 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:23:36.650872 sshd-session[6246]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:23:36.655242 systemd-logind[1549]: New session 46 of user core. May 27 03:23:36.660570 systemd[1]: Started session-46.scope - Session 46 of User core. May 27 03:23:36.770348 sshd[6248]: Connection closed by 10.0.0.1 port 41600 May 27 03:23:36.770701 sshd-session[6246]: pam_unix(sshd:session): session closed for user core May 27 03:23:36.775496 systemd[1]: sshd@45-10.0.0.98:22-10.0.0.1:41600.service: Deactivated successfully. May 27 03:23:36.777798 systemd[1]: session-46.scope: Deactivated successfully. May 27 03:23:36.778662 systemd-logind[1549]: Session 46 logged out. Waiting for processes to exit. May 27 03:23:36.779986 systemd-logind[1549]: Removed session 46. May 27 03:23:41.787464 systemd[1]: Started sshd@46-10.0.0.98:22-10.0.0.1:41602.service - OpenSSH per-connection server daemon (10.0.0.1:41602). 
May 27 03:23:41.839145 sshd[6274]: Accepted publickey for core from 10.0.0.1 port 41602 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:23:41.840808 sshd-session[6274]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:23:41.845160 systemd-logind[1549]: New session 47 of user core. May 27 03:23:41.853573 systemd[1]: Started session-47.scope - Session 47 of User core. May 27 03:23:41.966955 sshd[6276]: Connection closed by 10.0.0.1 port 41602 May 27 03:23:41.967294 sshd-session[6274]: pam_unix(sshd:session): session closed for user core May 27 03:23:41.971569 systemd[1]: sshd@46-10.0.0.98:22-10.0.0.1:41602.service: Deactivated successfully. May 27 03:23:41.974070 systemd[1]: session-47.scope: Deactivated successfully. May 27 03:23:41.974889 systemd-logind[1549]: Session 47 logged out. Waiting for processes to exit. May 27 03:23:41.976162 systemd-logind[1549]: Removed session 47. May 27 03:23:46.984801 systemd[1]: Started sshd@47-10.0.0.98:22-10.0.0.1:42836.service - OpenSSH per-connection server daemon (10.0.0.1:42836). May 27 03:23:47.041524 sshd[6291]: Accepted publickey for core from 10.0.0.1 port 42836 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:23:47.043534 sshd-session[6291]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:23:47.048294 systemd-logind[1549]: New session 48 of user core. May 27 03:23:47.062623 systemd[1]: Started session-48.scope - Session 48 of User core. May 27 03:23:47.182716 sshd[6293]: Connection closed by 10.0.0.1 port 42836 May 27 03:23:47.183129 sshd-session[6291]: pam_unix(sshd:session): session closed for user core May 27 03:23:47.192677 systemd[1]: sshd@47-10.0.0.98:22-10.0.0.1:42836.service: Deactivated successfully. May 27 03:23:47.195010 systemd[1]: session-48.scope: Deactivated successfully. May 27 03:23:47.195854 systemd-logind[1549]: Session 48 logged out. Waiting for processes to exit. 
May 27 03:23:47.200099 systemd[1]: Started sshd@48-10.0.0.98:22-10.0.0.1:42852.service - OpenSSH per-connection server daemon (10.0.0.1:42852). May 27 03:23:47.201123 systemd-logind[1549]: Removed session 48. May 27 03:23:47.252010 sshd[6306]: Accepted publickey for core from 10.0.0.1 port 42852 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:23:47.253392 sshd-session[6306]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:23:47.258512 systemd-logind[1549]: New session 49 of user core. May 27 03:23:47.273655 systemd[1]: Started session-49.scope - Session 49 of User core. May 27 03:23:47.592423 sshd[6308]: Connection closed by 10.0.0.1 port 42852 May 27 03:23:47.593216 sshd-session[6306]: pam_unix(sshd:session): session closed for user core May 27 03:23:47.611709 systemd[1]: sshd@48-10.0.0.98:22-10.0.0.1:42852.service: Deactivated successfully. May 27 03:23:47.613968 systemd[1]: session-49.scope: Deactivated successfully. May 27 03:23:47.614990 systemd-logind[1549]: Session 49 logged out. Waiting for processes to exit. May 27 03:23:47.618719 systemd[1]: Started sshd@49-10.0.0.98:22-10.0.0.1:42854.service - OpenSSH per-connection server daemon (10.0.0.1:42854). May 27 03:23:47.619957 systemd-logind[1549]: Removed session 49. 
May 27 03:23:47.631577 kubelet[2684]: E0527 03:23:47.631511 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:23:47.632357 kubelet[2684]: E0527 03:23:47.632055 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" 
podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:23:47.679268 sshd[6320]: Accepted publickey for core from 10.0.0.1 port 42854 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:23:47.681286 sshd-session[6320]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:23:47.686334 systemd-logind[1549]: New session 50 of user core. May 27 03:23:47.696661 systemd[1]: Started session-50.scope - Session 50 of User core. May 27 03:23:48.610623 sshd[6322]: Connection closed by 10.0.0.1 port 42854 May 27 03:23:48.611075 sshd-session[6320]: pam_unix(sshd:session): session closed for user core May 27 03:23:48.622044 systemd[1]: sshd@49-10.0.0.98:22-10.0.0.1:42854.service: Deactivated successfully. May 27 03:23:48.626091 systemd[1]: session-50.scope: Deactivated successfully. May 27 03:23:48.627304 systemd-logind[1549]: Session 50 logged out. Waiting for processes to exit. May 27 03:23:48.631069 systemd-logind[1549]: Removed session 50. May 27 03:23:48.633375 systemd[1]: Started sshd@50-10.0.0.98:22-10.0.0.1:42864.service - OpenSSH per-connection server daemon (10.0.0.1:42864). May 27 03:23:48.695124 sshd[6341]: Accepted publickey for core from 10.0.0.1 port 42864 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:23:48.696995 sshd-session[6341]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:23:48.701732 systemd-logind[1549]: New session 51 of user core. May 27 03:23:48.711575 systemd[1]: Started session-51.scope - Session 51 of User core. 
May 27 03:23:48.951643 containerd[1564]: time="2025-05-27T03:23:48.951573331Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"87c39cadd37f1788389ee63277f2edc564efba3b10d51aa3825722a18aa9b3d0\" pid:6361 exited_at:{seconds:1748316228 nanos:951223807}" May 27 03:23:48.993813 sshd[6343]: Connection closed by 10.0.0.1 port 42864 May 27 03:23:48.994153 sshd-session[6341]: pam_unix(sshd:session): session closed for user core May 27 03:23:49.003512 systemd[1]: sshd@50-10.0.0.98:22-10.0.0.1:42864.service: Deactivated successfully. May 27 03:23:49.005714 systemd[1]: session-51.scope: Deactivated successfully. May 27 03:23:49.006918 systemd-logind[1549]: Session 51 logged out. Waiting for processes to exit. May 27 03:23:49.011094 systemd[1]: Started sshd@51-10.0.0.98:22-10.0.0.1:42872.service - OpenSSH per-connection server daemon (10.0.0.1:42872). May 27 03:23:49.011931 systemd-logind[1549]: Removed session 51. May 27 03:23:49.067249 sshd[6374]: Accepted publickey for core from 10.0.0.1 port 42872 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:23:49.068847 sshd-session[6374]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:23:49.073526 systemd-logind[1549]: New session 52 of user core. May 27 03:23:49.083646 systemd[1]: Started session-52.scope - Session 52 of User core. May 27 03:23:49.198762 sshd[6376]: Connection closed by 10.0.0.1 port 42872 May 27 03:23:49.199058 sshd-session[6374]: pam_unix(sshd:session): session closed for user core May 27 03:23:49.203897 systemd[1]: sshd@51-10.0.0.98:22-10.0.0.1:42872.service: Deactivated successfully. May 27 03:23:49.206226 systemd[1]: session-52.scope: Deactivated successfully. May 27 03:23:49.207041 systemd-logind[1549]: Session 52 logged out. Waiting for processes to exit. May 27 03:23:49.208341 systemd-logind[1549]: Removed session 52. 
May 27 03:23:54.223540 systemd[1]: Started sshd@52-10.0.0.98:22-10.0.0.1:33482.service - OpenSSH per-connection server daemon (10.0.0.1:33482). May 27 03:23:54.270131 sshd[6391]: Accepted publickey for core from 10.0.0.1 port 33482 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:23:54.271663 sshd-session[6391]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:23:54.276552 systemd-logind[1549]: New session 53 of user core. May 27 03:23:54.286596 systemd[1]: Started session-53.scope - Session 53 of User core. May 27 03:23:54.400175 sshd[6393]: Connection closed by 10.0.0.1 port 33482 May 27 03:23:54.400530 sshd-session[6391]: pam_unix(sshd:session): session closed for user core May 27 03:23:54.405475 systemd[1]: sshd@52-10.0.0.98:22-10.0.0.1:33482.service: Deactivated successfully. May 27 03:23:54.408167 systemd[1]: session-53.scope: Deactivated successfully. May 27 03:23:54.409194 systemd-logind[1549]: Session 53 logged out. Waiting for processes to exit. May 27 03:23:54.410628 systemd-logind[1549]: Removed session 53. May 27 03:23:58.924470 containerd[1564]: time="2025-05-27T03:23:58.924365188Z" level=info msg="TaskExit event in podsandbox handler container_id:\"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\" id:\"54f34828cc14451128f7ae6149aead04379414531a3aa77d038e53855caf0cbb\" pid:6417 exited_at:{seconds:1748316238 nanos:923350673}" May 27 03:23:59.423090 systemd[1]: Started sshd@53-10.0.0.98:22-10.0.0.1:33494.service - OpenSSH per-connection server daemon (10.0.0.1:33494). May 27 03:23:59.479310 sshd[6430]: Accepted publickey for core from 10.0.0.1 port 33494 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:23:59.480969 sshd-session[6430]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:23:59.486099 systemd-logind[1549]: New session 54 of user core. 
May 27 03:23:59.497626 systemd[1]: Started session-54.scope - Session 54 of User core.
May 27 03:23:59.608574 sshd[6432]: Connection closed by 10.0.0.1 port 33494
May 27 03:23:59.608927 sshd-session[6430]: pam_unix(sshd:session): session closed for user core
May 27 03:23:59.613789 systemd[1]: sshd@53-10.0.0.98:22-10.0.0.1:33494.service: Deactivated successfully.
May 27 03:23:59.616060 systemd[1]: session-54.scope: Deactivated successfully.
May 27 03:23:59.617160 systemd-logind[1549]: Session 54 logged out. Waiting for processes to exit.
May 27 03:23:59.618517 systemd-logind[1549]: Removed session 54.
May 27 03:23:59.632145 kubelet[2684]: E0527 03:23:59.632043 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648"
May 27 03:24:00.630918 kubelet[2684]: E0527 03:24:00.630858 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b"
May 27 03:24:04.622208 systemd[1]: Started sshd@54-10.0.0.98:22-10.0.0.1:58660.service - OpenSSH per-connection server daemon (10.0.0.1:58660).
May 27 03:24:04.675934 sshd[6446]: Accepted publickey for core from 10.0.0.1 port 58660 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:24:04.677760 sshd-session[6446]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:24:04.681964 systemd-logind[1549]: New session 55 of user core.
May 27 03:24:04.692549 systemd[1]: Started session-55.scope - Session 55 of User core.
May 27 03:24:04.801326 sshd[6448]: Connection closed by 10.0.0.1 port 58660
May 27 03:24:04.801663 sshd-session[6446]: pam_unix(sshd:session): session closed for user core
May 27 03:24:04.805892 systemd[1]: sshd@54-10.0.0.98:22-10.0.0.1:58660.service: Deactivated successfully.
May 27 03:24:04.808044 systemd[1]: session-55.scope: Deactivated successfully.
May 27 03:24:04.808778 systemd-logind[1549]: Session 55 logged out. Waiting for processes to exit.
May 27 03:24:04.810076 systemd-logind[1549]: Removed session 55.
May 27 03:24:09.814782 systemd[1]: Started sshd@55-10.0.0.98:22-10.0.0.1:58666.service - OpenSSH per-connection server daemon (10.0.0.1:58666).
May 27 03:24:09.872223 sshd[6462]: Accepted publickey for core from 10.0.0.1 port 58666 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:24:09.874014 sshd-session[6462]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:24:09.878596 systemd-logind[1549]: New session 56 of user core.
May 27 03:24:09.887561 systemd[1]: Started session-56.scope - Session 56 of User core.
May 27 03:24:10.002334 sshd[6464]: Connection closed by 10.0.0.1 port 58666
May 27 03:24:10.002694 sshd-session[6462]: pam_unix(sshd:session): session closed for user core
May 27 03:24:10.006740 systemd[1]: sshd@55-10.0.0.98:22-10.0.0.1:58666.service: Deactivated successfully.
May 27 03:24:10.009529 systemd[1]: session-56.scope: Deactivated successfully.
May 27 03:24:10.010421 systemd-logind[1549]: Session 56 logged out. Waiting for processes to exit.
May 27 03:24:10.012935 systemd-logind[1549]: Removed session 56.
May 27 03:24:12.631725 kubelet[2684]: E0527 03:24:12.631657 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648"
May 27 03:24:15.015629 systemd[1]: Started sshd@56-10.0.0.98:22-10.0.0.1:44486.service - OpenSSH per-connection server daemon (10.0.0.1:44486).
May 27 03:24:15.071738 sshd[6477]: Accepted publickey for core from 10.0.0.1 port 44486 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:24:15.073119 sshd-session[6477]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:24:15.077558 systemd-logind[1549]: New session 57 of user core.
May 27 03:24:15.084580 systemd[1]: Started session-57.scope - Session 57 of User core.
May 27 03:24:15.197036 sshd[6479]: Connection closed by 10.0.0.1 port 44486
May 27 03:24:15.197337 sshd-session[6477]: pam_unix(sshd:session): session closed for user core
May 27 03:24:15.201289 systemd[1]: sshd@56-10.0.0.98:22-10.0.0.1:44486.service: Deactivated successfully.
May 27 03:24:15.203412 systemd[1]: session-57.scope: Deactivated successfully.
May 27 03:24:15.204419 systemd-logind[1549]: Session 57 logged out. Waiting for processes to exit.
May 27 03:24:15.206097 systemd-logind[1549]: Removed session 57.
May 27 03:24:15.630052 kubelet[2684]: E0527 03:24:15.629997 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b"
May 27 03:24:18.940918 containerd[1564]: time="2025-05-27T03:24:18.940869307Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"7f882af448d4af5bf414c2c62c3554584a3836ef6344b041a2a1699795201c8e\" pid:6502 exited_at:{seconds:1748316258 nanos:940607952}"
May 27 03:24:20.209353 systemd[1]: Started sshd@57-10.0.0.98:22-10.0.0.1:44498.service - OpenSSH per-connection server daemon (10.0.0.1:44498).
May 27 03:24:20.252059 sshd[6515]: Accepted publickey for core from 10.0.0.1 port 44498 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:24:20.253461 sshd-session[6515]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:24:20.257750 systemd-logind[1549]: New session 58 of user core.
May 27 03:24:20.269570 systemd[1]: Started session-58.scope - Session 58 of User core.
May 27 03:24:20.383973 sshd[6517]: Connection closed by 10.0.0.1 port 44498
May 27 03:24:20.384284 sshd-session[6515]: pam_unix(sshd:session): session closed for user core
May 27 03:24:20.389125 systemd[1]: sshd@57-10.0.0.98:22-10.0.0.1:44498.service: Deactivated successfully.
May 27 03:24:20.391422 systemd[1]: session-58.scope: Deactivated successfully.
May 27 03:24:20.392349 systemd-logind[1549]: Session 58 logged out. Waiting for processes to exit.
May 27 03:24:20.394053 systemd-logind[1549]: Removed session 58.
May 27 03:24:23.631510 kubelet[2684]: E0527 03:24:23.631421 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648"
May 27 03:24:25.397303 systemd[1]: Started sshd@58-10.0.0.98:22-10.0.0.1:41776.service - OpenSSH per-connection server daemon (10.0.0.1:41776).
May 27 03:24:25.456720 sshd[6530]: Accepted publickey for core from 10.0.0.1 port 41776 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:24:25.458704 sshd-session[6530]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:24:25.463966 systemd-logind[1549]: New session 59 of user core.
May 27 03:24:25.468595 systemd[1]: Started session-59.scope - Session 59 of User core.
May 27 03:24:25.581130 sshd[6532]: Connection closed by 10.0.0.1 port 41776
May 27 03:24:25.581450 sshd-session[6530]: pam_unix(sshd:session): session closed for user core
May 27 03:24:25.586319 systemd[1]: sshd@58-10.0.0.98:22-10.0.0.1:41776.service: Deactivated successfully.
May 27 03:24:25.588699 systemd[1]: session-59.scope: Deactivated successfully.
May 27 03:24:25.589743 systemd-logind[1549]: Session 59 logged out. Waiting for processes to exit.
May 27 03:24:25.591016 systemd-logind[1549]: Removed session 59.
May 27 03:24:25.807853 containerd[1564]: time="2025-05-27T03:24:25.807790299Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"d50ce796dc90c04d0030be7d3bb43c531889bda1e7dbb8a92e9072e469dcc923\" pid:6557 exited_at:{seconds:1748316265 nanos:807513234}"
May 27 03:24:28.915092 containerd[1564]: time="2025-05-27T03:24:28.915028472Z" level=info msg="TaskExit event in podsandbox handler container_id:\"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\" id:\"ba679fd965dde4a0654afb0853a59a3f4bf99127dedea81287ad618c70bae9a6\" pid:6579 exited_at:{seconds:1748316268 nanos:913888655}"
May 27 03:24:30.596699 systemd[1]: Started sshd@59-10.0.0.98:22-10.0.0.1:41790.service - OpenSSH per-connection server daemon (10.0.0.1:41790).
May 27 03:24:30.630568 kubelet[2684]: E0527 03:24:30.630500 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b"
May 27 03:24:30.637386 sshd[6592]: Accepted publickey for core from 10.0.0.1 port 41790 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:24:30.639085 sshd-session[6592]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:24:30.643814 systemd-logind[1549]: New session 60 of user core.
May 27 03:24:30.654585 systemd[1]: Started session-60.scope - Session 60 of User core.
May 27 03:24:30.759204 sshd[6594]: Connection closed by 10.0.0.1 port 41790
May 27 03:24:30.759579 sshd-session[6592]: pam_unix(sshd:session): session closed for user core
May 27 03:24:30.763802 systemd[1]: sshd@59-10.0.0.98:22-10.0.0.1:41790.service: Deactivated successfully.
May 27 03:24:30.766088 systemd[1]: session-60.scope: Deactivated successfully.
May 27 03:24:30.767092 systemd-logind[1549]: Session 60 logged out. Waiting for processes to exit.
May 27 03:24:30.768427 systemd-logind[1549]: Removed session 60.
May 27 03:24:34.632042 kubelet[2684]: E0527 03:24:34.631901 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648"
May 27 03:24:35.773126 systemd[1]: Started sshd@60-10.0.0.98:22-10.0.0.1:52756.service - OpenSSH per-connection server daemon (10.0.0.1:52756).
May 27 03:24:35.829994 sshd[6609]: Accepted publickey for core from 10.0.0.1 port 52756 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:24:35.832244 sshd-session[6609]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:24:35.837060 systemd-logind[1549]: New session 61 of user core.
May 27 03:24:35.845568 systemd[1]: Started session-61.scope - Session 61 of User core.
May 27 03:24:35.957490 sshd[6611]: Connection closed by 10.0.0.1 port 52756
May 27 03:24:35.957949 sshd-session[6609]: pam_unix(sshd:session): session closed for user core
May 27 03:24:35.963087 systemd[1]: sshd@60-10.0.0.98:22-10.0.0.1:52756.service: Deactivated successfully.
May 27 03:24:35.965402 systemd[1]: session-61.scope: Deactivated successfully.
May 27 03:24:35.966538 systemd-logind[1549]: Session 61 logged out. Waiting for processes to exit.
May 27 03:24:35.968038 systemd-logind[1549]: Removed session 61.
May 27 03:24:37.867241 containerd[1564]: time="2025-05-27T03:24:37.850564063Z" level=warning msg="container event discarded" container=8d1a31899ddfdfd13ac63b9ab79d136a400fac788cc0783e3e26f7fb0b1cab4e type=CONTAINER_CREATED_EVENT
May 27 03:24:37.895490 containerd[1564]: time="2025-05-27T03:24:37.895429489Z" level=warning msg="container event discarded" container=8d1a31899ddfdfd13ac63b9ab79d136a400fac788cc0783e3e26f7fb0b1cab4e type=CONTAINER_STARTED_EVENT
May 27 03:24:37.895490 containerd[1564]: time="2025-05-27T03:24:37.895481017Z" level=warning msg="container event discarded" container=6026a346c2251385f4fa5b38cd2a9d4f83ff7353bd3905635d14147633a1db34 type=CONTAINER_CREATED_EVENT
May 27 03:24:37.895579 containerd[1564]: time="2025-05-27T03:24:37.895491797Z" level=warning msg="container event discarded" container=6026a346c2251385f4fa5b38cd2a9d4f83ff7353bd3905635d14147633a1db34 type=CONTAINER_STARTED_EVENT
May 27 03:24:37.942853 containerd[1564]: time="2025-05-27T03:24:37.942785586Z" level=warning msg="container event discarded" container=102995f3bf406f3aeebfd021138906991bbc3306783619e7808e1dfb5f687989 type=CONTAINER_CREATED_EVENT
May 27 03:24:37.942853 containerd[1564]: time="2025-05-27T03:24:37.942819321Z" level=warning msg="container event discarded" container=102995f3bf406f3aeebfd021138906991bbc3306783619e7808e1dfb5f687989 type=CONTAINER_STARTED_EVENT
May 27 03:24:38.195346 containerd[1564]: time="2025-05-27T03:24:38.195266908Z" level=warning msg="container event discarded" container=aa94b2b037724586b217c58fb6f04acb1c87e93dc95153564fcdb985c2860618 type=CONTAINER_CREATED_EVENT
May 27 03:24:38.261661 containerd[1564]: time="2025-05-27T03:24:38.261563262Z" level=warning msg="container event discarded" container=eec8832864994fc4e72a6c0ca9ed404bda01593bb861e203a378eea0e8ee4b63 type=CONTAINER_CREATED_EVENT
May 27 03:24:38.318882 containerd[1564]: time="2025-05-27T03:24:38.318817724Z" level=warning msg="container event discarded" container=aa94b2b037724586b217c58fb6f04acb1c87e93dc95153564fcdb985c2860618 type=CONTAINER_STARTED_EVENT
May 27 03:24:38.318882 containerd[1564]: time="2025-05-27T03:24:38.318854684Z" level=warning msg="container event discarded" container=2a5895bcc8444dd6ba3805c73ea262939ad530d09b40c068eefd1e9d7d18086e type=CONTAINER_CREATED_EVENT
May 27 03:24:38.387908 containerd[1564]: time="2025-05-27T03:24:38.387835294Z" level=warning msg="container event discarded" container=eec8832864994fc4e72a6c0ca9ed404bda01593bb861e203a378eea0e8ee4b63 type=CONTAINER_STARTED_EVENT
May 27 03:24:38.460196 containerd[1564]: time="2025-05-27T03:24:38.460066582Z" level=warning msg="container event discarded" container=2a5895bcc8444dd6ba3805c73ea262939ad530d09b40c068eefd1e9d7d18086e type=CONTAINER_STARTED_EVENT
May 27 03:24:40.970715 systemd[1]: Started sshd@61-10.0.0.98:22-10.0.0.1:52760.service - OpenSSH per-connection server daemon (10.0.0.1:52760).
May 27 03:24:41.021247 sshd[6627]: Accepted publickey for core from 10.0.0.1 port 52760 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:24:41.022995 sshd-session[6627]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:24:41.027867 systemd-logind[1549]: New session 62 of user core.
May 27 03:24:41.037670 systemd[1]: Started session-62.scope - Session 62 of User core.
May 27 03:24:41.147196 sshd[6629]: Connection closed by 10.0.0.1 port 52760
May 27 03:24:41.147588 sshd-session[6627]: pam_unix(sshd:session): session closed for user core
May 27 03:24:41.152696 systemd[1]: sshd@61-10.0.0.98:22-10.0.0.1:52760.service: Deactivated successfully.
May 27 03:24:41.154975 systemd[1]: session-62.scope: Deactivated successfully.
May 27 03:24:41.156032 systemd-logind[1549]: Session 62 logged out. Waiting for processes to exit.
May 27 03:24:41.157572 systemd-logind[1549]: Removed session 62.
May 27 03:24:42.630842 kubelet[2684]: E0527 03:24:42.630789 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b"
May 27 03:24:46.163818 systemd[1]: Started sshd@62-10.0.0.98:22-10.0.0.1:56994.service - OpenSSH per-connection server daemon (10.0.0.1:56994).
May 27 03:24:46.221494 sshd[6645]: Accepted publickey for core from 10.0.0.1 port 56994 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:24:46.223160 sshd-session[6645]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:24:46.229133 systemd-logind[1549]: New session 63 of user core.
May 27 03:24:46.247654 systemd[1]: Started session-63.scope - Session 63 of User core.
May 27 03:24:46.362592 sshd[6647]: Connection closed by 10.0.0.1 port 56994
May 27 03:24:46.362947 sshd-session[6645]: pam_unix(sshd:session): session closed for user core
May 27 03:24:46.367809 systemd[1]: sshd@62-10.0.0.98:22-10.0.0.1:56994.service: Deactivated successfully.
May 27 03:24:46.370761 systemd[1]: session-63.scope: Deactivated successfully.
May 27 03:24:46.371546 systemd-logind[1549]: Session 63 logged out. Waiting for processes to exit.
May 27 03:24:46.373034 systemd-logind[1549]: Removed session 63.
May 27 03:24:48.633396 kubelet[2684]: E0527 03:24:48.633327 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648"
May 27 03:24:48.939458 containerd[1564]: time="2025-05-27T03:24:48.939339876Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"94ff7b196f5d80f16937c22e406fa29f03943264234511c7169e24fd8224c28b\" pid:6672 exited_at:{seconds:1748316288 nanos:938979043}"
May 27 03:24:49.513929 containerd[1564]: time="2025-05-27T03:24:49.513801662Z" level=warning msg="container event discarded" container=669ef59fe45261606ef1f89b6a559e6255fba4ada0ca55826ceee8cb896b5d12 type=CONTAINER_CREATED_EVENT
May 27 03:24:49.513929 containerd[1564]: time="2025-05-27T03:24:49.513907493Z" level=warning msg="container event discarded" container=669ef59fe45261606ef1f89b6a559e6255fba4ada0ca55826ceee8cb896b5d12 type=CONTAINER_STARTED_EVENT
May 27 03:24:49.538256 containerd[1564]: time="2025-05-27T03:24:49.538163042Z" level=warning msg="container event discarded" container=b7a9b500936902a9cb7d54009f070f10e379bbacd0da8c4deacddd7664034b2a type=CONTAINER_CREATED_EVENT
May 27 03:24:49.610390 containerd[1564]: time="2025-05-27T03:24:49.610307149Z" level=warning msg="container event discarded" container=b7a9b500936902a9cb7d54009f070f10e379bbacd0da8c4deacddd7664034b2a type=CONTAINER_STARTED_EVENT
May 27 03:24:49.750467 containerd[1564]: time="2025-05-27T03:24:49.750376872Z" level=warning msg="container event discarded" container=4a9eba9de521bbc0128a97f0cc62cd271604b2d2f0611a3ac52ab1dad83af9fe type=CONTAINER_CREATED_EVENT
May 27 03:24:49.750467 containerd[1564]: time="2025-05-27T03:24:49.750456673Z" level=warning msg="container event discarded" container=4a9eba9de521bbc0128a97f0cc62cd271604b2d2f0611a3ac52ab1dad83af9fe type=CONTAINER_STARTED_EVENT
May 27 03:24:51.380369 systemd[1]: Started sshd@63-10.0.0.98:22-10.0.0.1:57006.service - OpenSSH per-connection server daemon (10.0.0.1:57006).
May 27 03:24:51.431712 sshd[6685]: Accepted publickey for core from 10.0.0.1 port 57006 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:24:51.433677 sshd-session[6685]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:24:51.438624 systemd-logind[1549]: New session 64 of user core.
May 27 03:24:51.446589 systemd[1]: Started session-64.scope - Session 64 of User core.
May 27 03:24:51.592711 sshd[6687]: Connection closed by 10.0.0.1 port 57006
May 27 03:24:51.593013 sshd-session[6685]: pam_unix(sshd:session): session closed for user core
May 27 03:24:51.597064 systemd[1]: sshd@63-10.0.0.98:22-10.0.0.1:57006.service: Deactivated successfully.
May 27 03:24:51.599036 systemd[1]: session-64.scope: Deactivated successfully.
May 27 03:24:51.599784 systemd-logind[1549]: Session 64 logged out. Waiting for processes to exit.
May 27 03:24:51.600995 systemd-logind[1549]: Removed session 64.
May 27 03:24:52.665580 containerd[1564]: time="2025-05-27T03:24:52.665492417Z" level=warning msg="container event discarded" container=a3d7d8446da9e94d1e8c5b6c84ad0daac607bfe1c764d4df7e7555e20b14ee4d type=CONTAINER_CREATED_EVENT
May 27 03:24:52.751883 containerd[1564]: time="2025-05-27T03:24:52.751787783Z" level=warning msg="container event discarded" container=a3d7d8446da9e94d1e8c5b6c84ad0daac607bfe1c764d4df7e7555e20b14ee4d type=CONTAINER_STARTED_EVENT
May 27 03:24:56.610227 systemd[1]: Started sshd@64-10.0.0.98:22-10.0.0.1:44430.service - OpenSSH per-connection server daemon (10.0.0.1:44430).
May 27 03:24:56.659653 sshd[6700]: Accepted publickey for core from 10.0.0.1 port 44430 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:24:56.661246 sshd-session[6700]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:24:56.666056 systemd-logind[1549]: New session 65 of user core.
May 27 03:24:56.675582 systemd[1]: Started session-65.scope - Session 65 of User core.
May 27 03:24:56.785548 sshd[6702]: Connection closed by 10.0.0.1 port 44430
May 27 03:24:56.785835 sshd-session[6700]: pam_unix(sshd:session): session closed for user core
May 27 03:24:56.789708 systemd[1]: sshd@64-10.0.0.98:22-10.0.0.1:44430.service: Deactivated successfully.
May 27 03:24:56.791627 systemd[1]: session-65.scope: Deactivated successfully.
May 27 03:24:56.792517 systemd-logind[1549]: Session 65 logged out. Waiting for processes to exit.
May 27 03:24:56.793954 systemd-logind[1549]: Removed session 65.
May 27 03:24:57.631118 kubelet[2684]: E0527 03:24:57.631050 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b"
May 27 03:24:58.927648 containerd[1564]: time="2025-05-27T03:24:58.927417793Z" level=info msg="TaskExit event in podsandbox handler container_id:\"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\" id:\"54616b93b2846346d511bc32c0eff7172303baf23ad37f5679759f7a0658f241\" pid:6727 exited_at:{seconds:1748316298 nanos:926758416}"
May 27 03:25:01.807214 systemd[1]: Started sshd@65-10.0.0.98:22-10.0.0.1:44440.service - OpenSSH per-connection server daemon (10.0.0.1:44440).
May 27 03:25:01.931122 sshd[6740]: Accepted publickey for core from 10.0.0.1 port 44440 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:25:01.933006 sshd-session[6740]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:25:01.937920 systemd-logind[1549]: New session 66 of user core.
May 27 03:25:01.947593 systemd[1]: Started session-66.scope - Session 66 of User core.
May 27 03:25:02.096802 sshd[6744]: Connection closed by 10.0.0.1 port 44440
May 27 03:25:02.097037 sshd-session[6740]: pam_unix(sshd:session): session closed for user core
May 27 03:25:02.101761 systemd[1]: sshd@65-10.0.0.98:22-10.0.0.1:44440.service: Deactivated successfully.
May 27 03:25:02.103949 systemd[1]: session-66.scope: Deactivated successfully.
May 27 03:25:02.104785 systemd-logind[1549]: Session 66 logged out. Waiting for processes to exit.
May 27 03:25:02.106026 systemd-logind[1549]: Removed session 66.
May 27 03:25:02.502690 containerd[1564]: time="2025-05-27T03:25:02.502595392Z" level=warning msg="container event discarded" container=43528414385075c5d230078aa4dea29b5170b177c266d62685b62fd7a64521f6 type=CONTAINER_CREATED_EVENT
May 27 03:25:02.502690 containerd[1564]: time="2025-05-27T03:25:02.502653522Z" level=warning msg="container event discarded" container=43528414385075c5d230078aa4dea29b5170b177c266d62685b62fd7a64521f6 type=CONTAINER_STARTED_EVENT
May 27 03:25:02.577002 containerd[1564]: time="2025-05-27T03:25:02.576930297Z" level=warning msg="container event discarded" container=8b5999229a480e416ccfc473a74bb25b465077eced59be54f5837fe7b69152a1 type=CONTAINER_CREATED_EVENT
May 27 03:25:02.577002 containerd[1564]: time="2025-05-27T03:25:02.576978789Z" level=warning msg="container event discarded" container=8b5999229a480e416ccfc473a74bb25b465077eced59be54f5837fe7b69152a1 type=CONTAINER_STARTED_EVENT
May 27 03:25:02.631927 kubelet[2684]: E0527 03:25:02.631701 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648"
May 27 03:25:05.211970 containerd[1564]: time="2025-05-27T03:25:05.211816926Z" level=warning msg="container event discarded" container=a21c19d9778fba644506543930257984a942dfb44f853df913f7c07c03b87390 type=CONTAINER_CREATED_EVENT
May 27 03:25:05.292281 containerd[1564]: time="2025-05-27T03:25:05.292204420Z" level=warning msg="container event discarded" container=a21c19d9778fba644506543930257984a942dfb44f853df913f7c07c03b87390 type=CONTAINER_STARTED_EVENT
May 27 03:25:06.446033 containerd[1564]: time="2025-05-27T03:25:06.445939913Z" level=warning msg="container event discarded" container=87038f5275bfd7996fba99d52fab426382efcdd9b2a60e07cdaf5047192cb7e9 type=CONTAINER_CREATED_EVENT
May 27 03:25:06.536374 containerd[1564]: time="2025-05-27T03:25:06.536269021Z" level=warning msg="container event discarded" container=87038f5275bfd7996fba99d52fab426382efcdd9b2a60e07cdaf5047192cb7e9 type=CONTAINER_STARTED_EVENT
May 27 03:25:07.113712 systemd[1]: Started sshd@66-10.0.0.98:22-10.0.0.1:57292.service - OpenSSH per-connection server daemon (10.0.0.1:57292).
May 27 03:25:07.169025 sshd[6757]: Accepted publickey for core from 10.0.0.1 port 57292 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:25:07.170704 sshd-session[6757]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:25:07.175472 systemd-logind[1549]: New session 67 of user core.
May 27 03:25:07.181644 systemd[1]: Started session-67.scope - Session 67 of User core.
May 27 03:25:07.290533 sshd[6759]: Connection closed by 10.0.0.1 port 57292
May 27 03:25:07.290851 sshd-session[6757]: pam_unix(sshd:session): session closed for user core
May 27 03:25:07.295132 systemd[1]: sshd@66-10.0.0.98:22-10.0.0.1:57292.service: Deactivated successfully.
May 27 03:25:07.297238 systemd[1]: session-67.scope: Deactivated successfully.
May 27 03:25:07.298036 systemd-logind[1549]: Session 67 logged out. Waiting for processes to exit.
May 27 03:25:07.299425 systemd-logind[1549]: Removed session 67.
May 27 03:25:07.319309 containerd[1564]: time="2025-05-27T03:25:07.319228951Z" level=warning msg="container event discarded" container=87038f5275bfd7996fba99d52fab426382efcdd9b2a60e07cdaf5047192cb7e9 type=CONTAINER_STOPPED_EVENT May 27 03:25:11.199616 containerd[1564]: time="2025-05-27T03:25:11.199464456Z" level=warning msg="container event discarded" container=c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db type=CONTAINER_CREATED_EVENT May 27 03:25:11.279793 containerd[1564]: time="2025-05-27T03:25:11.279710471Z" level=warning msg="container event discarded" container=c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db type=CONTAINER_STARTED_EVENT May 27 03:25:12.306114 systemd[1]: Started sshd@67-10.0.0.98:22-10.0.0.1:57298.service - OpenSSH per-connection server daemon (10.0.0.1:57298). May 27 03:25:12.359384 sshd[6792]: Accepted publickey for core from 10.0.0.1 port 57298 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:25:12.360730 sshd-session[6792]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:12.364836 systemd-logind[1549]: New session 68 of user core. May 27 03:25:12.372556 systemd[1]: Started session-68.scope - Session 68 of User core. May 27 03:25:12.488177 sshd[6794]: Connection closed by 10.0.0.1 port 57298 May 27 03:25:12.488524 sshd-session[6792]: pam_unix(sshd:session): session closed for user core May 27 03:25:12.493146 systemd[1]: sshd@67-10.0.0.98:22-10.0.0.1:57298.service: Deactivated successfully. May 27 03:25:12.495257 systemd[1]: session-68.scope: Deactivated successfully. May 27 03:25:12.496155 systemd-logind[1549]: Session 68 logged out. Waiting for processes to exit. May 27 03:25:12.497493 systemd-logind[1549]: Removed session 68. 
May 27 03:25:12.631172 kubelet[2684]: E0527 03:25:12.631032 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:25:13.351404 containerd[1564]: time="2025-05-27T03:25:13.351323149Z" level=warning msg="container event discarded" container=c0ae793088e23d908f9379800a7852e19300bf64aadc927102d05b5c50d0f0db type=CONTAINER_STOPPED_EVENT May 27 03:25:16.631534 kubelet[2684]: E0527 03:25:16.631475 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: 
unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:25:17.505335 systemd[1]: Started sshd@68-10.0.0.98:22-10.0.0.1:36802.service - OpenSSH per-connection server daemon (10.0.0.1:36802). May 27 03:25:17.554459 sshd[6807]: Accepted publickey for core from 10.0.0.1 port 36802 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:25:17.556003 sshd-session[6807]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:17.560308 systemd-logind[1549]: New session 69 of user core. May 27 03:25:17.568561 systemd[1]: Started session-69.scope - Session 69 of User core. May 27 03:25:17.680206 sshd[6809]: Connection closed by 10.0.0.1 port 36802 May 27 03:25:17.680581 sshd-session[6807]: pam_unix(sshd:session): session closed for user core May 27 03:25:17.685369 systemd[1]: sshd@68-10.0.0.98:22-10.0.0.1:36802.service: Deactivated successfully. May 27 03:25:17.687630 systemd[1]: session-69.scope: Deactivated successfully. May 27 03:25:17.688551 systemd-logind[1549]: Session 69 logged out. Waiting for processes to exit. May 27 03:25:17.689911 systemd-logind[1549]: Removed session 69. May 27 03:25:18.946732 containerd[1564]: time="2025-05-27T03:25:18.946687400Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"e57416a1dc98111020e658d0266aceac0b0d151398e6095bbb5379b332ba637c\" pid:6832 exited_at:{seconds:1748316318 nanos:946479177}" May 27 03:25:22.692850 systemd[1]: Started sshd@69-10.0.0.98:22-10.0.0.1:36818.service - OpenSSH per-connection server daemon (10.0.0.1:36818). 
May 27 03:25:22.743099 sshd[6846]: Accepted publickey for core from 10.0.0.1 port 36818 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:25:22.744722 sshd-session[6846]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:22.749178 systemd-logind[1549]: New session 70 of user core. May 27 03:25:22.756591 systemd[1]: Started session-70.scope - Session 70 of User core. May 27 03:25:22.865261 sshd[6848]: Connection closed by 10.0.0.1 port 36818 May 27 03:25:22.865614 sshd-session[6846]: pam_unix(sshd:session): session closed for user core May 27 03:25:22.870241 systemd[1]: sshd@69-10.0.0.98:22-10.0.0.1:36818.service: Deactivated successfully. May 27 03:25:22.872360 systemd[1]: session-70.scope: Deactivated successfully. May 27 03:25:22.873196 systemd-logind[1549]: Session 70 logged out. Waiting for processes to exit. May 27 03:25:22.874584 systemd-logind[1549]: Removed session 70. May 27 03:25:25.806309 containerd[1564]: time="2025-05-27T03:25:25.806248362Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"0ab766de49b52b84a469de19823ebc9dd589155aa598c0bc797dbc32a21b005e\" pid:6872 exited_at:{seconds:1748316325 nanos:806029860}" May 27 03:25:26.587083 containerd[1564]: time="2025-05-27T03:25:26.586960500Z" level=warning msg="container event discarded" container=09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe type=CONTAINER_CREATED_EVENT May 27 03:25:26.842371 containerd[1564]: time="2025-05-27T03:25:26.842182275Z" level=warning msg="container event discarded" container=09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe type=CONTAINER_STARTED_EVENT May 27 03:25:27.630938 kubelet[2684]: E0527 03:25:27.630868 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:25:27.883094 systemd[1]: Started sshd@70-10.0.0.98:22-10.0.0.1:41686.service - OpenSSH per-connection server daemon (10.0.0.1:41686). May 27 03:25:27.935055 sshd[6883]: Accepted publickey for core from 10.0.0.1 port 41686 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:25:27.936830 sshd-session[6883]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:27.941560 systemd-logind[1549]: New session 71 of user core. May 27 03:25:27.956737 systemd[1]: Started session-71.scope - Session 71 of User core. May 27 03:25:28.079891 sshd[6885]: Connection closed by 10.0.0.1 port 41686 May 27 03:25:28.080271 sshd-session[6883]: pam_unix(sshd:session): session closed for user core May 27 03:25:28.085835 systemd[1]: sshd@70-10.0.0.98:22-10.0.0.1:41686.service: Deactivated successfully. May 27 03:25:28.088426 systemd[1]: session-71.scope: Deactivated successfully. May 27 03:25:28.089289 systemd-logind[1549]: Session 71 logged out. Waiting for processes to exit. May 27 03:25:28.091132 systemd-logind[1549]: Removed session 71. 
May 27 03:25:28.397245 containerd[1564]: time="2025-05-27T03:25:28.397175287Z" level=warning msg="container event discarded" container=2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a type=CONTAINER_CREATED_EVENT May 27 03:25:28.397245 containerd[1564]: time="2025-05-27T03:25:28.397224279Z" level=warning msg="container event discarded" container=2ea8b72b1da039bd085512dcd6220b3b0641e990d4e577ffee885613625b9b2a type=CONTAINER_STARTED_EVENT May 27 03:25:28.448497 containerd[1564]: time="2025-05-27T03:25:28.448421947Z" level=warning msg="container event discarded" container=ad3561719d3cd09926ecfb8c1f5c161da0d65c575061c7bb6b1440edf5675e25 type=CONTAINER_CREATED_EVENT May 27 03:25:28.524819 containerd[1564]: time="2025-05-27T03:25:28.524719487Z" level=warning msg="container event discarded" container=ad3561719d3cd09926ecfb8c1f5c161da0d65c575061c7bb6b1440edf5675e25 type=CONTAINER_STARTED_EVENT May 27 03:25:28.569195 containerd[1564]: time="2025-05-27T03:25:28.569117928Z" level=warning msg="container event discarded" container=1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801 type=CONTAINER_CREATED_EVENT May 27 03:25:28.569195 containerd[1564]: time="2025-05-27T03:25:28.569170298Z" level=warning msg="container event discarded" container=1e05e08f53176fb4e04d91534011a392f08db01a07c560837572062577ffb801 type=CONTAINER_STARTED_EVENT May 27 03:25:28.901155 containerd[1564]: time="2025-05-27T03:25:28.901100713Z" level=info msg="TaskExit event in podsandbox handler container_id:\"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\" id:\"0ef19c266d12b37cd60af71a52c75f3d4190a844107786eae188c0e08343693d\" pid:6909 exited_at:{seconds:1748316328 nanos:900770619}" May 27 03:25:29.631407 kubelet[2684]: E0527 03:25:29.631329 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:25:33.099899 systemd[1]: Started sshd@71-10.0.0.98:22-10.0.0.1:41694.service - OpenSSH per-connection server daemon (10.0.0.1:41694). May 27 03:25:33.149728 sshd[6922]: Accepted publickey for core from 10.0.0.1 port 41694 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:25:33.151692 sshd-session[6922]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:33.156839 systemd-logind[1549]: New session 72 of user core. May 27 03:25:33.161626 systemd[1]: Started session-72.scope - Session 72 of User core. May 27 03:25:33.272249 sshd[6924]: Connection closed by 10.0.0.1 port 41694 May 27 03:25:33.272631 sshd-session[6922]: pam_unix(sshd:session): session closed for user core May 27 03:25:33.276116 systemd[1]: sshd@71-10.0.0.98:22-10.0.0.1:41694.service: Deactivated successfully. May 27 03:25:33.278687 systemd[1]: session-72.scope: Deactivated successfully. May 27 03:25:33.280644 systemd-logind[1549]: Session 72 logged out. 
Waiting for processes to exit. May 27 03:25:33.282340 systemd-logind[1549]: Removed session 72. May 27 03:25:37.286254 containerd[1564]: time="2025-05-27T03:25:37.286165954Z" level=warning msg="container event discarded" container=287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea type=CONTAINER_CREATED_EVENT May 27 03:25:37.286254 containerd[1564]: time="2025-05-27T03:25:37.286232831Z" level=warning msg="container event discarded" container=287e8c1c28a5e13137cc7438650ca007be302ca265a9953a59ecd5efe519b1ea type=CONTAINER_STARTED_EVENT May 27 03:25:38.289194 systemd[1]: Started sshd@72-10.0.0.98:22-10.0.0.1:43640.service - OpenSSH per-connection server daemon (10.0.0.1:43640). May 27 03:25:38.346571 sshd[6938]: Accepted publickey for core from 10.0.0.1 port 43640 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:25:38.348373 sshd-session[6938]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:38.352813 systemd-logind[1549]: New session 73 of user core. May 27 03:25:38.362579 systemd[1]: Started session-73.scope - Session 73 of User core. May 27 03:25:38.473763 sshd[6940]: Connection closed by 10.0.0.1 port 43640 May 27 03:25:38.474107 sshd-session[6938]: pam_unix(sshd:session): session closed for user core May 27 03:25:38.478729 systemd[1]: sshd@72-10.0.0.98:22-10.0.0.1:43640.service: Deactivated successfully. May 27 03:25:38.481027 systemd[1]: session-73.scope: Deactivated successfully. May 27 03:25:38.481846 systemd-logind[1549]: Session 73 logged out. Waiting for processes to exit. May 27 03:25:38.483560 systemd-logind[1549]: Removed session 73. 
May 27 03:25:38.850164 containerd[1564]: time="2025-05-27T03:25:38.850086535Z" level=warning msg="container event discarded" container=3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4 type=CONTAINER_CREATED_EVENT May 27 03:25:38.850164 containerd[1564]: time="2025-05-27T03:25:38.850141259Z" level=warning msg="container event discarded" container=3356a486a0ccf6d4461f28d91239c0278557da0a1dbd309057f403e571abc4b4 type=CONTAINER_STARTED_EVENT May 27 03:25:39.013578 containerd[1564]: time="2025-05-27T03:25:39.013485097Z" level=warning msg="container event discarded" container=6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d type=CONTAINER_CREATED_EVENT May 27 03:25:39.013578 containerd[1564]: time="2025-05-27T03:25:39.013561171Z" level=warning msg="container event discarded" container=6288f91c5b3da80c6719ce27cfcf1521ae32d9bfec8cb729f0b3c294f41ab32d type=CONTAINER_STARTED_EVENT May 27 03:25:39.339924 containerd[1564]: time="2025-05-27T03:25:39.339835257Z" level=warning msg="container event discarded" container=4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c type=CONTAINER_CREATED_EVENT May 27 03:25:39.339924 containerd[1564]: time="2025-05-27T03:25:39.339890892Z" level=warning msg="container event discarded" container=4556959d9bc2f6e29f3aefede877ffedf1faaaec5e7a90c38acf63022d38394c type=CONTAINER_STARTED_EVENT May 27 03:25:40.600242 containerd[1564]: time="2025-05-27T03:25:40.600140353Z" level=warning msg="container event discarded" container=169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa type=CONTAINER_CREATED_EVENT May 27 03:25:40.600242 containerd[1564]: time="2025-05-27T03:25:40.600219663Z" level=warning msg="container event discarded" container=169561ef0d28694975b0c9975de69ed14cc6bc341a4973bf525d68029b10ddfa type=CONTAINER_STARTED_EVENT May 27 03:25:41.632682 containerd[1564]: time="2025-05-27T03:25:41.632603671Z" level=warning msg="container event discarded" 
container=9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187 type=CONTAINER_CREATED_EVENT May 27 03:25:41.632682 containerd[1564]: time="2025-05-27T03:25:41.632659465Z" level=warning msg="container event discarded" container=9dfd090af25eaf6630bdc4b39a4bac3f8f04a491b25776c55556e507f83e6187 type=CONTAINER_STARTED_EVENT May 27 03:25:41.814820 containerd[1564]: time="2025-05-27T03:25:41.814743672Z" level=warning msg="container event discarded" container=449c1641f591f4fde480b93f25e35129ed69ef5ba1eef082aea7467d053c17e0 type=CONTAINER_CREATED_EVENT May 27 03:25:41.814820 containerd[1564]: time="2025-05-27T03:25:41.814793316Z" level=warning msg="container event discarded" container=c22177655efe1c6453969e1f4931db51c9c9d2fd33237ddfc65e24e040bba099 type=CONTAINER_CREATED_EVENT May 27 03:25:41.884376 containerd[1564]: time="2025-05-27T03:25:41.884204358Z" level=warning msg="container event discarded" container=449c1641f591f4fde480b93f25e35129ed69ef5ba1eef082aea7467d053c17e0 type=CONTAINER_STARTED_EVENT May 27 03:25:41.912311 containerd[1564]: time="2025-05-27T03:25:41.912278760Z" level=warning msg="container event discarded" container=c22177655efe1c6453969e1f4931db51c9c9d2fd33237ddfc65e24e040bba099 type=CONTAINER_STARTED_EVENT May 27 03:25:42.631375 kubelet[2684]: E0527 03:25:42.631294 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 
03:25:42.632401 kubelet[2684]: E0527 03:25:42.632039 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:25:43.499687 systemd[1]: Started sshd@73-10.0.0.98:22-10.0.0.1:33760.service - OpenSSH per-connection server daemon (10.0.0.1:33760). May 27 03:25:43.558114 sshd[6953]: Accepted publickey for core from 10.0.0.1 port 33760 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:25:43.559604 sshd-session[6953]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:43.563924 systemd-logind[1549]: New session 74 of user core. May 27 03:25:43.572565 systemd[1]: Started session-74.scope - Session 74 of User core. 
May 27 03:25:43.675370 sshd[6955]: Connection closed by 10.0.0.1 port 33760 May 27 03:25:43.675703 sshd-session[6953]: pam_unix(sshd:session): session closed for user core May 27 03:25:43.679999 systemd[1]: sshd@73-10.0.0.98:22-10.0.0.1:33760.service: Deactivated successfully. May 27 03:25:43.682183 systemd[1]: session-74.scope: Deactivated successfully. May 27 03:25:43.682962 systemd-logind[1549]: Session 74 logged out. Waiting for processes to exit. May 27 03:25:43.684149 systemd-logind[1549]: Removed session 74. May 27 03:25:44.490561 containerd[1564]: time="2025-05-27T03:25:44.490491447Z" level=warning msg="container event discarded" container=fdd168b94199d75c781031878488439ee72249626b661bb8dc3dc56a4ff4071a type=CONTAINER_CREATED_EVENT May 27 03:25:44.618800 containerd[1564]: time="2025-05-27T03:25:44.618719323Z" level=warning msg="container event discarded" container=fdd168b94199d75c781031878488439ee72249626b661bb8dc3dc56a4ff4071a type=CONTAINER_STARTED_EVENT May 27 03:25:44.872644 containerd[1564]: time="2025-05-27T03:25:44.872510546Z" level=warning msg="container event discarded" container=9fa73b31e1aa64ce6e2bf19d30bea49b1f3e81d0f75f4f8c27826e0ad7615306 type=CONTAINER_CREATED_EVENT May 27 03:25:44.951784 containerd[1564]: time="2025-05-27T03:25:44.951750037Z" level=warning msg="container event discarded" container=9fa73b31e1aa64ce6e2bf19d30bea49b1f3e81d0f75f4f8c27826e0ad7615306 type=CONTAINER_STARTED_EVENT May 27 03:25:48.215195 containerd[1564]: time="2025-05-27T03:25:48.215097225Z" level=warning msg="container event discarded" container=b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956 type=CONTAINER_CREATED_EVENT May 27 03:25:48.363140 containerd[1564]: time="2025-05-27T03:25:48.363040684Z" level=warning msg="container event discarded" container=b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956 type=CONTAINER_STARTED_EVENT May 27 03:25:48.700764 systemd[1]: Started sshd@74-10.0.0.98:22-10.0.0.1:33776.service - OpenSSH 
per-connection server daemon (10.0.0.1:33776). May 27 03:25:48.765823 sshd[6970]: Accepted publickey for core from 10.0.0.1 port 33776 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:25:48.767827 sshd-session[6970]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:48.772725 systemd-logind[1549]: New session 75 of user core. May 27 03:25:48.788767 systemd[1]: Started session-75.scope - Session 75 of User core. May 27 03:25:48.898582 sshd[6972]: Connection closed by 10.0.0.1 port 33776 May 27 03:25:48.899030 sshd-session[6970]: pam_unix(sshd:session): session closed for user core May 27 03:25:48.903598 systemd[1]: sshd@74-10.0.0.98:22-10.0.0.1:33776.service: Deactivated successfully. May 27 03:25:48.907371 systemd[1]: session-75.scope: Deactivated successfully. May 27 03:25:48.908611 systemd-logind[1549]: Session 75 logged out. Waiting for processes to exit. May 27 03:25:48.910350 systemd-logind[1549]: Removed session 75. May 27 03:25:48.946297 containerd[1564]: time="2025-05-27T03:25:48.946247450Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"27b4775ab7562c28ea3494bfb06b2d987a51834f3cc12f861c186ca64d85205e\" pid:6997 exited_at:{seconds:1748316348 nanos:945949819}" May 27 03:25:50.536399 containerd[1564]: time="2025-05-27T03:25:50.536315701Z" level=warning msg="container event discarded" container=664c5053951c221cd53bdef5e2d8c39fb64803028bc6d45dec2906eb0662caa0 type=CONTAINER_CREATED_EVENT May 27 03:25:50.612644 containerd[1564]: time="2025-05-27T03:25:50.612565134Z" level=warning msg="container event discarded" container=664c5053951c221cd53bdef5e2d8c39fb64803028bc6d45dec2906eb0662caa0 type=CONTAINER_STARTED_EVENT May 27 03:25:53.916069 systemd[1]: Started sshd@75-10.0.0.98:22-10.0.0.1:42468.service - OpenSSH per-connection server daemon (10.0.0.1:42468). 
May 27 03:25:53.968810 sshd[7017]: Accepted publickey for core from 10.0.0.1 port 42468 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:25:53.970386 sshd-session[7017]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:53.974912 systemd-logind[1549]: New session 76 of user core. May 27 03:25:53.981578 systemd[1]: Started session-76.scope - Session 76 of User core. May 27 03:25:54.089153 sshd[7019]: Connection closed by 10.0.0.1 port 42468 May 27 03:25:54.089509 sshd-session[7017]: pam_unix(sshd:session): session closed for user core May 27 03:25:54.093754 systemd[1]: sshd@75-10.0.0.98:22-10.0.0.1:42468.service: Deactivated successfully. May 27 03:25:54.095776 systemd[1]: session-76.scope: Deactivated successfully. May 27 03:25:54.096708 systemd-logind[1549]: Session 76 logged out. Waiting for processes to exit. May 27 03:25:54.098010 systemd-logind[1549]: Removed session 76. May 27 03:25:55.631520 kubelet[2684]: E0527 03:25:55.631457 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status 
from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:25:56.630566 kubelet[2684]: E0527 03:25:56.630495 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b" May 27 03:25:58.923812 containerd[1564]: time="2025-05-27T03:25:58.923761102Z" level=info msg="TaskExit event in podsandbox handler container_id:\"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\" id:\"409384c4bb4cf8252c750b1f13c1e4503b4f6ae2b8ebffded53817d50a9a4cd9\" pid:7042 exited_at:{seconds:1748316358 nanos:923366678}" May 27 03:25:59.106131 systemd[1]: Started sshd@76-10.0.0.98:22-10.0.0.1:42472.service - OpenSSH per-connection server daemon (10.0.0.1:42472). May 27 03:25:59.163721 sshd[7056]: Accepted publickey for core from 10.0.0.1 port 42472 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:25:59.166029 sshd-session[7056]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:59.171170 systemd-logind[1549]: New session 77 of user core. May 27 03:25:59.182588 systemd[1]: Started session-77.scope - Session 77 of User core. 
May 27 03:25:59.293248 sshd[7058]: Connection closed by 10.0.0.1 port 42472 May 27 03:25:59.293681 sshd-session[7056]: pam_unix(sshd:session): session closed for user core May 27 03:25:59.298850 systemd[1]: sshd@76-10.0.0.98:22-10.0.0.1:42472.service: Deactivated successfully. May 27 03:25:59.301278 systemd[1]: session-77.scope: Deactivated successfully. May 27 03:25:59.302161 systemd-logind[1549]: Session 77 logged out. Waiting for processes to exit. May 27 03:25:59.303664 systemd-logind[1549]: Removed session 77. May 27 03:26:04.307727 systemd[1]: Started sshd@77-10.0.0.98:22-10.0.0.1:38374.service - OpenSSH per-connection server daemon (10.0.0.1:38374). May 27 03:26:04.362995 sshd[7074]: Accepted publickey for core from 10.0.0.1 port 38374 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:26:04.364423 sshd-session[7074]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:26:04.369156 systemd-logind[1549]: New session 78 of user core. May 27 03:26:04.376578 systemd[1]: Started session-78.scope - Session 78 of User core. May 27 03:26:04.486279 sshd[7076]: Connection closed by 10.0.0.1 port 38374 May 27 03:26:04.486694 sshd-session[7074]: pam_unix(sshd:session): session closed for user core May 27 03:26:04.491516 systemd[1]: sshd@77-10.0.0.98:22-10.0.0.1:38374.service: Deactivated successfully. May 27 03:26:04.493923 systemd[1]: session-78.scope: Deactivated successfully. May 27 03:26:04.494916 systemd-logind[1549]: Session 78 logged out. Waiting for processes to exit. May 27 03:26:04.496472 systemd-logind[1549]: Removed session 78. 
May 27 03:26:06.633644 kubelet[2684]: E0527 03:26:06.633568 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648" May 27 03:26:09.502289 systemd[1]: Started sshd@78-10.0.0.98:22-10.0.0.1:38376.service - OpenSSH per-connection server daemon (10.0.0.1:38376). May 27 03:26:09.557003 sshd[7090]: Accepted publickey for core from 10.0.0.1 port 38376 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg May 27 03:26:09.558568 sshd-session[7090]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:26:09.562978 systemd-logind[1549]: New session 79 of user core. May 27 03:26:09.572575 systemd[1]: Started session-79.scope - Session 79 of User core. 
May 27 03:26:09.681326 sshd[7092]: Connection closed by 10.0.0.1 port 38376
May 27 03:26:09.681628 sshd-session[7090]: pam_unix(sshd:session): session closed for user core
May 27 03:26:09.685805 systemd[1]: sshd@78-10.0.0.98:22-10.0.0.1:38376.service: Deactivated successfully.
May 27 03:26:09.687806 systemd[1]: session-79.scope: Deactivated successfully.
May 27 03:26:09.688570 systemd-logind[1549]: Session 79 logged out. Waiting for processes to exit.
May 27 03:26:09.689710 systemd-logind[1549]: Removed session 79.
May 27 03:26:11.630795 kubelet[2684]: E0527 03:26:11.630740 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b"
May 27 03:26:14.699216 systemd[1]: Started sshd@79-10.0.0.98:22-10.0.0.1:47766.service - OpenSSH per-connection server daemon (10.0.0.1:47766).
May 27 03:26:14.757089 sshd[7105]: Accepted publickey for core from 10.0.0.1 port 47766 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:26:14.759016 sshd-session[7105]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:26:14.763693 systemd-logind[1549]: New session 80 of user core.
May 27 03:26:14.775801 systemd[1]: Started session-80.scope - Session 80 of User core.
May 27 03:26:14.884909 sshd[7107]: Connection closed by 10.0.0.1 port 47766
May 27 03:26:14.885259 sshd-session[7105]: pam_unix(sshd:session): session closed for user core
May 27 03:26:14.889666 systemd[1]: sshd@79-10.0.0.98:22-10.0.0.1:47766.service: Deactivated successfully.
May 27 03:26:14.892235 systemd[1]: session-80.scope: Deactivated successfully.
May 27 03:26:14.893137 systemd-logind[1549]: Session 80 logged out. Waiting for processes to exit.
May 27 03:26:14.894901 systemd-logind[1549]: Removed session 80.
May 27 03:26:18.939806 containerd[1564]: time="2025-05-27T03:26:18.939725882Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"8accb912779c40762a26642c88bc6414ff432fda2e605d179b8a7b80f723557c\" pid:7131 exited_at:{seconds:1748316378 nanos:939264411}"
May 27 03:26:19.630985 containerd[1564]: time="2025-05-27T03:26:19.630945890Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\""
May 27 03:26:19.844990 containerd[1564]: time="2025-05-27T03:26:19.844934406Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 03:26:19.846389 containerd[1564]: time="2025-05-27T03:26:19.846349284Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden"
May 27 03:26:19.846540 containerd[1564]: time="2025-05-27T03:26:19.846466945Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86"
May 27 03:26:19.846616 kubelet[2684]: E0527 03:26:19.846568 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0"
May 27 03:26:19.846983 kubelet[2684]: E0527 03:26:19.846634 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0"
May 27 03:26:19.846983 kubelet[2684]: E0527 03:26:19.846757 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:767452f2f0924610a3259c1e7259b93b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kdkgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-556d57579d-8sdgb_calico-system(2c1ae830-ec2d-4322-b9d4-cc52915b6648): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 03:26:19.848711 containerd[1564]: time="2025-05-27T03:26:19.848669139Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\""
May 27 03:26:19.902195 systemd[1]: Started sshd@80-10.0.0.98:22-10.0.0.1:47772.service - OpenSSH per-connection server daemon (10.0.0.1:47772).
May 27 03:26:19.949447 sshd[7144]: Accepted publickey for core from 10.0.0.1 port 47772 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:26:19.951152 sshd-session[7144]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:26:19.955976 systemd-logind[1549]: New session 81 of user core.
May 27 03:26:19.965598 systemd[1]: Started session-81.scope - Session 81 of User core.
May 27 03:26:20.072817 sshd[7146]: Connection closed by 10.0.0.1 port 47772
May 27 03:26:20.073166 sshd-session[7144]: pam_unix(sshd:session): session closed for user core
May 27 03:26:20.076624 systemd[1]: sshd@80-10.0.0.98:22-10.0.0.1:47772.service: Deactivated successfully.
May 27 03:26:20.078727 systemd[1]: session-81.scope: Deactivated successfully.
May 27 03:26:20.080661 systemd-logind[1549]: Session 81 logged out. Waiting for processes to exit.
May 27 03:26:20.081962 systemd-logind[1549]: Removed session 81.
May 27 03:26:20.116775 containerd[1564]: time="2025-05-27T03:26:20.116728156Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 03:26:20.117861 containerd[1564]: time="2025-05-27T03:26:20.117824664Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden"
May 27 03:26:20.117966 containerd[1564]: time="2025-05-27T03:26:20.117876101Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86"
May 27 03:26:20.118129 kubelet[2684]: E0527 03:26:20.118065 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0"
May 27 03:26:20.118129 kubelet[2684]: E0527 03:26:20.118123 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0"
May 27 03:26:20.118287 kubelet[2684]: E0527 03:26:20.118244 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kdkgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-556d57579d-8sdgb_calico-system(2c1ae830-ec2d-4322-b9d4-cc52915b6648): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 03:26:20.119446 kubelet[2684]: E0527 03:26:20.119390 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648"
May 27 03:26:25.089485 systemd[1]: Started sshd@81-10.0.0.98:22-10.0.0.1:41536.service - OpenSSH per-connection server daemon (10.0.0.1:41536).
May 27 03:26:25.142941 sshd[7160]: Accepted publickey for core from 10.0.0.1 port 41536 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:26:25.144330 sshd-session[7160]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:26:25.148528 systemd-logind[1549]: New session 82 of user core.
May 27 03:26:25.157569 systemd[1]: Started session-82.scope - Session 82 of User core.
May 27 03:26:25.261612 sshd[7162]: Connection closed by 10.0.0.1 port 41536
May 27 03:26:25.261962 sshd-session[7160]: pam_unix(sshd:session): session closed for user core
May 27 03:26:25.266364 systemd[1]: sshd@81-10.0.0.98:22-10.0.0.1:41536.service: Deactivated successfully.
May 27 03:26:25.268364 systemd[1]: session-82.scope: Deactivated successfully.
May 27 03:26:25.269173 systemd-logind[1549]: Session 82 logged out. Waiting for processes to exit.
May 27 03:26:25.270419 systemd-logind[1549]: Removed session 82.
May 27 03:26:25.808888 containerd[1564]: time="2025-05-27T03:26:25.808828024Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"01b7e015f9890cc444daf4d1d93bef78a964ba1a8e2e2b05384bdd2210aa1ba2\" pid:7185 exited_at:{seconds:1748316385 nanos:808590226}"
May 27 03:26:26.631108 containerd[1564]: time="2025-05-27T03:26:26.631053492Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\""
May 27 03:26:26.883633 containerd[1564]: time="2025-05-27T03:26:26.883502273Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 03:26:26.884731 containerd[1564]: time="2025-05-27T03:26:26.884686016Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden"
May 27 03:26:26.884817 containerd[1564]: time="2025-05-27T03:26:26.884753032Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86"
May 27 03:26:26.884917 kubelet[2684]: E0527 03:26:26.884876 2684 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 03:26:26.885202 kubelet[2684]: E0527 03:26:26.884923 2684 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 03:26:26.885202 kubelet[2684]: E0527 03:26:26.885065 2684 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7sf2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-nkjnr_calico-system(a883863a-d79b-4a80-911d-ab857d7d891b): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 03:26:26.889325 kubelet[2684]: E0527 03:26:26.889276 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b"
May 27 03:26:28.906275 containerd[1564]: time="2025-05-27T03:26:28.906223377Z" level=info msg="TaskExit event in podsandbox handler container_id:\"09d3944e51fd3a6c6a65b070f5fc8228e51255eec33305e931f7699286e81bbe\" id:\"7ef625870d20dc3b6d2c5736afe4b362ce5b4827a55270b1b34ae57f1ec4e8e4\" pid:7207 exited_at:{seconds:1748316388 nanos:905864200}"
May 27 03:26:30.278308 systemd[1]: Started sshd@82-10.0.0.98:22-10.0.0.1:41542.service - OpenSSH per-connection server daemon (10.0.0.1:41542).
May 27 03:26:30.331590 sshd[7220]: Accepted publickey for core from 10.0.0.1 port 41542 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:26:30.333179 sshd-session[7220]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:26:30.338163 systemd-logind[1549]: New session 83 of user core.
May 27 03:26:30.346608 systemd[1]: Started session-83.scope - Session 83 of User core.
May 27 03:26:30.454358 sshd[7222]: Connection closed by 10.0.0.1 port 41542
May 27 03:26:30.454716 sshd-session[7220]: pam_unix(sshd:session): session closed for user core
May 27 03:26:30.459254 systemd[1]: sshd@82-10.0.0.98:22-10.0.0.1:41542.service: Deactivated successfully.
May 27 03:26:30.461652 systemd[1]: session-83.scope: Deactivated successfully.
May 27 03:26:30.462502 systemd-logind[1549]: Session 83 logged out. Waiting for processes to exit.
May 27 03:26:30.463777 systemd-logind[1549]: Removed session 83.
May 27 03:26:33.631556 kubelet[2684]: E0527 03:26:33.631399 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648"
May 27 03:26:35.480116 systemd[1]: Started sshd@83-10.0.0.98:22-10.0.0.1:56678.service - OpenSSH per-connection server daemon (10.0.0.1:56678).
May 27 03:26:35.533184 sshd[7236]: Accepted publickey for core from 10.0.0.1 port 56678 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:26:35.534733 sshd-session[7236]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:26:35.539295 systemd-logind[1549]: New session 84 of user core.
May 27 03:26:35.543578 systemd[1]: Started session-84.scope - Session 84 of User core.
May 27 03:26:35.653379 sshd[7238]: Connection closed by 10.0.0.1 port 56678
May 27 03:26:35.653734 sshd-session[7236]: pam_unix(sshd:session): session closed for user core
May 27 03:26:35.658508 systemd[1]: sshd@83-10.0.0.98:22-10.0.0.1:56678.service: Deactivated successfully.
May 27 03:26:35.660872 systemd[1]: session-84.scope: Deactivated successfully.
May 27 03:26:35.661686 systemd-logind[1549]: Session 84 logged out. Waiting for processes to exit.
May 27 03:26:35.662987 systemd-logind[1549]: Removed session 84.
May 27 03:26:40.630826 kubelet[2684]: E0527 03:26:40.630694 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-nkjnr" podUID="a883863a-d79b-4a80-911d-ab857d7d891b"
May 27 03:26:40.667782 systemd[1]: Started sshd@84-10.0.0.98:22-10.0.0.1:56694.service - OpenSSH per-connection server daemon (10.0.0.1:56694).
May 27 03:26:40.715904 sshd[7252]: Accepted publickey for core from 10.0.0.1 port 56694 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:26:40.717707 sshd-session[7252]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:26:40.722030 systemd-logind[1549]: New session 85 of user core.
May 27 03:26:40.732566 systemd[1]: Started session-85.scope - Session 85 of User core.
May 27 03:26:40.842083 sshd[7254]: Connection closed by 10.0.0.1 port 56694
May 27 03:26:40.842397 sshd-session[7252]: pam_unix(sshd:session): session closed for user core
May 27 03:26:40.847189 systemd[1]: sshd@84-10.0.0.98:22-10.0.0.1:56694.service: Deactivated successfully.
May 27 03:26:40.849523 systemd[1]: session-85.scope: Deactivated successfully.
May 27 03:26:40.850451 systemd-logind[1549]: Session 85 logged out. Waiting for processes to exit.
May 27 03:26:40.852223 systemd-logind[1549]: Removed session 85.
May 27 03:26:45.855707 systemd[1]: Started sshd@85-10.0.0.98:22-10.0.0.1:38554.service - OpenSSH per-connection server daemon (10.0.0.1:38554).
May 27 03:26:45.904605 sshd[7289]: Accepted publickey for core from 10.0.0.1 port 38554 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:26:45.905989 sshd-session[7289]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:26:45.910630 systemd-logind[1549]: New session 86 of user core.
May 27 03:26:45.918564 systemd[1]: Started session-86.scope - Session 86 of User core.
May 27 03:26:46.026916 sshd[7291]: Connection closed by 10.0.0.1 port 38554
May 27 03:26:46.027261 sshd-session[7289]: pam_unix(sshd:session): session closed for user core
May 27 03:26:46.032124 systemd[1]: sshd@85-10.0.0.98:22-10.0.0.1:38554.service: Deactivated successfully.
May 27 03:26:46.034159 systemd[1]: session-86.scope: Deactivated successfully.
May 27 03:26:46.034991 systemd-logind[1549]: Session 86 logged out. Waiting for processes to exit.
May 27 03:26:46.036221 systemd-logind[1549]: Removed session 86.
May 27 03:26:48.632052 kubelet[2684]: E0527 03:26:48.631804 2684 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-556d57579d-8sdgb" podUID="2c1ae830-ec2d-4322-b9d4-cc52915b6648"
May 27 03:26:48.941651 containerd[1564]: time="2025-05-27T03:26:48.941575839Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b39f72b14b6e9af0e305de0e74feccde9391a0ff56598ec343df0cf2b75d4956\" id:\"2a49502ed84a862a36fa1a8ff3ccd1158dd05aa443285849318468eff8b1d478\" pid:7316 exited_at:{seconds:1748316408 nanos:941080976}"
May 27 03:26:51.039963 systemd[1]: Started sshd@86-10.0.0.98:22-10.0.0.1:38556.service - OpenSSH per-connection server daemon (10.0.0.1:38556).
May 27 03:26:51.095134 sshd[7329]: Accepted publickey for core from 10.0.0.1 port 38556 ssh2: RSA SHA256:HN4XbSDsEGXKsR+b1h6s4C/jYnLvBqmszTOEbPjODIg
May 27 03:26:51.096610 sshd-session[7329]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:26:51.101183 systemd-logind[1549]: New session 87 of user core.
May 27 03:26:51.107579 systemd[1]: Started session-87.scope - Session 87 of User core.
May 27 03:26:51.214207 sshd[7331]: Connection closed by 10.0.0.1 port 38556
May 27 03:26:51.214598 sshd-session[7329]: pam_unix(sshd:session): session closed for user core
May 27 03:26:51.219423 systemd[1]: sshd@86-10.0.0.98:22-10.0.0.1:38556.service: Deactivated successfully.
May 27 03:26:51.221821 systemd[1]: session-87.scope: Deactivated successfully.
May 27 03:26:51.222975 systemd-logind[1549]: Session 87 logged out. Waiting for processes to exit.
May 27 03:26:51.224623 systemd-logind[1549]: Removed session 87.