May 27 17:02:21.802955 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] May 27 17:02:21.802978 kernel: Linux version 6.12.30-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Tue May 27 15:31:23 -00 2025 May 27 17:02:21.802988 kernel: KASLR enabled May 27 17:02:21.802994 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II May 27 17:02:21.802999 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390b8118 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218 May 27 17:02:21.803005 kernel: random: crng init done May 27 17:02:21.803012 kernel: secureboot: Secure boot disabled May 27 17:02:21.803017 kernel: ACPI: Early table checksum verification disabled May 27 17:02:21.803023 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS ) May 27 17:02:21.803029 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013) May 27 17:02:21.803036 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) May 27 17:02:21.803042 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001) May 27 17:02:21.803047 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001) May 27 17:02:21.803053 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) May 27 17:02:21.803060 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) May 27 17:02:21.803068 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) May 27 17:02:21.803074 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) May 27 17:02:21.803080 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) May 27 17:02:21.803086 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) May 27 17:02:21.803092 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013) May 27 17:02:21.803098 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600 May 27 17:02:21.803104 kernel: ACPI: Use ACPI SPCR as default console: Yes May 27 17:02:21.803110 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff] May 27 17:02:21.803116 kernel: NODE_DATA(0) allocated [mem 0x13967ddc0-0x139684fff] May 27 17:02:21.803122 kernel: Zone ranges: May 27 17:02:21.803129 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] May 27 17:02:21.803135 kernel: DMA32 empty May 27 17:02:21.803141 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff] May 27 17:02:21.803147 kernel: Device empty May 27 17:02:21.803152 kernel: Movable zone start for each node May 27 17:02:21.803158 kernel: Early memory node ranges May 27 17:02:21.803164 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff] May 27 17:02:21.803170 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff] May 27 17:02:21.803176 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff] May 27 17:02:21.803182 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff] May 27 17:02:21.803188 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff] May 27 17:02:21.803195 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff] May 27 17:02:21.803200 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff] May 27 17:02:21.803208 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff] May 27 17:02:21.803214 kernel: node 0: [mem 
0x0000000139fe0000-0x0000000139ffffff] May 27 17:02:21.803223 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff] May 27 17:02:21.803229 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges May 27 17:02:21.803235 kernel: psci: probing for conduit method from ACPI. May 27 17:02:21.803243 kernel: psci: PSCIv1.1 detected in firmware. May 27 17:02:21.803249 kernel: psci: Using standard PSCI v0.2 function IDs May 27 17:02:21.803255 kernel: psci: Trusted OS migration not required May 27 17:02:21.803262 kernel: psci: SMC Calling Convention v1.1 May 27 17:02:21.803268 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) May 27 17:02:21.803275 kernel: percpu: Embedded 33 pages/cpu s98136 r8192 d28840 u135168 May 27 17:02:21.803281 kernel: pcpu-alloc: s98136 r8192 d28840 u135168 alloc=33*4096 May 27 17:02:21.803288 kernel: pcpu-alloc: [0] 0 [0] 1 May 27 17:02:21.803294 kernel: Detected PIPT I-cache on CPU0 May 27 17:02:21.803301 kernel: CPU features: detected: GIC system register CPU interface May 27 17:02:21.803307 kernel: CPU features: detected: Spectre-v4 May 27 17:02:21.803315 kernel: CPU features: detected: Spectre-BHB May 27 17:02:21.803321 kernel: CPU features: kernel page table isolation forced ON by KASLR May 27 17:02:21.803328 kernel: CPU features: detected: Kernel page table isolation (KPTI) May 27 17:02:21.803335 kernel: CPU features: detected: ARM erratum 1418040 May 27 17:02:21.803341 kernel: CPU features: detected: SSBS not fully self-synchronizing May 27 17:02:21.803348 kernel: alternatives: applying boot alternatives May 27 17:02:21.803355 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=4e706b869299e1c88703222069cdfa08c45ebce568f762053eea5b3f5f0939c3 May 27 17:02:21.804288 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 27 17:02:21.804306 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) May 27 17:02:21.804314 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 27 17:02:21.804327 kernel: Fallback order for Node 0: 0 May 27 17:02:21.804334 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000 May 27 17:02:21.804340 kernel: Policy zone: Normal May 27 17:02:21.804347 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 27 17:02:21.804353 kernel: software IO TLB: area num 2. May 27 17:02:21.804360 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB) May 27 17:02:21.804410 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 May 27 17:02:21.804417 kernel: rcu: Preemptible hierarchical RCU implementation. May 27 17:02:21.804424 kernel: rcu: RCU event tracing is enabled. May 27 17:02:21.804431 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. May 27 17:02:21.804437 kernel: Trampoline variant of Tasks RCU enabled. May 27 17:02:21.804444 kernel: Tracing variant of Tasks RCU enabled. May 27 17:02:21.804452 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
May 27 17:02:21.804459 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 May 27 17:02:21.804466 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 27 17:02:21.804509 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 27 17:02:21.804519 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 May 27 17:02:21.804526 kernel: GICv3: 256 SPIs implemented May 27 17:02:21.804532 kernel: GICv3: 0 Extended SPIs implemented May 27 17:02:21.804539 kernel: Root IRQ handler: gic_handle_irq May 27 17:02:21.804545 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI May 27 17:02:21.804552 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 May 27 17:02:21.804558 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 May 27 17:02:21.804565 kernel: ITS [mem 0x08080000-0x0809ffff] May 27 17:02:21.804574 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1000c0000 (indirect, esz 8, psz 64K, shr 1) May 27 17:02:21.804581 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1000d0000 (flat, esz 8, psz 64K, shr 1) May 27 17:02:21.804587 kernel: GICv3: using LPI property table @0x00000001000e0000 May 27 17:02:21.804594 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100100000 May 27 17:02:21.804601 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. May 27 17:02:21.804607 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 17:02:21.804614 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). May 27 17:02:21.804634 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns May 27 17:02:21.804643 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns May 27 17:02:21.804650 kernel: Console: colour dummy device 80x25 May 27 17:02:21.804657 kernel: ACPI: Core revision 20240827 May 27 17:02:21.804665 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) May 27 17:02:21.804672 kernel: pid_max: default: 32768 minimum: 301 May 27 17:02:21.804679 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima May 27 17:02:21.804686 kernel: landlock: Up and running. May 27 17:02:21.804692 kernel: SELinux: Initializing. May 27 17:02:21.804700 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 27 17:02:21.804707 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 27 17:02:21.804714 kernel: rcu: Hierarchical SRCU implementation. May 27 17:02:21.804721 kernel: rcu: Max phase no-delay instances is 400. May 27 17:02:21.804729 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level May 27 17:02:21.804736 kernel: Remapping and enabling EFI services. May 27 17:02:21.804743 kernel: smp: Bringing up secondary CPUs ... May 27 17:02:21.804750 kernel: Detected PIPT I-cache on CPU1 May 27 17:02:21.804757 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 May 27 17:02:21.804764 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100110000 May 27 17:02:21.804771 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 17:02:21.804777 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] May 27 17:02:21.804784 kernel: smp: Brought up 1 node, 2 CPUs May 27 17:02:21.804792 kernel: SMP: Total of 2 processors activated. 
May 27 17:02:21.804804 kernel: CPU: All CPU(s) started at EL1 May 27 17:02:21.804811 kernel: CPU features: detected: 32-bit EL0 Support May 27 17:02:21.804820 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence May 27 17:02:21.804827 kernel: CPU features: detected: Common not Private translations May 27 17:02:21.804834 kernel: CPU features: detected: CRC32 instructions May 27 17:02:21.804841 kernel: CPU features: detected: Enhanced Virtualization Traps May 27 17:02:21.804848 kernel: CPU features: detected: RCpc load-acquire (LDAPR) May 27 17:02:21.804857 kernel: CPU features: detected: LSE atomic instructions May 27 17:02:21.804864 kernel: CPU features: detected: Privileged Access Never May 27 17:02:21.804871 kernel: CPU features: detected: RAS Extension Support May 27 17:02:21.804878 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) May 27 17:02:21.804885 kernel: alternatives: applying system-wide alternatives May 27 17:02:21.804892 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1 May 27 17:02:21.804900 kernel: Memory: 3876096K/4096000K available (11072K kernel code, 2276K rwdata, 8936K rodata, 39424K init, 1034K bss, 215028K reserved, 0K cma-reserved) May 27 17:02:21.804908 kernel: devtmpfs: initialized May 27 17:02:21.804915 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 27 17:02:21.804923 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) May 27 17:02:21.804930 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL May 27 17:02:21.804937 kernel: 0 pages in range for non-PLT usage May 27 17:02:21.804944 kernel: 508544 pages in range for PLT usage May 27 17:02:21.804951 kernel: pinctrl core: initialized pinctrl subsystem May 27 17:02:21.804959 kernel: SMBIOS 3.0.0 present. May 27 17:02:21.804966 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017 May 27 17:02:21.804973 kernel: DMI: Memory slots populated: 1/1 May 27 17:02:21.804980 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 27 17:02:21.804989 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations May 27 17:02:21.804996 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations May 27 17:02:21.805003 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations May 27 17:02:21.805010 kernel: audit: initializing netlink subsys (disabled) May 27 17:02:21.805017 kernel: audit: type=2000 audit(0.022:1): state=initialized audit_enabled=0 res=1 May 27 17:02:21.805025 kernel: thermal_sys: Registered thermal governor 'step_wise' May 27 17:02:21.805032 kernel: cpuidle: using governor menu May 27 17:02:21.805039 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
May 27 17:02:21.805046 kernel: ASID allocator initialised with 32768 entries May 27 17:02:21.805054 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 27 17:02:21.805061 kernel: Serial: AMBA PL011 UART driver May 27 17:02:21.805068 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages May 27 17:02:21.805075 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page May 27 17:02:21.805082 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages May 27 17:02:21.805090 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page May 27 17:02:21.805097 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 27 17:02:21.805104 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page May 27 17:02:21.805111 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages May 27 17:02:21.805120 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page May 27 17:02:21.805126 kernel: ACPI: Added _OSI(Module Device) May 27 17:02:21.805133 kernel: ACPI: Added _OSI(Processor Device) May 27 17:02:21.805140 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 27 17:02:21.805147 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 27 17:02:21.805154 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 27 17:02:21.805161 kernel: ACPI: Interpreter enabled May 27 17:02:21.805168 kernel: ACPI: Using GIC for interrupt routing May 27 17:02:21.805175 kernel: ACPI: MCFG table detected, 1 entries May 27 17:02:21.805184 kernel: ACPI: CPU0 has been hot-added May 27 17:02:21.805191 kernel: ACPI: CPU1 has been hot-added May 27 17:02:21.805198 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA May 27 17:02:21.805205 kernel: printk: legacy console [ttyAMA0] enabled May 27 17:02:21.805212 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) May 27 17:02:21.805419 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 27 17:02:21.805517 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] May 27 17:02:21.805590 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] May 27 17:02:21.805661 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 May 27 17:02:21.805726 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] May 27 17:02:21.805736 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] May 27 17:02:21.805744 kernel: PCI host bridge to bus 0000:00 May 27 17:02:21.805821 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] May 27 17:02:21.805886 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] May 27 17:02:21.805949 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] May 27 17:02:21.806010 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] May 27 17:02:21.806093 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint May 27 17:02:21.806171 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint May 27 17:02:21.806234 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff] May 27 17:02:21.806295 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref] May 27 17:02:21.806459 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:02:21.806553 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff] May 27 
17:02:21.806617 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] May 27 17:02:21.806679 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] May 27 17:02:21.806767 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref] May 27 17:02:21.806849 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:02:21.806916 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff] May 27 17:02:21.806978 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] May 27 17:02:21.807053 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff] May 27 17:02:21.807135 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:02:21.807207 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff] May 27 17:02:21.807279 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] May 27 17:02:21.807355 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff] May 27 17:02:21.807455 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref] May 27 17:02:21.807588 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:02:21.807665 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff] May 27 17:02:21.807746 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] May 27 17:02:21.807813 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff] May 27 17:02:21.807876 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref] May 27 17:02:21.807946 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:02:21.808032 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff] May 27 17:02:21.808095 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] May 27 17:02:21.808160 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] May 27 17:02:21.808220 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref] May 27 17:02:21.808288 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:02:21.808349 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff] May 27 17:02:21.808492 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] May 27 17:02:21.808568 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff] May 27 17:02:21.808629 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref] May 27 17:02:21.808706 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:02:21.808769 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff] May 27 17:02:21.808831 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] May 27 17:02:21.808891 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff] May 27 17:02:21.808952 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref] May 27 17:02:21.809025 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:02:21.809088 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff] May 27 17:02:21.809154 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] May 27 17:02:21.809215 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff] May 27 17:02:21.809284 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:02:21.809350 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff] May 27 17:02:21.809506 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] May 27 17:02:21.809580 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff] 
May 27 17:02:21.809657 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 conventional PCI endpoint May 27 17:02:21.809723 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007] May 27 17:02:21.809808 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint May 27 17:02:21.809877 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff] May 27 17:02:21.809945 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] May 27 17:02:21.810008 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref] May 27 17:02:21.810089 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint May 27 17:02:21.810157 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit] May 27 17:02:21.810228 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint May 27 17:02:21.810292 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff] May 27 17:02:21.810354 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref] May 27 17:02:21.810444 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint May 27 17:02:21.810553 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref] May 27 17:02:21.810638 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint May 27 17:02:21.810704 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff] May 27 17:02:21.810767 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref] May 27 17:02:21.810839 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint May 27 17:02:21.810906 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff] May 27 17:02:21.810973 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref] May 27 17:02:21.811053 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint May 27 17:02:21.811121 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff] May 27 17:02:21.811185 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref] May 27 17:02:21.812576 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref] May 27 17:02:21.812665 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 May 27 17:02:21.812729 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 May 27 17:02:21.812790 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 May 27 17:02:21.812857 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 May 27 17:02:21.812926 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 May 27 17:02:21.812986 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 May 27 17:02:21.813056 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 May 27 17:02:21.813133 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 May 27 17:02:21.813198 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 May 27 17:02:21.813267 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 May 27 17:02:21.813330 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 
100000 May 27 17:02:21.814622 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 May 27 17:02:21.814727 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 May 27 17:02:21.814799 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 May 27 17:02:21.814881 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000 May 27 17:02:21.814965 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 May 27 17:02:21.815030 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 May 27 17:02:21.815099 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 May 27 17:02:21.815166 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 May 27 17:02:21.815227 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000 May 27 17:02:21.815289 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000 May 27 17:02:21.815356 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 May 27 17:02:21.815470 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 May 27 17:02:21.815595 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 May 27 17:02:21.815685 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 May 27 17:02:21.815752 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 May 27 17:02:21.815816 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000 May 27 17:02:21.815882 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned May 27 17:02:21.815949 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned May 27 17:02:21.816014 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned May 27 17:02:21.816077 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned May 27 17:02:21.816145 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned May 27 17:02:21.816206 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned May 27 17:02:21.816279 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned May 27 17:02:21.816354 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned May 27 17:02:21.817601 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned May 27 17:02:21.817677 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned May 27 17:02:21.817744 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned May 27 17:02:21.817806 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned May 27 17:02:21.817879 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned May 27 17:02:21.817942 kernel: pci 0000:00:02.6: 
bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned May 27 17:02:21.818027 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned May 27 17:02:21.818093 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned May 27 17:02:21.818168 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned May 27 17:02:21.818229 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned May 27 17:02:21.818296 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned May 27 17:02:21.818358 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned May 27 17:02:21.818448 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned May 27 17:02:21.818531 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned May 27 17:02:21.818599 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned May 27 17:02:21.818660 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned May 27 17:02:21.818729 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned May 27 17:02:21.818812 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned May 27 17:02:21.818880 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned May 27 17:02:21.818960 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned May 27 17:02:21.819043 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned May 27 17:02:21.819110 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned May 27 17:02:21.819202 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned May 27 17:02:21.819279 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned May 27 17:02:21.819349 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned May 27 17:02:21.820543 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned May 27 17:02:21.820646 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned May 27 17:02:21.820714 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned May 27 17:02:21.820790 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned May 27 17:02:21.820856 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned May 27 17:02:21.820925 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned May 27 17:02:21.820996 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned May 27 17:02:21.821072 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned May 27 17:02:21.821137 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned May 27 17:02:21.821202 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] May 27 17:02:21.821265 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff] May 27 17:02:21.821326 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff] May 27 17:02:21.821408 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] May 27 17:02:21.821495 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned May 27 17:02:21.821569 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] May 27 17:02:21.821631 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff] May 27 17:02:21.821691 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff] May 27 17:02:21.821754 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] May 
27 17:02:21.821824 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned May 27 17:02:21.821888 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned May 27 17:02:21.821955 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] May 27 17:02:21.822030 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff] May 27 17:02:21.822090 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff] May 27 17:02:21.822151 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] May 27 17:02:21.822222 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned May 27 17:02:21.822285 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] May 27 17:02:21.822345 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff] May 27 17:02:21.823430 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff] May 27 17:02:21.823538 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] May 27 17:02:21.823636 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned May 27 17:02:21.823712 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned May 27 17:02:21.823780 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] May 27 17:02:21.823842 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff] May 27 17:02:21.823904 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] May 27 17:02:21.823964 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] May 27 17:02:21.824042 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned May 27 17:02:21.824114 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned May 27 17:02:21.824191 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] May 27 17:02:21.824256 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff] May 27 17:02:21.824318 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff] May 27 17:02:21.824400 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] May 27 17:02:21.824486 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned May 27 17:02:21.824562 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned May 27 17:02:21.824627 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned May 27 17:02:21.824692 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] May 27 17:02:21.824768 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff] May 27 17:02:21.824835 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff] May 27 17:02:21.824901 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] May 27 17:02:21.824970 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] May 27 17:02:21.825037 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff] May 27 17:02:21.825099 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff] May 27 17:02:21.825164 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] May 27 17:02:21.825229 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] May 27 17:02:21.825292 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff] May 27 17:02:21.825352 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff] May 27 17:02:21.825597 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] May 27 17:02:21.825672 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] 
May 27 17:02:21.825740 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] May 27 17:02:21.825800 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] May 27 17:02:21.825869 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] May 27 17:02:21.825931 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] May 27 17:02:21.825998 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] May 27 17:02:21.826069 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff] May 27 17:02:21.826132 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] May 27 17:02:21.826198 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] May 27 17:02:21.826268 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff] May 27 17:02:21.826329 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] May 27 17:02:21.826445 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] May 27 17:02:21.826542 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] May 27 17:02:21.826603 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] May 27 17:02:21.826659 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] May 27 17:02:21.826722 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff] May 27 17:02:21.826779 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] May 27 17:02:21.826839 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] May 27 17:02:21.826903 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff] May 27 17:02:21.826960 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] May 27 17:02:21.827018 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] May 27 17:02:21.827087 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff] May 27 17:02:21.827143 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] May 27 17:02:21.827204 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] May 27 17:02:21.827273 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] May 27 17:02:21.827330 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] May 27 17:02:21.828318 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] May 27 17:02:21.828435 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] May 27 17:02:21.828554 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] May 27 17:02:21.828618 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] May 27 17:02:21.828628 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 May 27 17:02:21.828643 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 May 27 17:02:21.828651 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 May 27 17:02:21.828659 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 May 27 17:02:21.828666 kernel: iommu: Default domain type: Translated May 27 17:02:21.828674 kernel: iommu: DMA domain TLB invalidation policy: strict mode May 27 17:02:21.828682 kernel: efivars: Registered efivars operations May 27 17:02:21.828689 kernel: vgaarb: loaded May 27 17:02:21.828697 kernel: clocksource: Switched to clocksource arch_sys_counter May 27 17:02:21.828705 kernel: VFS: Disk quotas dquot_6.6.0 May 27 17:02:21.828715 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 27 17:02:21.828723 kernel: pnp: PnP ACPI init May 27 17:02:21.828842 kernel: system 00:00: [mem 
0x4010000000-0x401fffffff window] could not be reserved May 27 17:02:21.828855 kernel: pnp: PnP ACPI: found 1 devices May 27 17:02:21.828863 kernel: NET: Registered PF_INET protocol family May 27 17:02:21.828871 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) May 27 17:02:21.828879 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) May 27 17:02:21.828886 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 27 17:02:21.828896 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) May 27 17:02:21.828904 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) May 27 17:02:21.828912 kernel: TCP: Hash tables configured (established 32768 bind 32768) May 27 17:02:21.828919 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) May 27 17:02:21.828927 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) May 27 17:02:21.828935 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 27 17:02:21.829012 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) May 27 17:02:21.829024 kernel: PCI: CLS 0 bytes, default 64 May 27 17:02:21.829032 kernel: kvm [1]: HYP mode not available May 27 17:02:21.829041 kernel: Initialise system trusted keyrings May 27 17:02:21.829048 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 May 27 17:02:21.829055 kernel: Key type asymmetric registered May 27 17:02:21.829063 kernel: Asymmetric key parser 'x509' registered May 27 17:02:21.829070 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) May 27 17:02:21.829078 kernel: io scheduler mq-deadline registered May 27 17:02:21.829085 kernel: io scheduler kyber registered May 27 17:02:21.829093 kernel: io scheduler bfq registered May 27 17:02:21.829101 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 May 27 17:02:21.829171 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 May 27 17:02:21.829237 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 May 27 17:02:21.829299 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 27 17:02:21.831484 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 May 27 17:02:21.831637 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 May 27 17:02:21.831704 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 27 17:02:21.831802 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 May 27 17:02:21.831884 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 May 27 17:02:21.831951 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 27 17:02:21.832049 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 May 27 17:02:21.832130 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 May 27 17:02:21.832195 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 27 17:02:21.832276 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 May 27 17:02:21.832345 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 May 27 17:02:21.834357 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- 
IbPresDis- LLActRep+ May 27 17:02:21.834618 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 May 27 17:02:21.834687 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 May 27 17:02:21.834778 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 27 17:02:21.834861 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 May 27 17:02:21.834935 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 May 27 17:02:21.834998 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 27 17:02:21.835063 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 May 27 17:02:21.835125 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 May 27 17:02:21.835191 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 27 17:02:21.835203 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 May 27 17:02:21.835266 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 May 27 17:02:21.835329 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 May 27 17:02:21.836291 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 27 17:02:21.836318 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 May 27 17:02:21.836326 kernel: ACPI: button: Power Button [PWRB] May 27 17:02:21.836335 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 May 27 17:02:21.836451 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) May 27 17:02:21.836590 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) May 27 17:02:21.836605 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 27 17:02:21.836613 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 May 27 17:02:21.836682 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) May 27 17:02:21.836693 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A May 27 17:02:21.836700 kernel: thunder_xcv, ver 1.0 May 27 17:02:21.836708 kernel: thunder_bgx, ver 1.0 May 27 17:02:21.836716 kernel: nicpf, ver 1.0 May 27 17:02:21.836727 kernel: nicvf, ver 1.0 May 27 17:02:21.836807 kernel: rtc-efi rtc-efi.0: registered as rtc0 May 27 17:02:21.836892 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-05-27T17:02:21 UTC (1748365341) May 27 17:02:21.836905 kernel: hid: raw HID events driver (C) Jiri Kosina May 27 17:02:21.836912 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available May 27 17:02:21.836920 kernel: watchdog: NMI not fully supported May 27 17:02:21.836927 kernel: watchdog: Hard watchdog permanently disabled May 27 17:02:21.836936 kernel: NET: Registered PF_INET6 protocol family May 27 17:02:21.836946 kernel: Segment Routing with IPv6 May 27 17:02:21.836954 kernel: In-situ OAM (IOAM) with IPv6 May 27 17:02:21.836962 kernel: NET: Registered PF_PACKET protocol family May 27 17:02:21.836969 kernel: Key type dns_resolver registered May 27 17:02:21.836977 kernel: registered taskstats version 1 May 27 17:02:21.836984 kernel: Loading compiled-in X.509 certificates May 27 17:02:21.836992 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.30-flatcar: 8e5e45c34fa91568ef1fa3bdfd5a71a43b4c4580' May 27 17:02:21.837008 kernel: Demotion targets for 
Node 0: null May 27 17:02:21.837017 kernel: Key type .fscrypt registered May 27 17:02:21.837026 kernel: Key type fscrypt-provisioning registered May 27 17:02:21.837033 kernel: ima: No TPM chip found, activating TPM-bypass! May 27 17:02:21.837040 kernel: ima: Allocated hash algorithm: sha1 May 27 17:02:21.837048 kernel: ima: No architecture policies found May 27 17:02:21.837055 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) May 27 17:02:21.837063 kernel: clk: Disabling unused clocks May 27 17:02:21.837070 kernel: PM: genpd: Disabling unused power domains May 27 17:02:21.837078 kernel: Warning: unable to open an initial console. May 27 17:02:21.837085 kernel: Freeing unused kernel memory: 39424K May 27 17:02:21.837094 kernel: Run /init as init process May 27 17:02:21.837102 kernel: with arguments: May 27 17:02:21.837109 kernel: /init May 27 17:02:21.837116 kernel: with environment: May 27 17:02:21.837123 kernel: HOME=/ May 27 17:02:21.837130 kernel: TERM=linux May 27 17:02:21.837138 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 27 17:02:21.837146 systemd[1]: Successfully made /usr/ read-only. May 27 17:02:21.837159 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 27 17:02:21.837168 systemd[1]: Detected virtualization kvm. May 27 17:02:21.837176 systemd[1]: Detected architecture arm64. May 27 17:02:21.837184 systemd[1]: Running in initrd. May 27 17:02:21.837191 systemd[1]: No hostname configured, using default hostname. May 27 17:02:21.837207 systemd[1]: Hostname set to . May 27 17:02:21.837215 systemd[1]: Initializing machine ID from VM UUID. May 27 17:02:21.837223 systemd[1]: Queued start job for default target initrd.target. May 27 17:02:21.837234 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 17:02:21.837243 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 17:02:21.837251 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 27 17:02:21.837259 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 27 17:02:21.837267 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 27 17:02:21.837276 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 27 17:02:21.837284 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 27 17:02:21.837294 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 27 17:02:21.837303 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 17:02:21.837311 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 27 17:02:21.837319 systemd[1]: Reached target paths.target - Path Units. May 27 17:02:21.837327 systemd[1]: Reached target slices.target - Slice Units. May 27 17:02:21.837334 systemd[1]: Reached target swap.target - Swaps. May 27 17:02:21.837342 systemd[1]: Reached target timers.target - Timer Units. 
May 27 17:02:21.837350 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 27 17:02:21.837391 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 27 17:02:21.837400 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 27 17:02:21.837408 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 27 17:02:21.837416 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 27 17:02:21.837424 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 27 17:02:21.837432 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 27 17:02:21.837440 systemd[1]: Reached target sockets.target - Socket Units. May 27 17:02:21.837448 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 27 17:02:21.837456 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 27 17:02:21.837466 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 27 17:02:21.837484 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). May 27 17:02:21.837493 systemd[1]: Starting systemd-fsck-usr.service... May 27 17:02:21.837501 systemd[1]: Starting systemd-journald.service - Journal Service... May 27 17:02:21.837509 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 27 17:02:21.837517 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 17:02:21.837525 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 27 17:02:21.837536 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 27 17:02:21.837577 systemd-journald[244]: Collecting audit messages is disabled. May 27 17:02:21.837600 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 27 17:02:21.837608 systemd[1]: Finished systemd-fsck-usr.service. May 27 17:02:21.837616 kernel: Bridge firewalling registered May 27 17:02:21.837624 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 27 17:02:21.837632 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 27 17:02:21.837640 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 27 17:02:21.837648 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:02:21.837658 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 27 17:02:21.837668 systemd-journald[244]: Journal started May 27 17:02:21.837686 systemd-journald[244]: Runtime Journal (/run/log/journal/de1d5d1355d74b63b1f0b7f38938bfb8) is 8M, max 76.5M, 68.5M free. May 27 17:02:21.790390 systemd-modules-load[245]: Inserted module 'overlay' May 27 17:02:21.840357 systemd[1]: Started systemd-journald.service - Journal Service. May 27 17:02:21.805753 systemd-modules-load[245]: Inserted module 'br_netfilter' May 27 17:02:21.841412 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 27 17:02:21.843227 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 17:02:21.848137 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
May 27 17:02:21.850643 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 27 17:02:21.866191 systemd-tmpfiles[270]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. May 27 17:02:21.869844 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 17:02:21.873592 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 17:02:21.876809 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 27 17:02:21.878854 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 27 17:02:21.881117 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 27 17:02:21.916501 dracut-cmdline[284]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=4e706b869299e1c88703222069cdfa08c45ebce568f762053eea5b3f5f0939c3 May 27 17:02:21.931521 systemd-resolved[285]: Positive Trust Anchors: May 27 17:02:21.931539 systemd-resolved[285]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 27 17:02:21.931571 systemd-resolved[285]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 27 17:02:21.942468 systemd-resolved[285]: Defaulting to hostname 'linux'. May 27 17:02:21.943631 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 27 17:02:21.944255 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 27 17:02:22.017424 kernel: SCSI subsystem initialized May 27 17:02:22.021405 kernel: Loading iSCSI transport class v2.0-870. May 27 17:02:22.029813 kernel: iscsi: registered transport (tcp) May 27 17:02:22.042407 kernel: iscsi: registered transport (qla4xxx) May 27 17:02:22.042510 kernel: QLogic iSCSI HBA Driver May 27 17:02:22.066402 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 27 17:02:22.096969 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 27 17:02:22.103950 systemd[1]: Reached target network-pre.target - Preparation for Network. May 27 17:02:22.157948 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 27 17:02:22.159938 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
May 27 17:02:22.230435 kernel: raid6: neonx8 gen() 15597 MB/s May 27 17:02:22.247443 kernel: raid6: neonx4 gen() 15729 MB/s May 27 17:02:22.264408 kernel: raid6: neonx2 gen() 13056 MB/s May 27 17:02:22.281439 kernel: raid6: neonx1 gen() 10419 MB/s May 27 17:02:22.298438 kernel: raid6: int64x8 gen() 6779 MB/s May 27 17:02:22.315436 kernel: raid6: int64x4 gen() 7284 MB/s May 27 17:02:22.332423 kernel: raid6: int64x2 gen() 6042 MB/s May 27 17:02:22.349426 kernel: raid6: int64x1 gen() 4995 MB/s May 27 17:02:22.349527 kernel: raid6: using algorithm neonx4 gen() 15729 MB/s May 27 17:02:22.366446 kernel: raid6: .... xor() 12297 MB/s, rmw enabled May 27 17:02:22.366559 kernel: raid6: using neon recovery algorithm May 27 17:02:22.371439 kernel: xor: measuring software checksum speed May 27 17:02:22.371538 kernel: 8regs : 21601 MB/sec May 27 17:02:22.371557 kernel: 32regs : 21687 MB/sec May 27 17:02:22.371573 kernel: arm64_neon : 26138 MB/sec May 27 17:02:22.372407 kernel: xor: using function: arm64_neon (26138 MB/sec) May 27 17:02:22.425430 kernel: Btrfs loaded, zoned=no, fsverity=no May 27 17:02:22.434835 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 27 17:02:22.437410 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 17:02:22.464113 systemd-udevd[493]: Using default interface naming scheme 'v255'. May 27 17:02:22.468578 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 17:02:22.471977 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 27 17:02:22.501115 dracut-pre-trigger[500]: rd.md=0: removing MD RAID activation May 27 17:02:22.528140 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 27 17:02:22.530974 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 27 17:02:22.595773 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 27 17:02:22.601439 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 27 17:02:22.681391 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues May 27 17:02:22.683394 kernel: scsi host0: Virtio SCSI HBA May 27 17:02:22.686712 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 May 27 17:02:22.686777 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 May 27 17:02:22.728406 kernel: ACPI: bus type USB registered May 27 17:02:22.728457 kernel: usbcore: registered new interface driver usbfs May 27 17:02:22.729612 kernel: usbcore: registered new interface driver hub May 27 17:02:22.730394 kernel: usbcore: registered new device driver usb May 27 17:02:22.749003 kernel: sd 0:0:0:1: Power-on or device reset occurred May 27 17:02:22.749201 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller May 27 17:02:22.749303 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) May 27 17:02:22.750880 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 May 27 17:02:22.751058 kernel: sd 0:0:0:1: [sda] Write Protect is off May 27 17:02:22.751156 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 May 27 17:02:22.751309 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 May 27 17:02:22.751612 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 17:02:22.751735 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
May 27 17:02:22.753595 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 27 17:02:22.756586 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA May 27 17:02:22.756777 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller May 27 17:02:22.756937 kernel: sr 0:0:0:0: Power-on or device reset occurred May 27 17:02:22.757061 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 May 27 17:02:22.757843 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed May 27 17:02:22.758144 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 17:02:22.762407 kernel: hub 1-0:1.0: USB hub found May 27 17:02:22.762590 kernel: hub 1-0:1.0: 4 ports detected May 27 17:02:22.762669 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. May 27 17:02:22.762757 kernel: hub 2-0:1.0: USB hub found May 27 17:02:22.762838 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray May 27 17:02:22.762927 kernel: hub 2-0:1.0: 4 ports detected May 27 17:02:22.768587 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 27 17:02:22.768650 kernel: GPT:17805311 != 80003071 May 27 17:02:22.768663 kernel: GPT:Alternate GPT header not at the end of the disk. May 27 17:02:22.768707 kernel: GPT:17805311 != 80003071 May 27 17:02:22.768722 kernel: GPT: Use GNU Parted to correct GPT errors. May 27 17:02:22.768733 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 May 27 17:02:22.770413 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 27 17:02:22.771760 kernel: sd 0:0:0:1: [sda] Attached SCSI disk May 27 17:02:22.771915 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 May 27 17:02:22.793333 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:02:22.849033 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. May 27 17:02:22.857835 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. May 27 17:02:22.858511 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. May 27 17:02:22.870223 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. May 27 17:02:22.880242 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. May 27 17:02:22.885541 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 27 17:02:22.890558 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 27 17:02:22.892020 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 27 17:02:22.893465 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 17:02:22.894077 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 27 17:02:22.904658 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 27 17:02:22.912432 disk-uuid[599]: Primary Header is updated. May 27 17:02:22.912432 disk-uuid[599]: Secondary Entries is updated. May 27 17:02:22.912432 disk-uuid[599]: Secondary Header is updated. May 27 17:02:22.923432 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 27 17:02:22.930359 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
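The GPT warnings above ("GPT:17805311 != 80003071") appear because the backup GPT header must sit on the disk's last LBA; after the image was grown to the full 41 GB volume, the recorded location no longer matches, and disk-uuid.service rewrites the secondary header, as the "Secondary Header is updated" entries above show. The arithmetic, with numbers taken directly from this log, is sketched below.

```python
# Consistency check behind the "Alternate GPT header not at the end of the disk" and
# "GPT:17805311 != 80003071" messages above, using values from this log.
total_sectors    = 80003072              # sda: 80003072 512-byte logical blocks (41.0 GB)
expected_alt_lba = total_sectors - 1     # the backup GPT header belongs on the last LBA
recorded_alt_lba = 17805311              # still points at the original image's last LBA

if recorded_alt_lba != expected_alt_lba:
    print(f"GPT:{recorded_alt_lba} != {expected_alt_lba}: backup header must be relocated")
```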
May 27 17:02:22.995397 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd May 27 17:02:23.129585 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 May 27 17:02:23.129661 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 May 27 17:02:23.130030 kernel: usbcore: registered new interface driver usbhid May 27 17:02:23.130073 kernel: usbhid: USB HID core driver May 27 17:02:23.235441 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd May 27 17:02:23.361511 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 May 27 17:02:23.413407 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 May 27 17:02:23.945456 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 27 17:02:23.946157 disk-uuid[600]: The operation has completed successfully. May 27 17:02:24.011323 systemd[1]: disk-uuid.service: Deactivated successfully. May 27 17:02:24.011502 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 27 17:02:24.043857 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 27 17:02:24.071860 sh[623]: Success May 27 17:02:24.086389 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 27 17:02:24.086451 kernel: device-mapper: uevent: version 1.0.3 May 27 17:02:24.087395 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 27 17:02:24.098516 kernel: device-mapper: verity: sha256 using shash "sha256-ce" May 27 17:02:24.153496 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 27 17:02:24.155817 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 27 17:02:24.179597 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. May 27 17:02:24.191388 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 27 17:02:24.193392 kernel: BTRFS: device fsid 3c8c76ef-f1da-40fe-979d-11bdf765e403 devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (635) May 27 17:02:24.195144 kernel: BTRFS info (device dm-0): first mount of filesystem 3c8c76ef-f1da-40fe-979d-11bdf765e403 May 27 17:02:24.195189 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm May 27 17:02:24.195201 kernel: BTRFS info (device dm-0): using free-space-tree May 27 17:02:24.203993 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 27 17:02:24.205585 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 27 17:02:24.206910 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 27 17:02:24.208173 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 27 17:02:24.211302 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
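verity-setup above binds /dev/mapper/usr to the root hash passed as verity.usrhash= on the kernel command line. The sketch below is only a conceptual illustration of a block hash tree collapsing to a single root hash; the real dm-verity on-disk format (superblock, salt, hash block layout) is intentionally omitted, and the block size and fan-out are assumptions for the illustration.

```python
# Conceptual sketch only: the read-only /usr image is hashed block by block and the
# digests are combined upward into one root hash, which must match the verity.usrhash=
# value from the kernel command line. This is NOT the real dm-verity layout.
import hashlib

BLOCK = 4096          # assumed data-block size for this illustration
FANOUT = 128          # digests combined per tree node in this sketch

def toy_verity_root(image: bytes) -> str:
    level = [hashlib.sha256(image[i:i + BLOCK]).digest()
             for i in range(0, len(image), BLOCK)] or [hashlib.sha256(b"").digest()]
    while len(level) > 1:
        level = [hashlib.sha256(b"".join(level[i:i + FANOUT])).digest()
                 for i in range(0, len(level), FANOUT)]
    return level[0].hex()

print(toy_verity_root(b"\x00" * (8 * BLOCK)))   # flipping any byte changes the root hash
```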
May 27 17:02:24.243413 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 (8:6) scanned by mount (666) May 27 17:02:24.245407 kernel: BTRFS info (device sda6): first mount of filesystem 0631e8fb-ef71-4ba1-b2b8-88386996a754 May 27 17:02:24.245451 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm May 27 17:02:24.245475 kernel: BTRFS info (device sda6): using free-space-tree May 27 17:02:24.259395 kernel: BTRFS info (device sda6): last unmount of filesystem 0631e8fb-ef71-4ba1-b2b8-88386996a754 May 27 17:02:24.262412 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 27 17:02:24.264325 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 27 17:02:24.348716 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 27 17:02:24.352428 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 27 17:02:24.388099 systemd-networkd[806]: lo: Link UP May 27 17:02:24.388109 systemd-networkd[806]: lo: Gained carrier May 27 17:02:24.389767 systemd-networkd[806]: Enumeration completed May 27 17:02:24.389877 systemd[1]: Started systemd-networkd.service - Network Configuration. May 27 17:02:24.390595 systemd[1]: Reached target network.target - Network. May 27 17:02:24.391718 systemd-networkd[806]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:02:24.391721 systemd-networkd[806]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 17:02:24.392431 systemd-networkd[806]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:02:24.392435 systemd-networkd[806]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 17:02:24.393083 systemd-networkd[806]: eth0: Link UP May 27 17:02:24.393086 systemd-networkd[806]: eth0: Gained carrier May 27 17:02:24.393095 systemd-networkd[806]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:02:24.401045 systemd-networkd[806]: eth1: Link UP May 27 17:02:24.401049 systemd-networkd[806]: eth1: Gained carrier May 27 17:02:24.401062 systemd-networkd[806]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:02:24.415536 ignition[719]: Ignition 2.21.0 May 27 17:02:24.415548 ignition[719]: Stage: fetch-offline May 27 17:02:24.415589 ignition[719]: no configs at "/usr/lib/ignition/base.d" May 27 17:02:24.418054 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 27 17:02:24.415600 ignition[719]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 27 17:02:24.415874 ignition[719]: parsed url from cmdline: "" May 27 17:02:24.415878 ignition[719]: no config URL provided May 27 17:02:24.415884 ignition[719]: reading system config file "/usr/lib/ignition/user.ign" May 27 17:02:24.423543 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
May 27 17:02:24.415891 ignition[719]: no config at "/usr/lib/ignition/user.ign" May 27 17:02:24.415896 ignition[719]: failed to fetch config: resource requires networking May 27 17:02:24.416083 ignition[719]: Ignition finished successfully May 27 17:02:24.428594 systemd-networkd[806]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 May 27 17:02:24.449853 ignition[815]: Ignition 2.21.0 May 27 17:02:24.449867 ignition[815]: Stage: fetch May 27 17:02:24.451503 systemd-networkd[806]: eth0: DHCPv4 address 91.99.121.210/32, gateway 172.31.1.1 acquired from 172.31.1.1 May 27 17:02:24.450008 ignition[815]: no configs at "/usr/lib/ignition/base.d" May 27 17:02:24.450016 ignition[815]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 27 17:02:24.450098 ignition[815]: parsed url from cmdline: "" May 27 17:02:24.450101 ignition[815]: no config URL provided May 27 17:02:24.450105 ignition[815]: reading system config file "/usr/lib/ignition/user.ign" May 27 17:02:24.450111 ignition[815]: no config at "/usr/lib/ignition/user.ign" May 27 17:02:24.450211 ignition[815]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 May 27 17:02:24.459789 ignition[815]: GET result: OK May 27 17:02:24.459934 ignition[815]: parsing config with SHA512: f146d7e9ec6017337f8b713eaf5aec6e819e713279194344867e43a1397fdcf2f1a2a2b9b95a309b193875e15a8f7871a31169a1a1fa2b8bda422431e0ebd6a8 May 27 17:02:24.469512 unknown[815]: fetched base config from "system" May 27 17:02:24.470086 unknown[815]: fetched base config from "system" May 27 17:02:24.470556 ignition[815]: fetch: fetch complete May 27 17:02:24.470093 unknown[815]: fetched user config from "hetzner" May 27 17:02:24.470563 ignition[815]: fetch: fetch passed May 27 17:02:24.470646 ignition[815]: Ignition finished successfully May 27 17:02:24.473679 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 27 17:02:24.475814 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 27 17:02:24.516613 ignition[823]: Ignition 2.21.0 May 27 17:02:24.516626 ignition[823]: Stage: kargs May 27 17:02:24.516792 ignition[823]: no configs at "/usr/lib/ignition/base.d" May 27 17:02:24.516802 ignition[823]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 27 17:02:24.520445 ignition[823]: kargs: kargs passed May 27 17:02:24.520921 ignition[823]: Ignition finished successfully May 27 17:02:24.524410 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 27 17:02:24.526887 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 27 17:02:24.554546 ignition[830]: Ignition 2.21.0 May 27 17:02:24.554563 ignition[830]: Stage: disks May 27 17:02:24.554706 ignition[830]: no configs at "/usr/lib/ignition/base.d" May 27 17:02:24.557621 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 27 17:02:24.554715 ignition[830]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 27 17:02:24.560208 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 27 17:02:24.556197 ignition[830]: disks: disks passed May 27 17:02:24.561871 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 27 17:02:24.556255 ignition[830]: Ignition finished successfully May 27 17:02:24.563157 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 17:02:24.564582 systemd[1]: Reached target sysinit.target - System Initialization. May 27 17:02:24.566108 systemd[1]: Reached target basic.target - Basic System. 
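The fetch stage above pulls the user data from Hetzner's link-local metadata endpoint and logs a SHA-512 of the config it parsed. Below is a hedged sketch of that fetch-then-hash pattern; it is not Ignition's implementation and only works from inside an instance where 169.254.169.254 is reachable.

```python
# Sketch of the fetch stage logged above: retrieve the user data from the Hetzner
# metadata endpoint and report a SHA-512 of what came back, mirroring the
# "parsing config with SHA512: ..." line. Illustration only, not Ignition itself.
import hashlib
import urllib.request

USERDATA_URL = "http://169.254.169.254/hetzner/v1/userdata"

with urllib.request.urlopen(USERDATA_URL, timeout=10) as resp:
    body = resp.read()

print("GET result: OK")
print("config SHA512:", hashlib.sha512(body).hexdigest())
```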
May 27 17:02:24.568252 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 27 17:02:24.596072 systemd-fsck[839]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks May 27 17:02:24.600949 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 27 17:02:24.605080 systemd[1]: Mounting sysroot.mount - /sysroot... May 27 17:02:24.686448 kernel: EXT4-fs (sda9): mounted filesystem a5483afc-8426-4c3e-85ef-8146f9077e7d r/w with ordered data mode. Quota mode: none. May 27 17:02:24.687883 systemd[1]: Mounted sysroot.mount - /sysroot. May 27 17:02:24.689733 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 27 17:02:24.693346 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 17:02:24.695706 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 27 17:02:24.708811 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... May 27 17:02:24.713975 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 27 17:02:24.714041 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 27 17:02:24.719736 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 27 17:02:24.721948 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 27 17:02:24.729000 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 (8:6) scanned by mount (847) May 27 17:02:24.730727 kernel: BTRFS info (device sda6): first mount of filesystem 0631e8fb-ef71-4ba1-b2b8-88386996a754 May 27 17:02:24.730773 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm May 27 17:02:24.731423 kernel: BTRFS info (device sda6): using free-space-tree May 27 17:02:24.751945 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 27 17:02:24.780020 initrd-setup-root[874]: cut: /sysroot/etc/passwd: No such file or directory May 27 17:02:24.783566 coreos-metadata[849]: May 27 17:02:24.783 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 May 27 17:02:24.785033 coreos-metadata[849]: May 27 17:02:24.785 INFO Fetch successful May 27 17:02:24.785852 coreos-metadata[849]: May 27 17:02:24.785 INFO wrote hostname ci-4344-0-0-0-39ed1690e8 to /sysroot/etc/hostname May 27 17:02:24.788097 initrd-setup-root[881]: cut: /sysroot/etc/group: No such file or directory May 27 17:02:24.789268 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 27 17:02:24.794412 initrd-setup-root[889]: cut: /sysroot/etc/shadow: No such file or directory May 27 17:02:24.799345 initrd-setup-root[896]: cut: /sysroot/etc/gshadow: No such file or directory May 27 17:02:24.906538 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 27 17:02:24.911644 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 27 17:02:24.913862 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 27 17:02:24.928403 kernel: BTRFS info (device sda6): last unmount of filesystem 0631e8fb-ef71-4ba1-b2b8-88386996a754 May 27 17:02:24.950756 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
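The flatcar-metadata-hostname step above fetches the hostname from the Hetzner metadata service and writes it into the target root's /etc/hostname. The sketch below mirrors only what the log reports; the real work is done by Flatcar's metadata agent, not by a script like this, and the timeout is an assumption.

```python
# Sketch of the behaviour logged above by coreos-metadata / flatcar-metadata-hostname:
# fetch the hostname from the Hetzner metadata endpoint and write it under /sysroot,
# where the real root is still mounted at this point in the boot. Illustration only.
import urllib.request

HOSTNAME_URL = "http://169.254.169.254/hetzner/v1/metadata/hostname"
TARGET = "/sysroot/etc/hostname"

with urllib.request.urlopen(HOSTNAME_URL, timeout=10) as resp:
    hostname = resp.read().decode().strip()

with open(TARGET, "w") as f:
    f.write(hostname + "\n")

print(f"wrote hostname {hostname} to {TARGET}")
```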
May 27 17:02:24.959343 ignition[965]: INFO : Ignition 2.21.0 May 27 17:02:24.959343 ignition[965]: INFO : Stage: mount May 27 17:02:24.960739 ignition[965]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 17:02:24.960739 ignition[965]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 27 17:02:24.960739 ignition[965]: INFO : mount: mount passed May 27 17:02:24.963510 ignition[965]: INFO : Ignition finished successfully May 27 17:02:24.963436 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 27 17:02:24.966439 systemd[1]: Starting ignition-files.service - Ignition (files)... May 27 17:02:25.193737 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 27 17:02:25.198570 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 17:02:25.223945 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 (8:6) scanned by mount (976) May 27 17:02:25.224017 kernel: BTRFS info (device sda6): first mount of filesystem 0631e8fb-ef71-4ba1-b2b8-88386996a754 May 27 17:02:25.224036 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm May 27 17:02:25.224786 kernel: BTRFS info (device sda6): using free-space-tree May 27 17:02:25.229775 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 27 17:02:25.261690 ignition[993]: INFO : Ignition 2.21.0 May 27 17:02:25.261690 ignition[993]: INFO : Stage: files May 27 17:02:25.262922 ignition[993]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 17:02:25.262922 ignition[993]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 27 17:02:25.262922 ignition[993]: DEBUG : files: compiled without relabeling support, skipping May 27 17:02:25.265449 ignition[993]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 27 17:02:25.265449 ignition[993]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 27 17:02:25.267378 ignition[993]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 27 17:02:25.268357 ignition[993]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 27 17:02:25.269753 ignition[993]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 27 17:02:25.269207 unknown[993]: wrote ssh authorized keys file for user: core May 27 17:02:25.273485 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" May 27 17:02:25.274942 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 May 27 17:02:25.382258 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 27 17:02:25.665715 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" May 27 17:02:25.665715 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 27 17:02:25.670636 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 27 17:02:25.670636 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 27 17:02:25.670636 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file 
"/sysroot/home/core/nginx.yaml" May 27 17:02:25.670636 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 17:02:25.670636 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 17:02:25.670636 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 17:02:25.670636 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 17:02:25.670636 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 27 17:02:25.670636 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 27 17:02:25.670636 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" May 27 17:02:25.680469 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" May 27 17:02:25.680469 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" May 27 17:02:25.680469 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1 May 27 17:02:26.081627 systemd-networkd[806]: eth0: Gained IPv6LL May 27 17:02:26.209691 systemd-networkd[806]: eth1: Gained IPv6LL May 27 17:02:26.312402 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 27 17:02:26.637887 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" May 27 17:02:26.637887 ignition[993]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 27 17:02:26.641622 ignition[993]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 17:02:26.645209 ignition[993]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 17:02:26.645209 ignition[993]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 27 17:02:26.645209 ignition[993]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" May 27 17:02:26.645209 ignition[993]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" May 27 17:02:26.645209 ignition[993]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" May 27 17:02:26.645209 ignition[993]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" May 27 17:02:26.645209 ignition[993]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" May 27 17:02:26.645209 ignition[993]: INFO : 
files: op(f): [finished] setting preset to enabled for "prepare-helm.service" May 27 17:02:26.656831 ignition[993]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" May 27 17:02:26.656831 ignition[993]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" May 27 17:02:26.656831 ignition[993]: INFO : files: files passed May 27 17:02:26.656831 ignition[993]: INFO : Ignition finished successfully May 27 17:02:26.649773 systemd[1]: Finished ignition-files.service - Ignition (files). May 27 17:02:26.654098 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 27 17:02:26.658326 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 27 17:02:26.673354 systemd[1]: ignition-quench.service: Deactivated successfully. May 27 17:02:26.673561 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 27 17:02:26.680330 initrd-setup-root-after-ignition[1023]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 27 17:02:26.680330 initrd-setup-root-after-ignition[1023]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 27 17:02:26.683718 initrd-setup-root-after-ignition[1027]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 27 17:02:26.686396 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 27 17:02:26.687443 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 27 17:02:26.689263 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 27 17:02:26.743866 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 27 17:02:26.745594 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 27 17:02:26.749438 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. May 27 17:02:26.750336 systemd[1]: Reached target initrd.target - Initrd Default Target. May 27 17:02:26.752221 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 27 17:02:26.753273 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 27 17:02:26.780135 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 27 17:02:26.782529 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 27 17:02:26.809665 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 27 17:02:26.811107 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 17:02:26.812519 systemd[1]: Stopped target timers.target - Timer Units. May 27 17:02:26.813646 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 27 17:02:26.814286 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 27 17:02:26.815938 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 27 17:02:26.817167 systemd[1]: Stopped target basic.target - Basic System. May 27 17:02:26.818345 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 27 17:02:26.819605 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 27 17:02:26.820264 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. 
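The Ignition files stage logged above stages a Kubernetes sysext image: it downloads kubernetes-v1.33.0-arm64.raw from extensions.flatcar.org into /opt/extensions and links it from /etc/extensions so systemd-sysext can merge it after switch-root. A minimal download-and-symlink sketch using the URLs and paths from the log is below; it is an illustration, not Ignition's files-stage code.

```python
# Sketch of the two files-stage operations logged above for the Kubernetes sysext:
# download the .raw image into /opt/extensions and point /etc/extensions/kubernetes.raw
# at it. URLs and paths are taken from the log; the logic is illustrative only.
import os
import urllib.request

URL   = "https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw"
IMAGE = "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
LINK  = "/sysroot/etc/extensions/kubernetes.raw"

os.makedirs(os.path.dirname(IMAGE), exist_ok=True)
urllib.request.urlretrieve(URL, IMAGE)                 # "[finished] writing file ..."

os.makedirs(os.path.dirname(LINK), exist_ok=True)
if os.path.lexists(LINK):
    os.remove(LINK)
# the link target is relative to the real root, exactly as written in the log
os.symlink("/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw", LINK)
print(f"linked {LINK} -> /opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw")
```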
May 27 17:02:26.822258 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. May 27 17:02:26.824165 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 27 17:02:26.825469 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 27 17:02:26.826546 systemd[1]: Stopped target sysinit.target - System Initialization. May 27 17:02:26.827637 systemd[1]: Stopped target local-fs.target - Local File Systems. May 27 17:02:26.828583 systemd[1]: Stopped target swap.target - Swaps. May 27 17:02:26.829412 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 27 17:02:26.829586 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 27 17:02:26.830840 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 27 17:02:26.831993 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 17:02:26.833044 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 27 17:02:26.833125 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 17:02:26.834180 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 27 17:02:26.834309 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 27 17:02:26.835889 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 27 17:02:26.836020 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 27 17:02:26.837164 systemd[1]: ignition-files.service: Deactivated successfully. May 27 17:02:26.837271 systemd[1]: Stopped ignition-files.service - Ignition (files). May 27 17:02:26.838161 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. May 27 17:02:26.838261 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 27 17:02:26.840150 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 27 17:02:26.844677 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 27 17:02:26.845189 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 27 17:02:26.846740 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 27 17:02:26.848247 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 27 17:02:26.849052 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 27 17:02:26.857787 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 27 17:02:26.857886 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 27 17:02:26.869393 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 27 17:02:26.875415 ignition[1047]: INFO : Ignition 2.21.0 May 27 17:02:26.875415 ignition[1047]: INFO : Stage: umount May 27 17:02:26.875415 ignition[1047]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 17:02:26.875415 ignition[1047]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 27 17:02:26.881042 ignition[1047]: INFO : umount: umount passed May 27 17:02:26.881042 ignition[1047]: INFO : Ignition finished successfully May 27 17:02:26.877811 systemd[1]: sysroot-boot.service: Deactivated successfully. May 27 17:02:26.879123 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 27 17:02:26.880905 systemd[1]: ignition-mount.service: Deactivated successfully. May 27 17:02:26.881018 systemd[1]: Stopped ignition-mount.service - Ignition (mount). 
May 27 17:02:26.883142 systemd[1]: ignition-disks.service: Deactivated successfully. May 27 17:02:26.883251 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 27 17:02:26.883989 systemd[1]: ignition-kargs.service: Deactivated successfully. May 27 17:02:26.884042 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 27 17:02:26.884839 systemd[1]: ignition-fetch.service: Deactivated successfully. May 27 17:02:26.884883 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 27 17:02:26.885743 systemd[1]: Stopped target network.target - Network. May 27 17:02:26.886609 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 27 17:02:26.886683 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 27 17:02:26.887566 systemd[1]: Stopped target paths.target - Path Units. May 27 17:02:26.888340 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 27 17:02:26.892457 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 17:02:26.893179 systemd[1]: Stopped target slices.target - Slice Units. May 27 17:02:26.894115 systemd[1]: Stopped target sockets.target - Socket Units. May 27 17:02:26.895044 systemd[1]: iscsid.socket: Deactivated successfully. May 27 17:02:26.895096 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 27 17:02:26.896219 systemd[1]: iscsiuio.socket: Deactivated successfully. May 27 17:02:26.896264 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 27 17:02:26.897381 systemd[1]: ignition-setup.service: Deactivated successfully. May 27 17:02:26.897504 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 27 17:02:26.898809 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 27 17:02:26.898869 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 27 17:02:26.899691 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 27 17:02:26.899742 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 27 17:02:26.900782 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 27 17:02:26.901489 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 27 17:02:26.908961 systemd[1]: systemd-resolved.service: Deactivated successfully. May 27 17:02:26.909114 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 27 17:02:26.912729 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 27 17:02:26.913054 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 27 17:02:26.913096 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 17:02:26.915354 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 27 17:02:26.918519 systemd[1]: systemd-networkd.service: Deactivated successfully. May 27 17:02:26.918634 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 27 17:02:26.922335 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 27 17:02:26.922492 systemd[1]: Stopped target network-pre.target - Preparation for Network. May 27 17:02:26.923692 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 27 17:02:26.923732 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. 
May 27 17:02:26.926063 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 27 17:02:26.928505 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 27 17:02:26.928593 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 27 17:02:26.930642 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 27 17:02:26.930707 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 27 17:02:26.932785 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 27 17:02:26.932830 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 27 17:02:26.933523 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 17:02:26.936723 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 27 17:02:26.947932 systemd[1]: systemd-udevd.service: Deactivated successfully. May 27 17:02:26.948699 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 17:02:26.949843 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 27 17:02:26.949911 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 27 17:02:26.951034 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 27 17:02:26.951065 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 27 17:02:26.952038 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 27 17:02:26.952093 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 27 17:02:26.953691 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 27 17:02:26.953739 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 27 17:02:26.954988 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 27 17:02:26.955031 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 27 17:02:26.956420 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 27 17:02:26.957232 systemd[1]: systemd-network-generator.service: Deactivated successfully. May 27 17:02:26.957285 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. May 27 17:02:26.962137 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 27 17:02:26.962219 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 17:02:26.965515 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. May 27 17:02:26.965577 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 17:02:26.968727 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 27 17:02:26.968784 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 27 17:02:26.971557 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 17:02:26.971610 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:02:26.975216 systemd[1]: network-cleanup.service: Deactivated successfully. May 27 17:02:26.976401 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 27 17:02:26.977974 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 27 17:02:26.978064 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. 
May 27 17:02:26.979868 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 27 17:02:26.984663 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 27 17:02:27.006114 systemd[1]: Switching root. May 27 17:02:27.036296 systemd-journald[244]: Journal stopped May 27 17:02:27.993655 systemd-journald[244]: Received SIGTERM from PID 1 (systemd). May 27 17:02:27.993727 kernel: SELinux: policy capability network_peer_controls=1 May 27 17:02:27.993739 kernel: SELinux: policy capability open_perms=1 May 27 17:02:27.993748 kernel: SELinux: policy capability extended_socket_class=1 May 27 17:02:27.993760 kernel: SELinux: policy capability always_check_network=0 May 27 17:02:27.993772 kernel: SELinux: policy capability cgroup_seclabel=1 May 27 17:02:27.993781 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 27 17:02:27.993790 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 27 17:02:27.993799 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 27 17:02:27.993808 kernel: SELinux: policy capability userspace_initial_context=0 May 27 17:02:27.993817 kernel: audit: type=1403 audit(1748365347.213:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 27 17:02:27.993828 systemd[1]: Successfully loaded SELinux policy in 41.898ms. May 27 17:02:27.993847 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 11.727ms. May 27 17:02:27.993860 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 27 17:02:27.993870 systemd[1]: Detected virtualization kvm. May 27 17:02:27.993881 systemd[1]: Detected architecture arm64. May 27 17:02:27.993892 systemd[1]: Detected first boot. May 27 17:02:27.993901 systemd[1]: Hostname set to . May 27 17:02:27.993911 systemd[1]: Initializing machine ID from VM UUID. May 27 17:02:27.993927 zram_generator::config[1093]: No configuration found. May 27 17:02:27.993936 kernel: NET: Registered PF_VSOCK protocol family May 27 17:02:27.993947 systemd[1]: Populated /etc with preset unit settings. May 27 17:02:27.993958 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 27 17:02:27.993968 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 27 17:02:27.993977 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 27 17:02:27.993987 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 27 17:02:27.993997 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 27 17:02:27.994007 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 27 17:02:27.994016 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 27 17:02:27.994026 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 27 17:02:27.994038 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 27 17:02:27.994050 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 27 17:02:27.994062 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 27 17:02:27.994072 systemd[1]: Created slice user.slice - User and Session Slice. 
May 27 17:02:27.994082 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 17:02:27.994092 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 17:02:27.994104 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. May 27 17:02:27.994115 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 27 17:02:27.994125 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 27 17:02:27.994135 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 27 17:02:27.994145 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... May 27 17:02:27.994157 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 17:02:27.994167 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 27 17:02:27.994176 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 27 17:02:27.994186 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 27 17:02:27.994196 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 27 17:02:27.994205 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 27 17:02:27.994215 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 17:02:27.994225 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 27 17:02:27.994235 systemd[1]: Reached target slices.target - Slice Units. May 27 17:02:27.994245 systemd[1]: Reached target swap.target - Swaps. May 27 17:02:27.994256 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 27 17:02:27.994266 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 27 17:02:27.994276 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 27 17:02:27.994288 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 27 17:02:27.994298 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 27 17:02:27.994308 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 27 17:02:27.994318 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 27 17:02:27.994328 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 27 17:02:27.994338 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 27 17:02:27.994349 systemd[1]: Mounting media.mount - External Media Directory... May 27 17:02:27.994359 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 27 17:02:27.994387 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 27 17:02:27.994397 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 27 17:02:27.994407 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 27 17:02:27.994417 systemd[1]: Reached target machines.target - Containers. May 27 17:02:27.994428 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
May 27 17:02:27.994438 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 17:02:27.994488 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 27 17:02:27.994500 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 27 17:02:27.994510 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 27 17:02:27.994520 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 27 17:02:27.994530 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 27 17:02:27.994540 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 27 17:02:27.994550 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 27 17:02:27.994560 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 27 17:02:27.994571 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 27 17:02:27.994583 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 27 17:02:27.994592 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 27 17:02:27.994602 systemd[1]: Stopped systemd-fsck-usr.service. May 27 17:02:27.994613 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 17:02:27.994623 systemd[1]: Starting systemd-journald.service - Journal Service... May 27 17:02:27.994633 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 27 17:02:27.994644 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 27 17:02:27.994654 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 27 17:02:27.994664 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 27 17:02:27.994678 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 27 17:02:27.994690 systemd[1]: verity-setup.service: Deactivated successfully. May 27 17:02:27.994700 systemd[1]: Stopped verity-setup.service. May 27 17:02:27.994712 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 27 17:02:27.994722 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 27 17:02:27.994732 systemd[1]: Mounted media.mount - External Media Directory. May 27 17:02:27.994742 kernel: fuse: init (API version 7.41) May 27 17:02:27.994755 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 27 17:02:27.994765 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 27 17:02:27.994776 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 27 17:02:27.994787 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 27 17:02:27.994797 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 27 17:02:27.994807 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 27 17:02:27.994818 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 27 17:02:27.994828 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. 
May 27 17:02:27.994838 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 27 17:02:27.994849 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 27 17:02:27.994860 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 27 17:02:27.994871 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 27 17:02:27.994881 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 27 17:02:27.994891 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 27 17:02:27.994901 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 27 17:02:27.994911 kernel: loop: module loaded May 27 17:02:27.994921 systemd[1]: Reached target network-pre.target - Preparation for Network. May 27 17:02:27.994960 systemd-journald[1164]: Collecting audit messages is disabled. May 27 17:02:27.994983 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 27 17:02:27.994996 systemd-journald[1164]: Journal started May 27 17:02:27.995019 systemd-journald[1164]: Runtime Journal (/run/log/journal/de1d5d1355d74b63b1f0b7f38938bfb8) is 8M, max 76.5M, 68.5M free. May 27 17:02:27.727531 systemd[1]: Queued start job for default target multi-user.target. May 27 17:02:27.740171 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. May 27 17:02:27.740972 systemd[1]: systemd-journald.service: Deactivated successfully. May 27 17:02:28.000543 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 27 17:02:28.005399 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 27 17:02:28.005464 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 17:02:28.012384 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 27 17:02:28.021624 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 27 17:02:28.023419 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 17:02:28.029679 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 27 17:02:28.032834 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 27 17:02:28.040460 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 27 17:02:28.040540 kernel: ACPI: bus type drm_connector registered May 27 17:02:28.049753 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 27 17:02:28.060140 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 27 17:02:28.066393 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 27 17:02:28.073456 systemd[1]: Started systemd-journald.service - Journal Service. May 27 17:02:28.071360 systemd[1]: modprobe@drm.service: Deactivated successfully. May 27 17:02:28.073874 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 27 17:02:28.075073 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 27 17:02:28.076084 systemd[1]: modprobe@loop.service: Deactivated successfully. 
May 27 17:02:28.076233 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 27 17:02:28.077206 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 27 17:02:28.078908 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 27 17:02:28.079658 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 27 17:02:28.112183 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 27 17:02:28.114538 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 27 17:02:28.120706 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 27 17:02:28.122385 kernel: loop0: detected capacity change from 0 to 107312 May 27 17:02:28.124779 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 27 17:02:28.137639 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 27 17:02:28.152769 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 27 17:02:28.156023 systemd-journald[1164]: Time spent on flushing to /var/log/journal/de1d5d1355d74b63b1f0b7f38938bfb8 is 55.540ms for 1169 entries. May 27 17:02:28.156023 systemd-journald[1164]: System Journal (/var/log/journal/de1d5d1355d74b63b1f0b7f38938bfb8) is 8M, max 584.8M, 576.8M free. May 27 17:02:28.223819 systemd-journald[1164]: Received client request to flush runtime journal. May 27 17:02:28.223859 kernel: loop1: detected capacity change from 0 to 8 May 27 17:02:28.223871 kernel: loop2: detected capacity change from 0 to 211168 May 27 17:02:28.159665 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 27 17:02:28.186723 systemd-tmpfiles[1194]: ACLs are not supported, ignoring. May 27 17:02:28.186734 systemd-tmpfiles[1194]: ACLs are not supported, ignoring. May 27 17:02:28.194570 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 27 17:02:28.201611 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 27 17:02:28.206147 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 17:02:28.209993 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 27 17:02:28.227182 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 27 17:02:28.254507 kernel: loop3: detected capacity change from 0 to 138376 May 27 17:02:28.280923 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 27 17:02:28.289558 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 27 17:02:28.305597 kernel: loop4: detected capacity change from 0 to 107312 May 27 17:02:28.322415 kernel: loop5: detected capacity change from 0 to 8 May 27 17:02:28.324399 kernel: loop6: detected capacity change from 0 to 211168 May 27 17:02:28.325693 systemd-tmpfiles[1236]: ACLs are not supported, ignoring. May 27 17:02:28.326088 systemd-tmpfiles[1236]: ACLs are not supported, ignoring. May 27 17:02:28.342420 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 17:02:28.350746 kernel: loop7: detected capacity change from 0 to 138376 May 27 17:02:28.367407 (sd-merge)[1237]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. 
May 27 17:02:28.367881 (sd-merge)[1237]: Merged extensions into '/usr'. May 27 17:02:28.376248 systemd[1]: Reload requested from client PID 1193 ('systemd-sysext') (unit systemd-sysext.service)... May 27 17:02:28.376268 systemd[1]: Reloading... May 27 17:02:28.519422 zram_generator::config[1266]: No configuration found. May 27 17:02:28.614565 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:02:28.629850 ldconfig[1186]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 27 17:02:28.690501 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 27 17:02:28.690851 systemd[1]: Reloading finished in 314 ms. May 27 17:02:28.713202 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 27 17:02:28.715592 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 27 17:02:28.724615 systemd[1]: Starting ensure-sysext.service... May 27 17:02:28.727748 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 27 17:02:28.757644 systemd[1]: Reload requested from client PID 1303 ('systemctl') (unit ensure-sysext.service)... May 27 17:02:28.757841 systemd[1]: Reloading... May 27 17:02:28.770635 systemd-tmpfiles[1304]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. May 27 17:02:28.773702 systemd-tmpfiles[1304]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. May 27 17:02:28.774093 systemd-tmpfiles[1304]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 27 17:02:28.774425 systemd-tmpfiles[1304]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 27 17:02:28.775198 systemd-tmpfiles[1304]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 27 17:02:28.777116 systemd-tmpfiles[1304]: ACLs are not supported, ignoring. May 27 17:02:28.777619 systemd-tmpfiles[1304]: ACLs are not supported, ignoring. May 27 17:02:28.783830 systemd-tmpfiles[1304]: Detected autofs mount point /boot during canonicalization of boot. May 27 17:02:28.783955 systemd-tmpfiles[1304]: Skipping /boot May 27 17:02:28.802902 systemd-tmpfiles[1304]: Detected autofs mount point /boot during canonicalization of boot. May 27 17:02:28.804151 systemd-tmpfiles[1304]: Skipping /boot May 27 17:02:28.853420 zram_generator::config[1343]: No configuration found. May 27 17:02:28.935152 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:02:29.009949 systemd[1]: Reloading finished in 251 ms. May 27 17:02:29.031412 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 27 17:02:29.037298 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 17:02:29.045561 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 27 17:02:29.048819 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 27 17:02:29.052651 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... 
May 27 17:02:29.062656 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 27 17:02:29.066705 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 17:02:29.072653 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 27 17:02:29.080019 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 17:02:29.083763 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 27 17:02:29.089824 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 27 17:02:29.099682 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 27 17:02:29.101020 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 17:02:29.101227 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 17:02:29.104064 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 27 17:02:29.114734 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 27 17:02:29.118492 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 27 17:02:29.123048 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 27 17:02:29.123246 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 27 17:02:29.137599 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 27 17:02:29.141168 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 17:02:29.145750 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 27 17:02:29.150489 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 27 17:02:29.151388 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 17:02:29.151539 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 17:02:29.165490 systemd[1]: Finished ensure-sysext.service. May 27 17:02:29.169443 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 27 17:02:29.171084 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 27 17:02:29.171266 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 27 17:02:29.173994 systemd[1]: modprobe@loop.service: Deactivated successfully. May 27 17:02:29.174186 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 27 17:02:29.177670 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 27 17:02:29.178183 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 27 17:02:29.182155 systemd-udevd[1375]: Using default interface naming scheme 'v255'. May 27 17:02:29.183274 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
May 27 17:02:29.184889 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 27 17:02:29.194991 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 27 17:02:29.195657 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 27 17:02:29.196111 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 27 17:02:29.202889 systemd[1]: modprobe@drm.service: Deactivated successfully. May 27 17:02:29.203275 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 27 17:02:29.230183 augenrules[1413]: No rules May 27 17:02:29.232048 systemd[1]: audit-rules.service: Deactivated successfully. May 27 17:02:29.234482 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 27 17:02:29.240794 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 27 17:02:29.244049 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 17:02:29.247323 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 27 17:02:29.377303 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. May 27 17:02:29.542407 kernel: mousedev: PS/2 mouse device common for all mice May 27 17:02:29.597423 systemd-networkd[1427]: lo: Link UP May 27 17:02:29.597719 systemd-networkd[1427]: lo: Gained carrier May 27 17:02:29.600584 systemd-networkd[1427]: Enumeration completed May 27 17:02:29.600835 systemd[1]: Started systemd-networkd.service - Network Configuration. May 27 17:02:29.601245 systemd-networkd[1427]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:02:29.601399 systemd-networkd[1427]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 17:02:29.602142 systemd-networkd[1427]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:02:29.602347 systemd-networkd[1427]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 17:02:29.603767 systemd-networkd[1427]: eth0: Link UP May 27 17:02:29.603968 systemd-networkd[1427]: eth0: Gained carrier May 27 17:02:29.604055 systemd-networkd[1427]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:02:29.604646 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 27 17:02:29.608705 systemd-networkd[1427]: eth1: Link UP May 27 17:02:29.609520 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 27 17:02:29.610172 systemd-networkd[1427]: eth1: Gained carrier May 27 17:02:29.610264 systemd-networkd[1427]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:02:29.639515 systemd-networkd[1427]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 May 27 17:02:29.640516 systemd-resolved[1373]: Positive Trust Anchors: May 27 17:02:29.640550 systemd-resolved[1373]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 27 17:02:29.640583 systemd-resolved[1373]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 27 17:02:29.645236 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 27 17:02:29.646736 systemd[1]: Reached target time-set.target - System Time Set. May 27 17:02:29.648914 systemd-resolved[1373]: Using system hostname 'ci-4344-0-0-0-39ed1690e8'. May 27 17:02:29.652787 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 27 17:02:29.654242 systemd[1]: Reached target network.target - Network. May 27 17:02:29.654836 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 27 17:02:29.656198 systemd[1]: Reached target sysinit.target - System Initialization. May 27 17:02:29.656895 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 27 17:02:29.657807 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 27 17:02:29.658793 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 27 17:02:29.659606 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 27 17:02:29.660671 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 27 17:02:29.661412 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 27 17:02:29.661456 systemd[1]: Reached target paths.target - Path Units. May 27 17:02:29.662204 systemd[1]: Reached target timers.target - Timer Units. May 27 17:02:29.663629 systemd-networkd[1427]: eth0: DHCPv4 address 91.99.121.210/32, gateway 172.31.1.1 acquired from 172.31.1.1 May 27 17:02:29.663846 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 27 17:02:29.666690 systemd[1]: Starting docker.socket - Docker Socket for the API... May 27 17:02:29.671998 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 27 17:02:29.672915 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 27 17:02:29.673926 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 27 17:02:29.677861 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 27 17:02:29.679598 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 27 17:02:29.682446 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 27 17:02:29.683680 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 27 17:02:29.684871 systemd[1]: Reached target sockets.target - Socket Units. May 27 17:02:29.688445 systemd[1]: Reached target basic.target - Basic System. 
May 27 17:02:29.689019 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 27 17:02:29.689049 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 27 17:02:29.692526 systemd[1]: Starting containerd.service - containerd container runtime... May 27 17:02:29.695685 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 27 17:02:29.698701 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 27 17:02:29.701396 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 27 17:02:29.707875 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 27 17:02:29.718279 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 27 17:02:29.721540 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 27 17:02:29.722890 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 27 17:02:29.727775 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 27 17:02:29.732068 jq[1486]: false May 27 17:02:29.732776 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 27 17:02:29.734974 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 27 17:02:29.740701 systemd[1]: Starting systemd-logind.service - User Login Management... May 27 17:02:29.742171 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 27 17:02:29.751971 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 27 17:02:29.756670 systemd[1]: Starting update-engine.service - Update Engine... May 27 17:02:29.761645 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 27 17:02:29.767948 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 27 17:02:29.769770 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 27 17:02:29.769978 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 27 17:02:29.773230 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 27 17:02:29.775548 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 27 17:02:29.777442 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. May 27 17:02:29.798304 coreos-metadata[1483]: May 27 17:02:29.795 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 May 27 17:02:29.798484 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. May 27 17:02:29.801652 coreos-metadata[1483]: May 27 17:02:29.800 INFO Fetch successful May 27 17:02:29.804998 coreos-metadata[1483]: May 27 17:02:29.804 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 May 27 17:02:29.804138 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. 
May 27 17:02:29.808919 coreos-metadata[1483]: May 27 17:02:29.807 INFO Fetch successful May 27 17:02:29.811006 jq[1496]: true May 27 17:02:29.811474 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 27 17:02:29.818658 extend-filesystems[1487]: Found loop4 May 27 17:02:29.823562 extend-filesystems[1487]: Found loop5 May 27 17:02:29.823562 extend-filesystems[1487]: Found loop6 May 27 17:02:29.823562 extend-filesystems[1487]: Found loop7 May 27 17:02:29.823562 extend-filesystems[1487]: Found sda May 27 17:02:29.823562 extend-filesystems[1487]: Found sda1 May 27 17:02:29.823562 extend-filesystems[1487]: Found sda2 May 27 17:02:29.823562 extend-filesystems[1487]: Found sda3 May 27 17:02:29.823562 extend-filesystems[1487]: Found usr May 27 17:02:29.823562 extend-filesystems[1487]: Found sda4 May 27 17:02:29.823562 extend-filesystems[1487]: Found sda6 May 27 17:02:29.823562 extend-filesystems[1487]: Found sda7 May 27 17:02:29.823562 extend-filesystems[1487]: Found sda9 May 27 17:02:29.857572 extend-filesystems[1487]: Checking size of /dev/sda9 May 27 17:02:29.863336 systemd[1]: motdgen.service: Deactivated successfully. May 27 17:02:29.866284 (ntainerd)[1514]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 27 17:02:29.867553 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 27 17:02:29.873581 tar[1498]: linux-arm64/LICENSE May 27 17:02:29.873581 tar[1498]: linux-arm64/helm May 27 17:02:29.881770 update_engine[1495]: I20250527 17:02:29.881594 1495 main.cc:92] Flatcar Update Engine starting May 27 17:02:29.887250 dbus-daemon[1484]: [system] SELinux support is enabled May 27 17:02:29.888417 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 27 17:02:29.894144 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 27 17:02:29.894188 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 27 17:02:29.897081 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 27 17:02:29.897109 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 27 17:02:29.902622 systemd-timesyncd[1406]: Contacted time server 185.233.107.180:123 (3.flatcar.pool.ntp.org). May 27 17:02:29.902787 systemd-timesyncd[1406]: Initial clock synchronization to Tue 2025-05-27 17:02:29.878253 UTC. May 27 17:02:29.907962 systemd[1]: Started update-engine.service - Update Engine. May 27 17:02:29.910296 update_engine[1495]: I20250527 17:02:29.910225 1495 update_check_scheduler.cc:74] Next update check in 8m4s May 27 17:02:29.915384 jq[1521]: true May 27 17:02:29.917887 extend-filesystems[1487]: Resized partition /dev/sda9 May 27 17:02:29.926238 extend-filesystems[1536]: resize2fs 1.47.2 (1-Jan-2025) May 27 17:02:29.919631 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 27 17:02:29.927964 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. 
May 27 17:02:29.948947 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks May 27 17:02:30.105447 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 May 27 17:02:30.111447 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 27 17:02:30.112962 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 27 17:02:30.114406 kernel: EXT4-fs (sda9): resized filesystem to 9393147 May 27 17:02:30.115631 systemd-logind[1494]: New seat seat0. May 27 17:02:30.119563 bash[1567]: Updated "/home/core/.ssh/authorized_keys" May 27 17:02:30.121279 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 27 17:02:30.128658 systemd[1]: Starting sshkeys.service... May 27 17:02:30.131675 systemd[1]: Started systemd-logind.service - User Login Management. May 27 17:02:30.141895 extend-filesystems[1536]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required May 27 17:02:30.141895 extend-filesystems[1536]: old_desc_blocks = 1, new_desc_blocks = 5 May 27 17:02:30.141895 extend-filesystems[1536]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. May 27 17:02:30.151880 extend-filesystems[1487]: Resized filesystem in /dev/sda9 May 27 17:02:30.151880 extend-filesystems[1487]: Found sr0 May 27 17:02:30.157906 containerd[1514]: time="2025-05-27T17:02:30Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 27 17:02:30.161382 containerd[1514]: time="2025-05-27T17:02:30.158482072Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 May 27 17:02:30.203387 kernel: [drm] features: -virgl +edid -resource_blob -host_visible May 27 17:02:30.203492 kernel: [drm] features: -context_init May 27 17:02:30.206388 containerd[1514]: time="2025-05-27T17:02:30.204667784Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.584µs" May 27 17:02:30.206388 containerd[1514]: time="2025-05-27T17:02:30.204713475Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 27 17:02:30.206388 containerd[1514]: time="2025-05-27T17:02:30.204733884Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 27 17:02:30.206388 containerd[1514]: time="2025-05-27T17:02:30.204899432Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 27 17:02:30.206388 containerd[1514]: time="2025-05-27T17:02:30.204913891Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 27 17:02:30.206388 containerd[1514]: time="2025-05-27T17:02:30.204937495Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 27 17:02:30.206388 containerd[1514]: time="2025-05-27T17:02:30.204992531Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 27 17:02:30.206388 containerd[1514]: time="2025-05-27T17:02:30.205004313Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 27 17:02:30.206388 containerd[1514]: 
time="2025-05-27T17:02:30.205261244Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 27 17:02:30.206388 containerd[1514]: time="2025-05-27T17:02:30.205277659Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 27 17:02:30.206388 containerd[1514]: time="2025-05-27T17:02:30.205297309Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 27 17:02:30.206388 containerd[1514]: time="2025-05-27T17:02:30.205306415Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 27 17:02:30.207553 containerd[1514]: time="2025-05-27T17:02:30.207508915Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 27 17:02:30.208143 containerd[1514]: time="2025-05-27T17:02:30.207752905Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 27 17:02:30.208143 containerd[1514]: time="2025-05-27T17:02:30.207796479Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 27 17:02:30.208143 containerd[1514]: time="2025-05-27T17:02:30.207820243Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 27 17:02:30.208143 containerd[1514]: time="2025-05-27T17:02:30.207860781Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 27 17:02:30.208243 containerd[1514]: time="2025-05-27T17:02:30.208183891Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 27 17:02:30.208335 containerd[1514]: time="2025-05-27T17:02:30.208308183Z" level=info msg="metadata content store policy set" policy=shared May 27 17:02:30.213374 kernel: [drm] number of scanouts: 1 May 27 17:02:30.213449 kernel: [drm] number of cap sets: 0 May 27 17:02:30.214761 containerd[1514]: time="2025-05-27T17:02:30.214694856Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 27 17:02:30.214833 containerd[1514]: time="2025-05-27T17:02:30.214786158Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 27 17:02:30.214833 containerd[1514]: time="2025-05-27T17:02:30.214801255Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 27 17:02:30.214833 containerd[1514]: time="2025-05-27T17:02:30.214813237Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 27 17:02:30.214833 containerd[1514]: time="2025-05-27T17:02:30.214826936Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 27 17:02:30.214926 containerd[1514]: time="2025-05-27T17:02:30.214840036Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 27 17:02:30.214926 containerd[1514]: time="2025-05-27T17:02:30.214852817Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 27 17:02:30.214926 containerd[1514]: time="2025-05-27T17:02:30.214865198Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 27 17:02:30.214926 containerd[1514]: time="2025-05-27T17:02:30.214877939Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 27 17:02:30.214926 containerd[1514]: time="2025-05-27T17:02:30.214888403Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 27 17:02:30.214926 containerd[1514]: time="2025-05-27T17:02:30.214897948Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 27 17:02:30.214926 containerd[1514]: time="2025-05-27T17:02:30.214917798Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 27 17:02:30.215147 containerd[1514]: time="2025-05-27T17:02:30.215061900Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 27 17:02:30.215147 containerd[1514]: time="2025-05-27T17:02:30.215088579Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 27 17:02:30.215147 containerd[1514]: time="2025-05-27T17:02:30.215114779Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 27 17:02:30.215147 containerd[1514]: time="2025-05-27T17:02:30.215129477Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 27 17:02:30.215147 containerd[1514]: time="2025-05-27T17:02:30.215141020Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 27 17:02:30.215147 containerd[1514]: time="2025-05-27T17:02:30.215151244Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 27 17:02:30.215284 containerd[1514]: time="2025-05-27T17:02:30.215162986Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 27 17:02:30.215284 containerd[1514]: time="2025-05-27T17:02:30.215173410Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 27 17:02:30.215284 containerd[1514]: time="2025-05-27T17:02:30.215184314Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 27 17:02:30.215284 containerd[1514]: time="2025-05-27T17:02:30.215194738Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 27 17:02:30.215284 containerd[1514]: time="2025-05-27T17:02:30.215204124Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 27 17:02:30.215456 containerd[1514]: time="2025-05-27T17:02:30.215416602Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 27 17:02:30.215456 containerd[1514]: time="2025-05-27T17:02:30.215439886Z" level=info msg="Start snapshots syncer" May 27 17:02:30.215502 containerd[1514]: time="2025-05-27T17:02:30.215465248Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 27 17:02:30.216062 containerd[1514]: time="2025-05-27T17:02:30.216004230Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 27 17:02:30.216203 containerd[1514]: time="2025-05-27T17:02:30.216077679Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 27 17:02:30.216203 containerd[1514]: time="2025-05-27T17:02:30.216180683Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 27 17:02:30.216352 containerd[1514]: time="2025-05-27T17:02:30.216325064Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 27 17:02:30.216395 containerd[1514]: time="2025-05-27T17:02:30.216355538Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 27 17:02:30.216414 containerd[1514]: time="2025-05-27T17:02:30.216397354Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 27 17:02:30.216444 containerd[1514]: time="2025-05-27T17:02:30.216416605Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 27 17:02:30.216444 containerd[1514]: time="2025-05-27T17:02:30.216429585Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 27 17:02:30.216444 containerd[1514]: time="2025-05-27T17:02:30.216439330Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 27 17:02:30.216492 containerd[1514]: time="2025-05-27T17:02:30.216450394Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 27 17:02:30.216492 containerd[1514]: time="2025-05-27T17:02:30.216482185Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 27 17:02:30.216523 containerd[1514]: 
time="2025-05-27T17:02:30.216493688Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 27 17:02:30.216523 containerd[1514]: time="2025-05-27T17:02:30.216504472Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 27 17:02:30.216559 containerd[1514]: time="2025-05-27T17:02:30.216541016Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 17:02:30.216576 containerd[1514]: time="2025-05-27T17:02:30.216555914Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 17:02:30.216576 containerd[1514]: time="2025-05-27T17:02:30.216564700Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 17:02:30.216609 containerd[1514]: time="2025-05-27T17:02:30.216574565Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 17:02:30.216609 containerd[1514]: time="2025-05-27T17:02:30.216582274Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 27 17:02:30.216609 containerd[1514]: time="2025-05-27T17:02:30.216592179Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 27 17:02:30.216609 containerd[1514]: time="2025-05-27T17:02:30.216602483Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 27 17:02:30.218395 containerd[1514]: time="2025-05-27T17:02:30.216679326Z" level=info msg="runtime interface created" May 27 17:02:30.218395 containerd[1514]: time="2025-05-27T17:02:30.216690030Z" level=info msg="created NRI interface" May 27 17:02:30.218395 containerd[1514]: time="2025-05-27T17:02:30.216699416Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 27 17:02:30.218395 containerd[1514]: time="2025-05-27T17:02:30.216713275Z" level=info msg="Connect containerd service" May 27 17:02:30.218395 containerd[1514]: time="2025-05-27T17:02:30.216746904Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 27 17:02:30.218395 containerd[1514]: time="2025-05-27T17:02:30.217467132Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 27 17:02:30.217415 systemd[1]: extend-filesystems.service: Deactivated successfully. May 27 17:02:30.229025 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 27 17:02:30.264053 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. May 27 17:02:30.271858 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
May 27 17:02:30.304021 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 May 27 17:02:30.370530 systemd-logind[1494]: Watching system buttons on /dev/input/event0 (Power Button) May 27 17:02:30.398666 coreos-metadata[1584]: May 27 17:02:30.398 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 May 27 17:02:30.403103 coreos-metadata[1584]: May 27 17:02:30.402 INFO Fetch successful May 27 17:02:30.405043 unknown[1584]: wrote ssh authorized keys file for user: core May 27 17:02:30.473467 update-ssh-keys[1597]: Updated "/home/core/.ssh/authorized_keys" May 27 17:02:30.470912 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). May 27 17:02:30.476227 systemd[1]: Finished sshkeys.service. May 27 17:02:30.486677 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 17:02:30.487343 containerd[1514]: time="2025-05-27T17:02:30.487176555Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 27 17:02:30.487343 containerd[1514]: time="2025-05-27T17:02:30.487240897Z" level=info msg=serving... address=/run/containerd/containerd.sock May 27 17:02:30.487343 containerd[1514]: time="2025-05-27T17:02:30.487263703Z" level=info msg="Start subscribing containerd event" May 27 17:02:30.487343 containerd[1514]: time="2025-05-27T17:02:30.487300407Z" level=info msg="Start recovering state" May 27 17:02:30.488296 containerd[1514]: time="2025-05-27T17:02:30.488262708Z" level=info msg="Start event monitor" May 27 17:02:30.488296 containerd[1514]: time="2025-05-27T17:02:30.488294939Z" level=info msg="Start cni network conf syncer for default" May 27 17:02:30.488459 containerd[1514]: time="2025-05-27T17:02:30.488304924Z" level=info msg="Start streaming server" May 27 17:02:30.488459 containerd[1514]: time="2025-05-27T17:02:30.488313870Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 27 17:02:30.488459 containerd[1514]: time="2025-05-27T17:02:30.488320859Z" level=info msg="runtime interface starting up..." May 27 17:02:30.488459 containerd[1514]: time="2025-05-27T17:02:30.488326331Z" level=info msg="starting plugins..." May 27 17:02:30.488459 containerd[1514]: time="2025-05-27T17:02:30.488341069Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 27 17:02:30.493119 containerd[1514]: time="2025-05-27T17:02:30.491337324Z" level=info msg="containerd successfully booted in 0.335792s" May 27 17:02:30.491924 systemd[1]: Started containerd.service - containerd container runtime. May 27 17:02:30.514669 systemd-logind[1494]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) May 27 17:02:30.610686 locksmithd[1535]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 27 17:02:30.656854 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:02:30.879751 tar[1498]: linux-arm64/README.md May 27 17:02:30.898781 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 27 17:02:31.009541 systemd-networkd[1427]: eth0: Gained IPv6LL May 27 17:02:31.012630 systemd-networkd[1427]: eth1: Gained IPv6LL May 27 17:02:31.016029 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 27 17:02:31.017841 systemd[1]: Reached target network-online.target - Network is Online. May 27 17:02:31.021569 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
May 27 17:02:31.025723 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 27 17:02:31.080544 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 27 17:02:31.104242 sshd_keygen[1513]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 27 17:02:31.139449 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 27 17:02:31.144336 systemd[1]: Starting issuegen.service - Generate /run/issue... May 27 17:02:31.171210 systemd[1]: issuegen.service: Deactivated successfully. May 27 17:02:31.172195 systemd[1]: Finished issuegen.service - Generate /run/issue. May 27 17:02:31.177866 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 27 17:02:31.202867 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 27 17:02:31.207750 systemd[1]: Started getty@tty1.service - Getty on tty1. May 27 17:02:31.211708 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. May 27 17:02:31.212588 systemd[1]: Reached target getty.target - Login Prompts. May 27 17:02:31.906669 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:02:31.908347 systemd[1]: Reached target multi-user.target - Multi-User System. May 27 17:02:31.910970 systemd[1]: Startup finished in 2.426s (kernel) + 5.609s (initrd) + 4.737s (userspace) = 12.773s. May 27 17:02:31.923662 (kubelet)[1648]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:02:32.461458 kubelet[1648]: E0527 17:02:32.461390 1648 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:02:32.463931 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:02:32.464065 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:02:32.464449 systemd[1]: kubelet.service: Consumed 920ms CPU time, 259.4M memory peak. May 27 17:02:42.715749 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 27 17:02:42.719034 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:02:42.879156 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:02:42.885877 (kubelet)[1667]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:02:42.926609 kubelet[1667]: E0527 17:02:42.926562 1667 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:02:42.931759 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:02:42.931962 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:02:42.933538 systemd[1]: kubelet.service: Consumed 162ms CPU time, 107.2M memory peak. May 27 17:02:53.182926 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 27 17:02:53.186470 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
May 27 17:02:53.354458 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:02:53.362948 (kubelet)[1681]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:02:53.417666 kubelet[1681]: E0527 17:02:53.417591 1681 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:02:53.420782 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:02:53.421040 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:02:53.421655 systemd[1]: kubelet.service: Consumed 174ms CPU time, 107.4M memory peak. May 27 17:03:03.541272 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 27 17:03:03.545024 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:03:03.712510 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:03:03.723966 (kubelet)[1696]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:03:03.779604 kubelet[1696]: E0527 17:03:03.779518 1696 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:03:03.783098 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:03:03.783242 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:03:03.783935 systemd[1]: kubelet.service: Consumed 178ms CPU time, 106.3M memory peak. May 27 17:03:13.791183 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. May 27 17:03:13.794423 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:03:13.979674 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:03:13.992921 (kubelet)[1711]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:03:14.036105 kubelet[1711]: E0527 17:03:14.036038 1711 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:03:14.040086 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:03:14.040581 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:03:14.043457 systemd[1]: kubelet.service: Consumed 172ms CPU time, 104.7M memory peak. May 27 17:03:15.171763 update_engine[1495]: I20250527 17:03:15.171612 1495 update_attempter.cc:509] Updating boot flags... May 27 17:03:24.290591 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. May 27 17:03:24.292829 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
May 27 17:03:24.485058 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:03:24.501006 (kubelet)[1746]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:03:24.542435 kubelet[1746]: E0527 17:03:24.542243 1746 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:03:24.545087 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:03:24.545341 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:03:24.546010 systemd[1]: kubelet.service: Consumed 176ms CPU time, 104.6M memory peak. May 27 17:03:34.791023 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. May 27 17:03:34.793467 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:03:34.977922 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:03:34.994066 (kubelet)[1761]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:03:35.042510 kubelet[1761]: E0527 17:03:35.042355 1761 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:03:35.045352 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:03:35.045649 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:03:35.046170 systemd[1]: kubelet.service: Consumed 182ms CPU time, 104.6M memory peak. May 27 17:03:45.291315 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. May 27 17:03:45.295136 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:03:45.451652 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:03:45.463970 (kubelet)[1776]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:03:45.505697 kubelet[1776]: E0527 17:03:45.505628 1776 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:03:45.509227 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:03:45.509589 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:03:45.510265 systemd[1]: kubelet.service: Consumed 163ms CPU time, 104.6M memory peak. May 27 17:03:55.541573 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8. May 27 17:03:55.545029 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:03:55.724976 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 27 17:03:55.738976 (kubelet)[1791]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:03:55.779825 kubelet[1791]: E0527 17:03:55.779755 1791 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:03:55.782814 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:03:55.783091 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:03:55.783654 systemd[1]: kubelet.service: Consumed 167ms CPU time, 107M memory peak. May 27 17:04:05.790740 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9. May 27 17:04:05.793559 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:04:05.976615 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:04:05.984909 (kubelet)[1804]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:04:06.026073 kubelet[1804]: E0527 17:04:06.026012 1804 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:04:06.028772 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:04:06.028904 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:04:06.029471 systemd[1]: kubelet.service: Consumed 159ms CPU time, 106.1M memory peak. May 27 17:04:10.200480 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 27 17:04:10.203811 systemd[1]: Started sshd@0-91.99.121.210:22-139.178.89.65:56878.service - OpenSSH per-connection server daemon (139.178.89.65:56878). May 27 17:04:11.227088 sshd[1812]: Accepted publickey for core from 139.178.89.65 port 56878 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:04:11.229712 sshd-session[1812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:04:11.243953 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 27 17:04:11.248682 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 27 17:04:11.251254 systemd-logind[1494]: New session 1 of user core. May 27 17:04:11.281421 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 27 17:04:11.284761 systemd[1]: Starting user@500.service - User Manager for UID 500... May 27 17:04:11.301944 (systemd)[1816]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 27 17:04:11.305270 systemd-logind[1494]: New session c1 of user core. May 27 17:04:11.453907 systemd[1816]: Queued start job for default target default.target. May 27 17:04:11.463605 systemd[1816]: Created slice app.slice - User Application Slice. May 27 17:04:11.463685 systemd[1816]: Reached target paths.target - Paths. May 27 17:04:11.463758 systemd[1816]: Reached target timers.target - Timers. 
May 27 17:04:11.466328 systemd[1816]: Starting dbus.socket - D-Bus User Message Bus Socket... May 27 17:04:11.506763 systemd[1816]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 27 17:04:11.507135 systemd[1816]: Reached target sockets.target - Sockets. May 27 17:04:11.507381 systemd[1816]: Reached target basic.target - Basic System. May 27 17:04:11.507587 systemd[1816]: Reached target default.target - Main User Target. May 27 17:04:11.507683 systemd[1]: Started user@500.service - User Manager for UID 500. May 27 17:04:11.507820 systemd[1816]: Startup finished in 194ms. May 27 17:04:11.515701 systemd[1]: Started session-1.scope - Session 1 of User core. May 27 17:04:12.222083 systemd[1]: Started sshd@1-91.99.121.210:22-139.178.89.65:56880.service - OpenSSH per-connection server daemon (139.178.89.65:56880). May 27 17:04:13.244966 sshd[1827]: Accepted publickey for core from 139.178.89.65 port 56880 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:04:13.247089 sshd-session[1827]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:04:13.252591 systemd-logind[1494]: New session 2 of user core. May 27 17:04:13.261475 systemd[1]: Started session-2.scope - Session 2 of User core. May 27 17:04:13.943884 sshd[1829]: Connection closed by 139.178.89.65 port 56880 May 27 17:04:13.944803 sshd-session[1827]: pam_unix(sshd:session): session closed for user core May 27 17:04:13.950087 systemd[1]: sshd@1-91.99.121.210:22-139.178.89.65:56880.service: Deactivated successfully. May 27 17:04:13.953228 systemd[1]: session-2.scope: Deactivated successfully. May 27 17:04:13.954630 systemd-logind[1494]: Session 2 logged out. Waiting for processes to exit. May 27 17:04:13.956768 systemd-logind[1494]: Removed session 2. May 27 17:04:14.117190 systemd[1]: Started sshd@2-91.99.121.210:22-139.178.89.65:35664.service - OpenSSH per-connection server daemon (139.178.89.65:35664). May 27 17:04:15.124616 sshd[1835]: Accepted publickey for core from 139.178.89.65 port 35664 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:04:15.127214 sshd-session[1835]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:04:15.133399 systemd-logind[1494]: New session 3 of user core. May 27 17:04:15.144873 systemd[1]: Started session-3.scope - Session 3 of User core. May 27 17:04:15.808530 sshd[1837]: Connection closed by 139.178.89.65 port 35664 May 27 17:04:15.809648 sshd-session[1835]: pam_unix(sshd:session): session closed for user core May 27 17:04:15.815811 systemd[1]: sshd@2-91.99.121.210:22-139.178.89.65:35664.service: Deactivated successfully. May 27 17:04:15.819817 systemd[1]: session-3.scope: Deactivated successfully. May 27 17:04:15.821041 systemd-logind[1494]: Session 3 logged out. Waiting for processes to exit. May 27 17:04:15.824633 systemd-logind[1494]: Removed session 3. May 27 17:04:15.986078 systemd[1]: Started sshd@3-91.99.121.210:22-139.178.89.65:35674.service - OpenSSH per-connection server daemon (139.178.89.65:35674). May 27 17:04:16.041044 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10. May 27 17:04:16.043561 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:04:16.249704 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 27 17:04:16.257988 (kubelet)[1853]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:04:16.301859 kubelet[1853]: E0527 17:04:16.301809 1853 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:04:16.305700 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:04:16.305854 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:04:16.306540 systemd[1]: kubelet.service: Consumed 174ms CPU time, 106.8M memory peak. May 27 17:04:16.990198 sshd[1843]: Accepted publickey for core from 139.178.89.65 port 35674 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:04:16.992824 sshd-session[1843]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:04:17.000707 systemd-logind[1494]: New session 4 of user core. May 27 17:04:17.008701 systemd[1]: Started session-4.scope - Session 4 of User core. May 27 17:04:17.675597 sshd[1860]: Connection closed by 139.178.89.65 port 35674 May 27 17:04:17.676456 sshd-session[1843]: pam_unix(sshd:session): session closed for user core May 27 17:04:17.681078 systemd[1]: sshd@3-91.99.121.210:22-139.178.89.65:35674.service: Deactivated successfully. May 27 17:04:17.683138 systemd[1]: session-4.scope: Deactivated successfully. May 27 17:04:17.685327 systemd-logind[1494]: Session 4 logged out. Waiting for processes to exit. May 27 17:04:17.687819 systemd-logind[1494]: Removed session 4. May 27 17:04:17.850880 systemd[1]: Started sshd@4-91.99.121.210:22-139.178.89.65:35686.service - OpenSSH per-connection server daemon (139.178.89.65:35686). May 27 17:04:18.853740 sshd[1866]: Accepted publickey for core from 139.178.89.65 port 35686 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:04:18.855951 sshd-session[1866]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:04:18.861291 systemd-logind[1494]: New session 5 of user core. May 27 17:04:18.876735 systemd[1]: Started session-5.scope - Session 5 of User core. May 27 17:04:19.382257 sudo[1869]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 27 17:04:19.382557 sudo[1869]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 17:04:19.400123 sudo[1869]: pam_unix(sudo:session): session closed for user root May 27 17:04:19.559098 sshd[1868]: Connection closed by 139.178.89.65 port 35686 May 27 17:04:19.560519 sshd-session[1866]: pam_unix(sshd:session): session closed for user core May 27 17:04:19.566629 systemd[1]: sshd@4-91.99.121.210:22-139.178.89.65:35686.service: Deactivated successfully. May 27 17:04:19.570053 systemd[1]: session-5.scope: Deactivated successfully. May 27 17:04:19.572495 systemd-logind[1494]: Session 5 logged out. Waiting for processes to exit. May 27 17:04:19.574815 systemd-logind[1494]: Removed session 5. May 27 17:04:19.734922 systemd[1]: Started sshd@5-91.99.121.210:22-139.178.89.65:35692.service - OpenSSH per-connection server daemon (139.178.89.65:35692). 
May 27 17:04:20.740866 sshd[1875]: Accepted publickey for core from 139.178.89.65 port 35692 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:04:20.745086 sshd-session[1875]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:04:20.750830 systemd-logind[1494]: New session 6 of user core. May 27 17:04:20.760782 systemd[1]: Started session-6.scope - Session 6 of User core. May 27 17:04:21.262937 sudo[1879]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 27 17:04:21.263887 sudo[1879]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 17:04:21.269823 sudo[1879]: pam_unix(sudo:session): session closed for user root May 27 17:04:21.276529 sudo[1878]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 27 17:04:21.276819 sudo[1878]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 17:04:21.291140 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 27 17:04:21.349299 augenrules[1901]: No rules May 27 17:04:21.350917 systemd[1]: audit-rules.service: Deactivated successfully. May 27 17:04:21.352451 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 27 17:04:21.355489 sudo[1878]: pam_unix(sudo:session): session closed for user root May 27 17:04:21.515108 sshd[1877]: Connection closed by 139.178.89.65 port 35692 May 27 17:04:21.515617 sshd-session[1875]: pam_unix(sshd:session): session closed for user core May 27 17:04:21.519581 systemd-logind[1494]: Session 6 logged out. Waiting for processes to exit. May 27 17:04:21.519689 systemd[1]: sshd@5-91.99.121.210:22-139.178.89.65:35692.service: Deactivated successfully. May 27 17:04:21.522198 systemd[1]: session-6.scope: Deactivated successfully. May 27 17:04:21.524764 systemd-logind[1494]: Removed session 6. May 27 17:04:21.686432 systemd[1]: Started sshd@6-91.99.121.210:22-139.178.89.65:35706.service - OpenSSH per-connection server daemon (139.178.89.65:35706). May 27 17:04:22.692281 sshd[1910]: Accepted publickey for core from 139.178.89.65 port 35706 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:04:22.694190 sshd-session[1910]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:04:22.699564 systemd-logind[1494]: New session 7 of user core. May 27 17:04:22.707700 systemd[1]: Started session-7.scope - Session 7 of User core. May 27 17:04:23.220558 sudo[1913]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 27 17:04:23.220840 sudo[1913]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 17:04:23.576795 systemd[1]: Starting docker.service - Docker Application Container Engine... May 27 17:04:23.589544 (dockerd)[1931]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 27 17:04:23.839344 dockerd[1931]: time="2025-05-27T17:04:23.838815485Z" level=info msg="Starting up" May 27 17:04:23.842204 dockerd[1931]: time="2025-05-27T17:04:23.842147163Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 27 17:04:23.894063 dockerd[1931]: time="2025-05-27T17:04:23.893726297Z" level=info msg="Loading containers: start." 
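[Editor's note] Sessions 1 through 7 above all follow the same pattern: publickey accept, pam_unix session open, a few sudo invocations, session close. As a hedged illustration (not something run on this host), the open/close pam_unix lines can be paired by sshd-session PID to measure how long each session lasted; the regular expressions assume exactly the journal line format shown in this log.

# Illustrative log analysis: pair "session opened"/"session closed" pam_unix
# lines by sshd-session PID and report each session's duration in seconds.
import re
from datetime import datetime

TS = r"(\w{3} +\d{1,2} \d{2}:\d{2}:\d{2}\.\d+)"   # e.g. "May 27 17:04:20.745086"
OPENED = re.compile(TS + r" sshd-session\[(\d+)\]: pam_unix\(sshd:session\): session opened")
CLOSED = re.compile(TS + r" sshd-session\[(\d+)\]: pam_unix\(sshd:session\): session closed")

def session_durations(lines, year=2025):
    """Yield (pid, seconds) for every matched open/close pair in the journal."""
    parse = lambda ts: datetime.strptime(f"{year} {ts}", "%Y %b %d %H:%M:%S.%f")
    opened_at = {}
    for line in lines:
        if m := OPENED.search(line):
            opened_at[m.group(2)] = parse(m.group(1))
        elif (m := CLOSED.search(line)) and m.group(2) in opened_at:
            yield m.group(2), (parse(m.group(1)) - opened_at.pop(m.group(2))).total_seconds()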
May 27 17:04:23.906409 kernel: Initializing XFRM netlink socket May 27 17:04:24.137462 systemd-networkd[1427]: docker0: Link UP May 27 17:04:24.143739 dockerd[1931]: time="2025-05-27T17:04:24.143637133Z" level=info msg="Loading containers: done." May 27 17:04:24.162202 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck612225240-merged.mount: Deactivated successfully. May 27 17:04:24.164683 dockerd[1931]: time="2025-05-27T17:04:24.164620163Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 27 17:04:24.164792 dockerd[1931]: time="2025-05-27T17:04:24.164718842Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 27 17:04:24.164926 dockerd[1931]: time="2025-05-27T17:04:24.164841922Z" level=info msg="Initializing buildkit" May 27 17:04:24.191016 dockerd[1931]: time="2025-05-27T17:04:24.190959630Z" level=info msg="Completed buildkit initialization" May 27 17:04:24.202048 dockerd[1931]: time="2025-05-27T17:04:24.201958424Z" level=info msg="Daemon has completed initialization" May 27 17:04:24.202630 dockerd[1931]: time="2025-05-27T17:04:24.202479744Z" level=info msg="API listen on /run/docker.sock" May 27 17:04:24.203309 systemd[1]: Started docker.service - Docker Application Container Engine. May 27 17:04:25.070272 containerd[1514]: time="2025-05-27T17:04:25.070201997Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.1\"" May 27 17:04:25.718655 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1800507479.mount: Deactivated successfully. May 27 17:04:26.541097 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11. May 27 17:04:26.545163 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:04:26.709491 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:04:26.721731 (kubelet)[2198]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:04:26.779034 kubelet[2198]: E0527 17:04:26.778968 2198 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:04:26.781711 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:04:26.781849 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:04:26.784477 systemd[1]: kubelet.service: Consumed 165ms CPU time, 107M memory peak. 
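[Editor's note] dockerd reports above that its API is listening on /run/docker.sock. As a quick liveness probe (illustrative only: GET /_ping is a standard Docker Engine endpoint, but nothing in the log shows this being run), one can speak HTTP directly over that Unix socket.

# Illustrative: minimal HTTP-over-Unix-socket ping of the Docker daemon whose
# startup is logged above. Requires root or membership in the docker group.
import socket

def docker_ping(sock_path: str = "/run/docker.sock") -> str:
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as s:
        s.connect(sock_path)
        s.sendall(b"GET /_ping HTTP/1.0\r\nHost: docker\r\n\r\n")
        chunks = []
        while data := s.recv(4096):
            chunks.append(data)
    return b"".join(chunks).decode()

if __name__ == "__main__":
    print(docker_ping())   # a healthy daemon answers HTTP 200 with body "OK"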
May 27 17:04:27.207582 containerd[1514]: time="2025-05-27T17:04:27.207525176Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:27.211630 containerd[1514]: time="2025-05-27T17:04:27.211586814Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.1: active requests=0, bytes read=27349442" May 27 17:04:27.213733 containerd[1514]: time="2025-05-27T17:04:27.213648253Z" level=info msg="ImageCreate event name:\"sha256:9a2b7cf4f8540534c6ec5b758462c6d7885c6e734652172078bba899c0e3089a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:27.223405 containerd[1514]: time="2025-05-27T17:04:27.222913729Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:d8ae2fb01c39aa1c7add84f3d54425cf081c24c11e3946830292a8cfa4293548\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:27.226180 containerd[1514]: time="2025-05-27T17:04:27.226132288Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.1\" with image id \"sha256:9a2b7cf4f8540534c6ec5b758462c6d7885c6e734652172078bba899c0e3089a\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.1\", repo digest \"registry.k8s.io/kube-apiserver@sha256:d8ae2fb01c39aa1c7add84f3d54425cf081c24c11e3946830292a8cfa4293548\", size \"27346150\" in 2.155874491s" May 27 17:04:27.226393 containerd[1514]: time="2025-05-27T17:04:27.226343967Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.1\" returns image reference \"sha256:9a2b7cf4f8540534c6ec5b758462c6d7885c6e734652172078bba899c0e3089a\"" May 27 17:04:27.228389 containerd[1514]: time="2025-05-27T17:04:27.228347087Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.1\"" May 27 17:04:29.768646 containerd[1514]: time="2025-05-27T17:04:29.768524247Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:29.770123 containerd[1514]: time="2025-05-27T17:04:29.769879446Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.1: active requests=0, bytes read=23531755" May 27 17:04:29.770907 containerd[1514]: time="2025-05-27T17:04:29.770869326Z" level=info msg="ImageCreate event name:\"sha256:674996a72aa5900cbbbcd410437021fa4c62a7f829a56f58eb23ac430f2ae383\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:29.773843 containerd[1514]: time="2025-05-27T17:04:29.773799085Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:7c9bea694e3a3c01ed6a5ee02d55a6124cc08e0b2eec6caa33f2c396b8cbc3f8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:29.775640 containerd[1514]: time="2025-05-27T17:04:29.775378644Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.1\" with image id \"sha256:674996a72aa5900cbbbcd410437021fa4c62a7f829a56f58eb23ac430f2ae383\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.1\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:7c9bea694e3a3c01ed6a5ee02d55a6124cc08e0b2eec6caa33f2c396b8cbc3f8\", size \"25086427\" in 2.546965717s" May 27 17:04:29.775640 containerd[1514]: time="2025-05-27T17:04:29.775426484Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.1\" returns image reference \"sha256:674996a72aa5900cbbbcd410437021fa4c62a7f829a56f58eb23ac430f2ae383\"" May 27 17:04:29.778543 
containerd[1514]: time="2025-05-27T17:04:29.778504443Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.1\"" May 27 17:04:31.457701 containerd[1514]: time="2025-05-27T17:04:31.457607266Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:31.459975 containerd[1514]: time="2025-05-27T17:04:31.459910025Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.1: active requests=0, bytes read=18293751" May 27 17:04:31.462350 containerd[1514]: time="2025-05-27T17:04:31.460699305Z" level=info msg="ImageCreate event name:\"sha256:014094c90caacf743dc5fb4281363492da1df31cd8218aeceab3be3326277d2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:31.463712 containerd[1514]: time="2025-05-27T17:04:31.463668624Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:395b7de7cdbdcc3c3a3db270844a3f71d757e2447a1e4db76b4cce46fba7fd55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:31.464891 containerd[1514]: time="2025-05-27T17:04:31.464859423Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.1\" with image id \"sha256:014094c90caacf743dc5fb4281363492da1df31cd8218aeceab3be3326277d2e\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.1\", repo digest \"registry.k8s.io/kube-scheduler@sha256:395b7de7cdbdcc3c3a3db270844a3f71d757e2447a1e4db76b4cce46fba7fd55\", size \"19848441\" in 1.68629346s" May 27 17:04:31.464993 containerd[1514]: time="2025-05-27T17:04:31.464979183Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.1\" returns image reference \"sha256:014094c90caacf743dc5fb4281363492da1df31cd8218aeceab3be3326277d2e\"" May 27 17:04:31.465794 containerd[1514]: time="2025-05-27T17:04:31.465765183Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.1\"" May 27 17:04:32.656952 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1984924674.mount: Deactivated successfully. 
May 27 17:04:33.024599 containerd[1514]: time="2025-05-27T17:04:33.024464838Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:33.026180 containerd[1514]: time="2025-05-27T17:04:33.026117078Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.1: active requests=0, bytes read=28196030" May 27 17:04:33.027613 containerd[1514]: time="2025-05-27T17:04:33.027492917Z" level=info msg="ImageCreate event name:\"sha256:3e58848989f556e36aa29d7852ab1712163960651e074d11cae9d31fb27192db\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:33.033478 containerd[1514]: time="2025-05-27T17:04:33.033436755Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7ddf379897139ae8ade8b33cb9373b70c632a4d5491da6e234f5d830e0a50807\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:33.034852 containerd[1514]: time="2025-05-27T17:04:33.034779474Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.1\" with image id \"sha256:3e58848989f556e36aa29d7852ab1712163960651e074d11cae9d31fb27192db\", repo tag \"registry.k8s.io/kube-proxy:v1.33.1\", repo digest \"registry.k8s.io/kube-proxy@sha256:7ddf379897139ae8ade8b33cb9373b70c632a4d5491da6e234f5d830e0a50807\", size \"28195023\" in 1.568979051s" May 27 17:04:33.034852 containerd[1514]: time="2025-05-27T17:04:33.034838674Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.1\" returns image reference \"sha256:3e58848989f556e36aa29d7852ab1712163960651e074d11cae9d31fb27192db\"" May 27 17:04:33.035421 containerd[1514]: time="2025-05-27T17:04:33.035398874Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" May 27 17:04:33.603668 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3062187836.mount: Deactivated successfully. 
May 27 17:04:34.563539 containerd[1514]: time="2025-05-27T17:04:34.563493122Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:34.565576 containerd[1514]: time="2025-05-27T17:04:34.565526561Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152209" May 27 17:04:34.567395 containerd[1514]: time="2025-05-27T17:04:34.567027480Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:34.571664 containerd[1514]: time="2025-05-27T17:04:34.571587039Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:34.572598 containerd[1514]: time="2025-05-27T17:04:34.572546838Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.537115684s" May 27 17:04:34.572741 containerd[1514]: time="2025-05-27T17:04:34.572719278Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" May 27 17:04:34.574803 containerd[1514]: time="2025-05-27T17:04:34.574479717Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 27 17:04:35.051657 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1526175309.mount: Deactivated successfully. 
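[Editor's note] Each successful pull above reports a byte size and an elapsed time (for example kube-apiserver: 27,346,150 bytes in 2.155874491 s). Purely as an illustration of reading those figures back out of the journal, the sketch below extracts the size/duration pairs and derives an effective throughput; the regular expression assumes the exact containerd message format shown here, with or without the escaped quotes.

# Illustrative: derive effective pull throughput (MiB/s) from the
# 'Pulled image ... size "N" in Xs' containerd messages in this log.
import re

PULLED = re.compile(
    r'Pulled image \\?"(?P<image>[^"\\]+)\\?".*?'
    r'size \\?"(?P<size>\d+)\\?" in (?P<dur>[\d.]+)(?P<unit>ms|s)'
)

def pull_rates(journal_text: str):
    """Yield (image, MiB_per_second) for each completed pull message found."""
    for m in PULLED.finditer(journal_text):
        seconds = float(m.group("dur")) / (1000.0 if m.group("unit") == "ms" else 1.0)
        yield m.group("image"), int(m.group("size")) / (1024 * 1024) / seconds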
May 27 17:04:35.058206 containerd[1514]: time="2025-05-27T17:04:35.057540560Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 17:04:35.058411 containerd[1514]: time="2025-05-27T17:04:35.058390440Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" May 27 17:04:35.058931 containerd[1514]: time="2025-05-27T17:04:35.058906680Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 17:04:35.061474 containerd[1514]: time="2025-05-27T17:04:35.061430239Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 17:04:35.062485 containerd[1514]: time="2025-05-27T17:04:35.062451718Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 487.921361ms" May 27 17:04:35.062485 containerd[1514]: time="2025-05-27T17:04:35.062485558Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" May 27 17:04:35.062984 containerd[1514]: time="2025-05-27T17:04:35.062955118Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" May 27 17:04:36.790918 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12. May 27 17:04:36.793444 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:04:36.964447 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:04:36.977284 (kubelet)[2289]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:04:37.033415 kubelet[2289]: E0527 17:04:37.033165 2289 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:04:37.035797 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:04:37.035974 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:04:37.036349 systemd[1]: kubelet.service: Consumed 179ms CPU time, 105.2M memory peak. 
May 27 17:04:38.756616 containerd[1514]: time="2025-05-27T17:04:38.756481269Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:38.759057 containerd[1514]: time="2025-05-27T17:04:38.758528988Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=69230195" May 27 17:04:38.760132 containerd[1514]: time="2025-05-27T17:04:38.760066867Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:38.764444 containerd[1514]: time="2025-05-27T17:04:38.764384306Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:04:38.766219 containerd[1514]: time="2025-05-27T17:04:38.766171945Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 3.703183027s" May 27 17:04:38.766389 containerd[1514]: time="2025-05-27T17:04:38.766355905Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" May 27 17:04:44.001528 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:04:44.001823 systemd[1]: kubelet.service: Consumed 179ms CPU time, 105.2M memory peak. May 27 17:04:44.005819 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:04:44.042433 systemd[1]: Reload requested from client PID 2331 ('systemctl') (unit session-7.scope)... May 27 17:04:44.042456 systemd[1]: Reloading... May 27 17:04:44.193511 zram_generator::config[2375]: No configuration found. May 27 17:04:44.281586 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:04:44.391630 systemd[1]: Reloading finished in 348 ms. May 27 17:04:44.450179 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 27 17:04:44.450671 systemd[1]: kubelet.service: Failed with result 'signal'. May 27 17:04:44.451381 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:04:44.451650 systemd[1]: kubelet.service: Consumed 110ms CPU time, 94.8M memory peak. May 27 17:04:44.455909 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:04:44.634430 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:04:44.654980 (kubelet)[2423]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 17:04:44.699068 kubelet[2423]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 17:04:44.699502 kubelet[2423]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. 
Image garbage collector will get sandbox image information from CRI. May 27 17:04:44.699502 kubelet[2423]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 17:04:44.699502 kubelet[2423]: I0527 17:04:44.699475 2423 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 17:04:44.984929 kubelet[2423]: I0527 17:04:44.984774 2423 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" May 27 17:04:44.984929 kubelet[2423]: I0527 17:04:44.984825 2423 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 17:04:44.985511 kubelet[2423]: I0527 17:04:44.985230 2423 server.go:956] "Client rotation is on, will bootstrap in background" May 27 17:04:45.019347 kubelet[2423]: E0527 17:04:45.019281 2423 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://91.99.121.210:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 91.99.121.210:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" May 27 17:04:45.021218 kubelet[2423]: I0527 17:04:45.021047 2423 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 17:04:45.033500 kubelet[2423]: I0527 17:04:45.033455 2423 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 17:04:45.040164 kubelet[2423]: I0527 17:04:45.040130 2423 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 27 17:04:45.041718 kubelet[2423]: I0527 17:04:45.041645 2423 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 17:04:45.041964 kubelet[2423]: I0527 17:04:45.041703 2423 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344-0-0-0-39ed1690e8","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 17:04:45.042058 kubelet[2423]: I0527 17:04:45.042017 2423 topology_manager.go:138] "Creating topology manager with none policy" May 27 17:04:45.042058 kubelet[2423]: I0527 17:04:45.042029 2423 container_manager_linux.go:303] "Creating device plugin manager" May 27 17:04:45.042257 kubelet[2423]: I0527 17:04:45.042225 2423 state_mem.go:36] "Initialized new in-memory state store" May 27 17:04:45.045716 kubelet[2423]: I0527 17:04:45.045533 2423 kubelet.go:480] "Attempting to sync node with API server" May 27 17:04:45.045716 kubelet[2423]: I0527 17:04:45.045563 2423 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 17:04:45.045716 kubelet[2423]: I0527 17:04:45.045587 2423 kubelet.go:386] "Adding apiserver pod source" May 27 17:04:45.048329 kubelet[2423]: I0527 17:04:45.047835 2423 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 17:04:45.051729 kubelet[2423]: E0527 17:04:45.051693 2423 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://91.99.121.210:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344-0-0-0-39ed1690e8&limit=500&resourceVersion=0\": dial tcp 91.99.121.210:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" May 27 17:04:45.052274 kubelet[2423]: I0527 17:04:45.052255 2423 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 17:04:45.053134 kubelet[2423]: I0527 17:04:45.053108 2423 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection 
featuregate is disabled" May 27 17:04:45.053331 kubelet[2423]: W0527 17:04:45.053320 2423 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 27 17:04:45.057603 kubelet[2423]: E0527 17:04:45.057464 2423 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://91.99.121.210:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 91.99.121.210:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" May 27 17:04:45.057603 kubelet[2423]: I0527 17:04:45.057563 2423 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 27 17:04:45.058333 kubelet[2423]: I0527 17:04:45.057814 2423 server.go:1289] "Started kubelet" May 27 17:04:45.061038 kubelet[2423]: I0527 17:04:45.061006 2423 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 17:04:45.062960 kubelet[2423]: E0527 17:04:45.061406 2423 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://91.99.121.210:6443/api/v1/namespaces/default/events\": dial tcp 91.99.121.210:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4344-0-0-0-39ed1690e8.184371253a690d23 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4344-0-0-0-39ed1690e8,UID:ci-4344-0-0-0-39ed1690e8,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4344-0-0-0-39ed1690e8,},FirstTimestamp:2025-05-27 17:04:45.057576227 +0000 UTC m=+0.397004183,LastTimestamp:2025-05-27 17:04:45.057576227 +0000 UTC m=+0.397004183,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4344-0-0-0-39ed1690e8,}" May 27 17:04:45.065147 kubelet[2423]: I0527 17:04:45.064906 2423 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 May 27 17:04:45.067277 kubelet[2423]: I0527 17:04:45.067211 2423 server.go:317] "Adding debug handlers to kubelet server" May 27 17:04:45.073401 kubelet[2423]: I0527 17:04:45.072325 2423 volume_manager.go:297] "Starting Kubelet Volume Manager" May 27 17:04:45.073401 kubelet[2423]: E0527 17:04:45.072654 2423 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" May 27 17:04:45.073401 kubelet[2423]: I0527 17:04:45.072614 2423 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 17:04:45.073401 kubelet[2423]: I0527 17:04:45.073004 2423 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 27 17:04:45.073401 kubelet[2423]: I0527 17:04:45.073056 2423 reconciler.go:26] "Reconciler: start to sync state" May 27 17:04:45.073401 kubelet[2423]: I0527 17:04:45.073274 2423 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 17:04:45.073981 kubelet[2423]: I0527 17:04:45.073960 2423 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 17:04:45.075928 kubelet[2423]: E0527 17:04:45.075872 2423 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://91.99.121.210:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-0-0-0-39ed1690e8?timeout=10s\": dial tcp 91.99.121.210:6443: connect: connection refused" interval="200ms" May 27 17:04:45.075928 kubelet[2423]: E0527 17:04:45.075784 2423 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://91.99.121.210:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 91.99.121.210:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" May 27 17:04:45.076567 kubelet[2423]: I0527 17:04:45.074560 2423 factory.go:223] Registration of the systemd container factory successfully May 27 17:04:45.076786 kubelet[2423]: I0527 17:04:45.076747 2423 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 17:04:45.077941 kubelet[2423]: E0527 17:04:45.077907 2423 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 27 17:04:45.079665 kubelet[2423]: I0527 17:04:45.079633 2423 factory.go:223] Registration of the containerd container factory successfully May 27 17:04:45.099450 kubelet[2423]: I0527 17:04:45.099344 2423 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" May 27 17:04:45.102087 kubelet[2423]: I0527 17:04:45.102053 2423 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" May 27 17:04:45.102087 kubelet[2423]: I0527 17:04:45.102085 2423 status_manager.go:230] "Starting to sync pod status with apiserver" May 27 17:04:45.102216 kubelet[2423]: I0527 17:04:45.102109 2423 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." May 27 17:04:45.102216 kubelet[2423]: I0527 17:04:45.102117 2423 kubelet.go:2436] "Starting kubelet main sync loop" May 27 17:04:45.102216 kubelet[2423]: E0527 17:04:45.102160 2423 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 17:04:45.104466 kubelet[2423]: E0527 17:04:45.104425 2423 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://91.99.121.210:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 91.99.121.210:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" May 27 17:04:45.106461 kubelet[2423]: I0527 17:04:45.106335 2423 cpu_manager.go:221] "Starting CPU manager" policy="none" May 27 17:04:45.106461 kubelet[2423]: I0527 17:04:45.106353 2423 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 27 17:04:45.106723 kubelet[2423]: I0527 17:04:45.106608 2423 state_mem.go:36] "Initialized new in-memory state store" May 27 17:04:45.108885 kubelet[2423]: I0527 17:04:45.108833 2423 policy_none.go:49] "None policy: Start" May 27 17:04:45.108885 kubelet[2423]: I0527 17:04:45.108862 2423 memory_manager.go:186] "Starting memorymanager" policy="None" May 27 17:04:45.109023 kubelet[2423]: I0527 17:04:45.109014 2423 state_mem.go:35] "Initializing new in-memory state store" May 27 17:04:45.117427 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
May 27 17:04:45.135754 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 27 17:04:45.140575 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 27 17:04:45.160225 kubelet[2423]: E0527 17:04:45.160167 2423 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" May 27 17:04:45.160884 kubelet[2423]: I0527 17:04:45.160852 2423 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 17:04:45.161068 kubelet[2423]: I0527 17:04:45.161018 2423 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 17:04:45.161646 kubelet[2423]: I0527 17:04:45.161622 2423 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 17:04:45.170256 kubelet[2423]: E0527 17:04:45.170191 2423 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" May 27 17:04:45.170421 kubelet[2423]: E0527 17:04:45.170325 2423 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4344-0-0-0-39ed1690e8\" not found" May 27 17:04:45.222331 systemd[1]: Created slice kubepods-burstable-podd28824efaeed12ce96abc444bc09c667.slice - libcontainer container kubepods-burstable-podd28824efaeed12ce96abc444bc09c667.slice. May 27 17:04:45.235477 kubelet[2423]: E0527 17:04:45.235302 2423 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.237468 systemd[1]: Created slice kubepods-burstable-pod4c0cf5851fa896a09c67c18882d9a233.slice - libcontainer container kubepods-burstable-pod4c0cf5851fa896a09c67c18882d9a233.slice. May 27 17:04:45.243030 kubelet[2423]: E0527 17:04:45.242692 2423 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.258484 systemd[1]: Created slice kubepods-burstable-pod711166a50a34aecb16e96b608b675d3d.slice - libcontainer container kubepods-burstable-pod711166a50a34aecb16e96b608b675d3d.slice. 
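[Editor's note] With the systemd cgroup driver, the kubelet asks systemd for one slice per QoS class (kubepods-burstable.slice, kubepods-besteffort.slice) and then one slice per pod underneath, which is what the kubepods-burstable-pod<uid>.slice units above are. The sketch below is a loose illustration of that naming inferred from these log lines, not an excerpt of kubelet code; the UIDs in this log contain no dashes, so no escaping is visible here.

# Illustrative only: build the systemd slice name the kubelet created above for
# a burstable pod. Dashes in a UID are normally replaced by underscores under
# the systemd cgroup driver; the UIDs in this log have none.
def pod_slice_name(pod_uid: str, qos_class: str = "burstable") -> str:
    prefix = "kubepods" if qos_class == "guaranteed" else f"kubepods-{qos_class}"
    return f"{prefix}-pod{pod_uid.replace('-', '_')}.slice"

print(pod_slice_name("d28824efaeed12ce96abc444bc09c667"))
# -> kubepods-burstable-podd28824efaeed12ce96abc444bc09c667.slice, as in the journal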
May 27 17:04:45.261317 kubelet[2423]: E0527 17:04:45.261285 2423 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.263229 kubelet[2423]: I0527 17:04:45.263202 2423 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.263853 kubelet[2423]: E0527 17:04:45.263824 2423 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.121.210:6443/api/v1/nodes\": dial tcp 91.99.121.210:6443: connect: connection refused" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.276914 kubelet[2423]: E0527 17:04:45.276855 2423 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.121.210:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-0-0-0-39ed1690e8?timeout=10s\": dial tcp 91.99.121.210:6443: connect: connection refused" interval="400ms" May 27 17:04:45.374613 kubelet[2423]: I0527 17:04:45.374520 2423 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d28824efaeed12ce96abc444bc09c667-ca-certs\") pod \"kube-apiserver-ci-4344-0-0-0-39ed1690e8\" (UID: \"d28824efaeed12ce96abc444bc09c667\") " pod="kube-system/kube-apiserver-ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.374613 kubelet[2423]: I0527 17:04:45.374645 2423 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d28824efaeed12ce96abc444bc09c667-k8s-certs\") pod \"kube-apiserver-ci-4344-0-0-0-39ed1690e8\" (UID: \"d28824efaeed12ce96abc444bc09c667\") " pod="kube-system/kube-apiserver-ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.375135 kubelet[2423]: I0527 17:04:45.375091 2423 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4c0cf5851fa896a09c67c18882d9a233-ca-certs\") pod \"kube-controller-manager-ci-4344-0-0-0-39ed1690e8\" (UID: \"4c0cf5851fa896a09c67c18882d9a233\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.375225 kubelet[2423]: I0527 17:04:45.375169 2423 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4c0cf5851fa896a09c67c18882d9a233-flexvolume-dir\") pod \"kube-controller-manager-ci-4344-0-0-0-39ed1690e8\" (UID: \"4c0cf5851fa896a09c67c18882d9a233\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.375225 kubelet[2423]: I0527 17:04:45.375209 2423 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4c0cf5851fa896a09c67c18882d9a233-k8s-certs\") pod \"kube-controller-manager-ci-4344-0-0-0-39ed1690e8\" (UID: \"4c0cf5851fa896a09c67c18882d9a233\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.375439 kubelet[2423]: I0527 17:04:45.375260 2423 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4c0cf5851fa896a09c67c18882d9a233-kubeconfig\") pod \"kube-controller-manager-ci-4344-0-0-0-39ed1690e8\" (UID: \"4c0cf5851fa896a09c67c18882d9a233\") " 
pod="kube-system/kube-controller-manager-ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.375659 kubelet[2423]: I0527 17:04:45.375502 2423 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4c0cf5851fa896a09c67c18882d9a233-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344-0-0-0-39ed1690e8\" (UID: \"4c0cf5851fa896a09c67c18882d9a233\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.375659 kubelet[2423]: I0527 17:04:45.375585 2423 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/711166a50a34aecb16e96b608b675d3d-kubeconfig\") pod \"kube-scheduler-ci-4344-0-0-0-39ed1690e8\" (UID: \"711166a50a34aecb16e96b608b675d3d\") " pod="kube-system/kube-scheduler-ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.375659 kubelet[2423]: I0527 17:04:45.375615 2423 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d28824efaeed12ce96abc444bc09c667-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344-0-0-0-39ed1690e8\" (UID: \"d28824efaeed12ce96abc444bc09c667\") " pod="kube-system/kube-apiserver-ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.467853 kubelet[2423]: I0527 17:04:45.467532 2423 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.468267 kubelet[2423]: E0527 17:04:45.468223 2423 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.121.210:6443/api/v1/nodes\": dial tcp 91.99.121.210:6443: connect: connection refused" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.538359 containerd[1514]: time="2025-05-27T17:04:45.537212783Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344-0-0-0-39ed1690e8,Uid:d28824efaeed12ce96abc444bc09c667,Namespace:kube-system,Attempt:0,}" May 27 17:04:45.544481 containerd[1514]: time="2025-05-27T17:04:45.544430341Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344-0-0-0-39ed1690e8,Uid:4c0cf5851fa896a09c67c18882d9a233,Namespace:kube-system,Attempt:0,}" May 27 17:04:45.563319 containerd[1514]: time="2025-05-27T17:04:45.563168214Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344-0-0-0-39ed1690e8,Uid:711166a50a34aecb16e96b608b675d3d,Namespace:kube-system,Attempt:0,}" May 27 17:04:45.605113 containerd[1514]: time="2025-05-27T17:04:45.602612001Z" level=info msg="connecting to shim 34f35fa12a83acdc4fefa0ab71eef770667022779e525850bd9dc5b641ad4837" address="unix:///run/containerd/s/8f86a9ad9fc98300c5cc2e35650b073b56b6564de1654d728d8e6f89874784ed" namespace=k8s.io protocol=ttrpc version=3 May 27 17:04:45.618850 containerd[1514]: time="2025-05-27T17:04:45.618756635Z" level=info msg="connecting to shim 293b0ad65b9430bfb79ada22779e8ef4e6be35bdca74ff8b4edb7fd881be57ed" address="unix:///run/containerd/s/0f8019a487d134e59daa100c199602576b258b037814050deb26451686232cdf" namespace=k8s.io protocol=ttrpc version=3 May 27 17:04:45.631573 containerd[1514]: time="2025-05-27T17:04:45.631440271Z" level=info msg="connecting to shim e67ad384d7e4e7c199d132db5b8791262ed332bd2f38f52c3ffbf54e174b1273" address="unix:///run/containerd/s/142c7be52b95606d9ae6d174ceb6f92ab0116fbe3751ffc0bf1ad9935bd48910" namespace=k8s.io protocol=ttrpc version=3 May 27 
17:04:45.647670 systemd[1]: Started cri-containerd-34f35fa12a83acdc4fefa0ab71eef770667022779e525850bd9dc5b641ad4837.scope - libcontainer container 34f35fa12a83acdc4fefa0ab71eef770667022779e525850bd9dc5b641ad4837. May 27 17:04:45.661934 systemd[1]: Started cri-containerd-293b0ad65b9430bfb79ada22779e8ef4e6be35bdca74ff8b4edb7fd881be57ed.scope - libcontainer container 293b0ad65b9430bfb79ada22779e8ef4e6be35bdca74ff8b4edb7fd881be57ed. May 27 17:04:45.678130 kubelet[2423]: E0527 17:04:45.678052 2423 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.121.210:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-0-0-0-39ed1690e8?timeout=10s\": dial tcp 91.99.121.210:6443: connect: connection refused" interval="800ms" May 27 17:04:45.682698 systemd[1]: Started cri-containerd-e67ad384d7e4e7c199d132db5b8791262ed332bd2f38f52c3ffbf54e174b1273.scope - libcontainer container e67ad384d7e4e7c199d132db5b8791262ed332bd2f38f52c3ffbf54e174b1273. May 27 17:04:45.760128 containerd[1514]: time="2025-05-27T17:04:45.760072907Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344-0-0-0-39ed1690e8,Uid:d28824efaeed12ce96abc444bc09c667,Namespace:kube-system,Attempt:0,} returns sandbox id \"34f35fa12a83acdc4fefa0ab71eef770667022779e525850bd9dc5b641ad4837\"" May 27 17:04:45.765642 containerd[1514]: time="2025-05-27T17:04:45.765588745Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344-0-0-0-39ed1690e8,Uid:711166a50a34aecb16e96b608b675d3d,Namespace:kube-system,Attempt:0,} returns sandbox id \"293b0ad65b9430bfb79ada22779e8ef4e6be35bdca74ff8b4edb7fd881be57ed\"" May 27 17:04:45.771584 containerd[1514]: time="2025-05-27T17:04:45.771353743Z" level=info msg="CreateContainer within sandbox \"34f35fa12a83acdc4fefa0ab71eef770667022779e525850bd9dc5b641ad4837\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 27 17:04:45.774722 containerd[1514]: time="2025-05-27T17:04:45.774676742Z" level=info msg="CreateContainer within sandbox \"293b0ad65b9430bfb79ada22779e8ef4e6be35bdca74ff8b4edb7fd881be57ed\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 27 17:04:45.787420 containerd[1514]: time="2025-05-27T17:04:45.786755938Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344-0-0-0-39ed1690e8,Uid:4c0cf5851fa896a09c67c18882d9a233,Namespace:kube-system,Attempt:0,} returns sandbox id \"e67ad384d7e4e7c199d132db5b8791262ed332bd2f38f52c3ffbf54e174b1273\"" May 27 17:04:45.795867 containerd[1514]: time="2025-05-27T17:04:45.795678734Z" level=info msg="CreateContainer within sandbox \"e67ad384d7e4e7c199d132db5b8791262ed332bd2f38f52c3ffbf54e174b1273\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 27 17:04:45.796888 containerd[1514]: time="2025-05-27T17:04:45.796851374Z" level=info msg="Container bce0ecd89f2bab7566655623627711076e71a587ce96bc69cf683a2223ac8b50: CDI devices from CRI Config.CDIDevices: []" May 27 17:04:45.799359 containerd[1514]: time="2025-05-27T17:04:45.799308973Z" level=info msg="Container efc129e75fe9d5e76829c40e6dcd37da636f1986f60d809a5377a3627b29b1e0: CDI devices from CRI Config.CDIDevices: []" May 27 17:04:45.808740 containerd[1514]: time="2025-05-27T17:04:45.808687050Z" level=info msg="CreateContainer within sandbox \"34f35fa12a83acdc4fefa0ab71eef770667022779e525850bd9dc5b641ad4837\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id 
\"bce0ecd89f2bab7566655623627711076e71a587ce96bc69cf683a2223ac8b50\"" May 27 17:04:45.810342 containerd[1514]: time="2025-05-27T17:04:45.810229649Z" level=info msg="StartContainer for \"bce0ecd89f2bab7566655623627711076e71a587ce96bc69cf683a2223ac8b50\"" May 27 17:04:45.814386 containerd[1514]: time="2025-05-27T17:04:45.813625168Z" level=info msg="Container a71cf8041440ad9ab46439b42dad96fb11a7ad05464a390a2f4fc168c5209a43: CDI devices from CRI Config.CDIDevices: []" May 27 17:04:45.814386 containerd[1514]: time="2025-05-27T17:04:45.814136568Z" level=info msg="connecting to shim bce0ecd89f2bab7566655623627711076e71a587ce96bc69cf683a2223ac8b50" address="unix:///run/containerd/s/8f86a9ad9fc98300c5cc2e35650b073b56b6564de1654d728d8e6f89874784ed" protocol=ttrpc version=3 May 27 17:04:45.819839 containerd[1514]: time="2025-05-27T17:04:45.819773366Z" level=info msg="CreateContainer within sandbox \"293b0ad65b9430bfb79ada22779e8ef4e6be35bdca74ff8b4edb7fd881be57ed\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"efc129e75fe9d5e76829c40e6dcd37da636f1986f60d809a5377a3627b29b1e0\"" May 27 17:04:45.821066 containerd[1514]: time="2025-05-27T17:04:45.821015526Z" level=info msg="StartContainer for \"efc129e75fe9d5e76829c40e6dcd37da636f1986f60d809a5377a3627b29b1e0\"" May 27 17:04:45.822325 containerd[1514]: time="2025-05-27T17:04:45.822284525Z" level=info msg="connecting to shim efc129e75fe9d5e76829c40e6dcd37da636f1986f60d809a5377a3627b29b1e0" address="unix:///run/containerd/s/0f8019a487d134e59daa100c199602576b258b037814050deb26451686232cdf" protocol=ttrpc version=3 May 27 17:04:45.830656 containerd[1514]: time="2025-05-27T17:04:45.830597042Z" level=info msg="CreateContainer within sandbox \"e67ad384d7e4e7c199d132db5b8791262ed332bd2f38f52c3ffbf54e174b1273\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"a71cf8041440ad9ab46439b42dad96fb11a7ad05464a390a2f4fc168c5209a43\"" May 27 17:04:45.831568 containerd[1514]: time="2025-05-27T17:04:45.831533642Z" level=info msg="StartContainer for \"a71cf8041440ad9ab46439b42dad96fb11a7ad05464a390a2f4fc168c5209a43\"" May 27 17:04:45.834020 containerd[1514]: time="2025-05-27T17:04:45.833979601Z" level=info msg="connecting to shim a71cf8041440ad9ab46439b42dad96fb11a7ad05464a390a2f4fc168c5209a43" address="unix:///run/containerd/s/142c7be52b95606d9ae6d174ceb6f92ab0116fbe3751ffc0bf1ad9935bd48910" protocol=ttrpc version=3 May 27 17:04:45.837842 systemd[1]: Started cri-containerd-bce0ecd89f2bab7566655623627711076e71a587ce96bc69cf683a2223ac8b50.scope - libcontainer container bce0ecd89f2bab7566655623627711076e71a587ce96bc69cf683a2223ac8b50. May 27 17:04:45.858809 systemd[1]: Started cri-containerd-efc129e75fe9d5e76829c40e6dcd37da636f1986f60d809a5377a3627b29b1e0.scope - libcontainer container efc129e75fe9d5e76829c40e6dcd37da636f1986f60d809a5377a3627b29b1e0. May 27 17:04:45.860327 kubelet[2423]: E0527 17:04:45.860281 2423 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://91.99.121.210:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344-0-0-0-39ed1690e8&limit=500&resourceVersion=0\": dial tcp 91.99.121.210:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" May 27 17:04:45.870792 systemd[1]: Started cri-containerd-a71cf8041440ad9ab46439b42dad96fb11a7ad05464a390a2f4fc168c5209a43.scope - libcontainer container a71cf8041440ad9ab46439b42dad96fb11a7ad05464a390a2f4fc168c5209a43. 
May 27 17:04:45.878052 kubelet[2423]: I0527 17:04:45.877980 2423 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.878695 kubelet[2423]: E0527 17:04:45.878660 2423 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.121.210:6443/api/v1/nodes\": dial tcp 91.99.121.210:6443: connect: connection refused" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:45.934941 containerd[1514]: time="2025-05-27T17:04:45.934679847Z" level=info msg="StartContainer for \"bce0ecd89f2bab7566655623627711076e71a587ce96bc69cf683a2223ac8b50\" returns successfully" May 27 17:04:45.950638 containerd[1514]: time="2025-05-27T17:04:45.950569041Z" level=info msg="StartContainer for \"efc129e75fe9d5e76829c40e6dcd37da636f1986f60d809a5377a3627b29b1e0\" returns successfully" May 27 17:04:45.967692 containerd[1514]: time="2025-05-27T17:04:45.967570956Z" level=info msg="StartContainer for \"a71cf8041440ad9ab46439b42dad96fb11a7ad05464a390a2f4fc168c5209a43\" returns successfully" May 27 17:04:46.116472 kubelet[2423]: E0527 17:04:46.116258 2423 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:46.121387 kubelet[2423]: E0527 17:04:46.120938 2423 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:46.125615 kubelet[2423]: E0527 17:04:46.125584 2423 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:46.681209 kubelet[2423]: I0527 17:04:46.681146 2423 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:47.129960 kubelet[2423]: E0527 17:04:47.129914 2423 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:47.132936 kubelet[2423]: E0527 17:04:47.132890 2423 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:48.183028 kubelet[2423]: I0527 17:04:48.182967 2423 kubelet_node_status.go:78] "Successfully registered node" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:48.183028 kubelet[2423]: E0527 17:04:48.183016 2423 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4344-0-0-0-39ed1690e8\": node \"ci-4344-0-0-0-39ed1690e8\" not found" May 27 17:04:48.216041 kubelet[2423]: E0527 17:04:48.216001 2423 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" May 27 17:04:48.317128 kubelet[2423]: E0527 17:04:48.317064 2423 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" May 27 17:04:48.417818 kubelet[2423]: E0527 17:04:48.417723 2423 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" May 27 17:04:48.519039 kubelet[2423]: E0527 17:04:48.518483 2423 kubelet_node_status.go:466] "Error getting the current node from lister" err="node 
\"ci-4344-0-0-0-39ed1690e8\" not found" May 27 17:04:48.619466 kubelet[2423]: E0527 17:04:48.619413 2423 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" May 27 17:04:48.698828 kubelet[2423]: E0527 17:04:48.698692 2423 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:48.720437 kubelet[2423]: E0527 17:04:48.720387 2423 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" May 27 17:04:48.821315 kubelet[2423]: E0527 17:04:48.821251 2423 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" May 27 17:04:48.973472 kubelet[2423]: I0527 17:04:48.973261 2423 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344-0-0-0-39ed1690e8" May 27 17:04:48.988381 kubelet[2423]: I0527 17:04:48.988215 2423 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344-0-0-0-39ed1690e8" May 27 17:04:49.002072 kubelet[2423]: I0527 17:04:49.001684 2423 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344-0-0-0-39ed1690e8" May 27 17:04:49.059614 kubelet[2423]: I0527 17:04:49.059580 2423 apiserver.go:52] "Watching apiserver" May 27 17:04:49.074197 kubelet[2423]: I0527 17:04:49.074087 2423 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 27 17:04:50.677072 systemd[1]: Reload requested from client PID 2706 ('systemctl') (unit session-7.scope)... May 27 17:04:50.677508 systemd[1]: Reloading... May 27 17:04:50.787419 zram_generator::config[2750]: No configuration found. May 27 17:04:50.878907 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:04:50.996702 systemd[1]: Reloading finished in 318 ms. May 27 17:04:51.025474 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:04:51.042873 systemd[1]: kubelet.service: Deactivated successfully. May 27 17:04:51.043674 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:04:51.043901 systemd[1]: kubelet.service: Consumed 848ms CPU time, 127.3M memory peak. May 27 17:04:51.049343 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:04:51.207790 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:04:51.219895 (kubelet)[2795]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 17:04:51.272253 kubelet[2795]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 17:04:51.272253 kubelet[2795]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 27 17:04:51.272253 kubelet[2795]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 17:04:51.272253 kubelet[2795]: I0527 17:04:51.271348 2795 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 17:04:51.285139 kubelet[2795]: I0527 17:04:51.285103 2795 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" May 27 17:04:51.285351 kubelet[2795]: I0527 17:04:51.285339 2795 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 17:04:51.285811 kubelet[2795]: I0527 17:04:51.285793 2795 server.go:956] "Client rotation is on, will bootstrap in background" May 27 17:04:51.287275 kubelet[2795]: I0527 17:04:51.287243 2795 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" May 27 17:04:51.290171 kubelet[2795]: I0527 17:04:51.290121 2795 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 17:04:51.295686 kubelet[2795]: I0527 17:04:51.295643 2795 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 17:04:51.300348 kubelet[2795]: I0527 17:04:51.300304 2795 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 27 17:04:51.304389 kubelet[2795]: I0527 17:04:51.304247 2795 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 17:04:51.304941 kubelet[2795]: I0527 17:04:51.304313 2795 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344-0-0-0-39ed1690e8","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 17:04:51.305794 kubelet[2795]: I0527 17:04:51.305766 2795 topology_manager.go:138] "Creating topology manager with none policy" May 27 17:04:51.306189 kubelet[2795]: I0527 17:04:51.305873 2795 container_manager_linux.go:303] "Creating device plugin manager" May 27 17:04:51.306189 kubelet[2795]: I0527 17:04:51.305945 
2795 state_mem.go:36] "Initialized new in-memory state store" May 27 17:04:51.306316 kubelet[2795]: I0527 17:04:51.306304 2795 kubelet.go:480] "Attempting to sync node with API server" May 27 17:04:51.306534 kubelet[2795]: I0527 17:04:51.306447 2795 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 17:04:51.306643 kubelet[2795]: I0527 17:04:51.306632 2795 kubelet.go:386] "Adding apiserver pod source" May 27 17:04:51.307266 kubelet[2795]: I0527 17:04:51.307248 2795 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 17:04:51.312809 kubelet[2795]: I0527 17:04:51.312781 2795 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 17:04:51.313529 kubelet[2795]: I0527 17:04:51.313508 2795 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" May 27 17:04:51.317987 kubelet[2795]: I0527 17:04:51.317962 2795 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 27 17:04:51.319503 kubelet[2795]: I0527 17:04:51.319476 2795 server.go:1289] "Started kubelet" May 27 17:04:51.326807 kubelet[2795]: I0527 17:04:51.326612 2795 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 17:04:51.333683 kubelet[2795]: I0527 17:04:51.333456 2795 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 May 27 17:04:51.343143 kubelet[2795]: I0527 17:04:51.343113 2795 server.go:317] "Adding debug handlers to kubelet server" May 27 17:04:51.344533 kubelet[2795]: I0527 17:04:51.338069 2795 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 17:04:51.345780 kubelet[2795]: I0527 17:04:51.333680 2795 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 17:04:51.346221 kubelet[2795]: I0527 17:04:51.346190 2795 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 17:04:51.346333 kubelet[2795]: E0527 17:04:51.339832 2795 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344-0-0-0-39ed1690e8\" not found" May 27 17:04:51.346472 kubelet[2795]: I0527 17:04:51.339646 2795 volume_manager.go:297] "Starting Kubelet Volume Manager" May 27 17:04:51.346980 kubelet[2795]: I0527 17:04:51.339659 2795 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 27 17:04:51.347118 kubelet[2795]: I0527 17:04:51.347097 2795 reconciler.go:26] "Reconciler: start to sync state" May 27 17:04:51.357927 kubelet[2795]: I0527 17:04:51.357550 2795 factory.go:223] Registration of the systemd container factory successfully May 27 17:04:51.357927 kubelet[2795]: I0527 17:04:51.357677 2795 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 17:04:51.368388 kubelet[2795]: I0527 17:04:51.367489 2795 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" May 27 17:04:51.370379 kubelet[2795]: I0527 17:04:51.369701 2795 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" May 27 17:04:51.370379 kubelet[2795]: I0527 17:04:51.369744 2795 status_manager.go:230] "Starting to sync pod status with apiserver" May 27 17:04:51.370379 kubelet[2795]: I0527 17:04:51.369767 2795 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." May 27 17:04:51.370379 kubelet[2795]: I0527 17:04:51.369773 2795 kubelet.go:2436] "Starting kubelet main sync loop" May 27 17:04:51.370379 kubelet[2795]: E0527 17:04:51.369818 2795 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 17:04:51.371674 kubelet[2795]: I0527 17:04:51.370608 2795 factory.go:223] Registration of the containerd container factory successfully May 27 17:04:51.431861 kubelet[2795]: I0527 17:04:51.431825 2795 cpu_manager.go:221] "Starting CPU manager" policy="none" May 27 17:04:51.431861 kubelet[2795]: I0527 17:04:51.431845 2795 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 27 17:04:51.431861 kubelet[2795]: I0527 17:04:51.431867 2795 state_mem.go:36] "Initialized new in-memory state store" May 27 17:04:51.432025 kubelet[2795]: I0527 17:04:51.432005 2795 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 27 17:04:51.432025 kubelet[2795]: I0527 17:04:51.432014 2795 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 27 17:04:51.432126 kubelet[2795]: I0527 17:04:51.432030 2795 policy_none.go:49] "None policy: Start" May 27 17:04:51.432126 kubelet[2795]: I0527 17:04:51.432039 2795 memory_manager.go:186] "Starting memorymanager" policy="None" May 27 17:04:51.432126 kubelet[2795]: I0527 17:04:51.432047 2795 state_mem.go:35] "Initializing new in-memory state store" May 27 17:04:51.432126 kubelet[2795]: I0527 17:04:51.432126 2795 state_mem.go:75] "Updated machine memory state" May 27 17:04:51.438087 kubelet[2795]: E0527 17:04:51.438053 2795 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" May 27 17:04:51.438253 kubelet[2795]: I0527 17:04:51.438235 2795 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 17:04:51.438309 kubelet[2795]: I0527 17:04:51.438254 2795 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 17:04:51.440933 kubelet[2795]: I0527 17:04:51.440906 2795 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 17:04:51.444527 kubelet[2795]: E0527 17:04:51.444465 2795 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" May 27 17:04:51.471522 kubelet[2795]: I0527 17:04:51.471483 2795 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.472418 kubelet[2795]: I0527 17:04:51.472304 2795 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.472769 kubelet[2795]: I0527 17:04:51.472351 2795 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.482893 kubelet[2795]: E0527 17:04:51.482851 2795 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4344-0-0-0-39ed1690e8\" already exists" pod="kube-system/kube-controller-manager-ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.483627 kubelet[2795]: E0527 17:04:51.483574 2795 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4344-0-0-0-39ed1690e8\" already exists" pod="kube-system/kube-scheduler-ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.483789 kubelet[2795]: E0527 17:04:51.483772 2795 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4344-0-0-0-39ed1690e8\" already exists" pod="kube-system/kube-apiserver-ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.546142 kubelet[2795]: I0527 17:04:51.546091 2795 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.559107 kubelet[2795]: I0527 17:04:51.558962 2795 kubelet_node_status.go:124] "Node was previously registered" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.559548 kubelet[2795]: I0527 17:04:51.559418 2795 kubelet_node_status.go:78] "Successfully registered node" node="ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.648882 kubelet[2795]: I0527 17:04:51.648825 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/711166a50a34aecb16e96b608b675d3d-kubeconfig\") pod \"kube-scheduler-ci-4344-0-0-0-39ed1690e8\" (UID: \"711166a50a34aecb16e96b608b675d3d\") " pod="kube-system/kube-scheduler-ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.649159 kubelet[2795]: I0527 17:04:51.649006 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d28824efaeed12ce96abc444bc09c667-ca-certs\") pod \"kube-apiserver-ci-4344-0-0-0-39ed1690e8\" (UID: \"d28824efaeed12ce96abc444bc09c667\") " pod="kube-system/kube-apiserver-ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.649397 kubelet[2795]: I0527 17:04:51.649249 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d28824efaeed12ce96abc444bc09c667-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344-0-0-0-39ed1690e8\" (UID: \"d28824efaeed12ce96abc444bc09c667\") " pod="kube-system/kube-apiserver-ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.649397 kubelet[2795]: I0527 17:04:51.649280 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4c0cf5851fa896a09c67c18882d9a233-ca-certs\") pod \"kube-controller-manager-ci-4344-0-0-0-39ed1690e8\" (UID: \"4c0cf5851fa896a09c67c18882d9a233\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.649397 kubelet[2795]: I0527 17:04:51.649334 
2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4c0cf5851fa896a09c67c18882d9a233-flexvolume-dir\") pod \"kube-controller-manager-ci-4344-0-0-0-39ed1690e8\" (UID: \"4c0cf5851fa896a09c67c18882d9a233\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.649612 kubelet[2795]: I0527 17:04:51.649353 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d28824efaeed12ce96abc444bc09c667-k8s-certs\") pod \"kube-apiserver-ci-4344-0-0-0-39ed1690e8\" (UID: \"d28824efaeed12ce96abc444bc09c667\") " pod="kube-system/kube-apiserver-ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.649612 kubelet[2795]: I0527 17:04:51.649584 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4c0cf5851fa896a09c67c18882d9a233-k8s-certs\") pod \"kube-controller-manager-ci-4344-0-0-0-39ed1690e8\" (UID: \"4c0cf5851fa896a09c67c18882d9a233\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.649780 kubelet[2795]: I0527 17:04:51.649746 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4c0cf5851fa896a09c67c18882d9a233-kubeconfig\") pod \"kube-controller-manager-ci-4344-0-0-0-39ed1690e8\" (UID: \"4c0cf5851fa896a09c67c18882d9a233\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-0-39ed1690e8" May 27 17:04:51.649966 kubelet[2795]: I0527 17:04:51.649894 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4c0cf5851fa896a09c67c18882d9a233-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344-0-0-0-39ed1690e8\" (UID: \"4c0cf5851fa896a09c67c18882d9a233\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-0-39ed1690e8" May 27 17:04:52.308337 kubelet[2795]: I0527 17:04:52.308161 2795 apiserver.go:52] "Watching apiserver" May 27 17:04:52.348186 kubelet[2795]: I0527 17:04:52.348036 2795 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 27 17:04:52.419891 kubelet[2795]: I0527 17:04:52.419350 2795 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344-0-0-0-39ed1690e8" May 27 17:04:52.420437 kubelet[2795]: I0527 17:04:52.420410 2795 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344-0-0-0-39ed1690e8" May 27 17:04:52.421310 kubelet[2795]: I0527 17:04:52.421058 2795 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344-0-0-0-39ed1690e8" May 27 17:04:52.433643 kubelet[2795]: E0527 17:04:52.433583 2795 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4344-0-0-0-39ed1690e8\" already exists" pod="kube-system/kube-controller-manager-ci-4344-0-0-0-39ed1690e8" May 27 17:04:52.436133 kubelet[2795]: E0527 17:04:52.436088 2795 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4344-0-0-0-39ed1690e8\" already exists" pod="kube-system/kube-apiserver-ci-4344-0-0-0-39ed1690e8" May 27 17:04:52.437865 kubelet[2795]: E0527 17:04:52.437694 2795 kubelet.go:3311] "Failed creating a mirror pod" err="pods 
\"kube-scheduler-ci-4344-0-0-0-39ed1690e8\" already exists" pod="kube-system/kube-scheduler-ci-4344-0-0-0-39ed1690e8" May 27 17:04:52.473408 kubelet[2795]: I0527 17:04:52.472774 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4344-0-0-0-39ed1690e8" podStartSLOduration=4.472687882 podStartE2EDuration="4.472687882s" podCreationTimestamp="2025-05-27 17:04:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:04:52.470230722 +0000 UTC m=+1.243964171" watchObservedRunningTime="2025-05-27 17:04:52.472687882 +0000 UTC m=+1.246421331" May 27 17:04:52.518696 kubelet[2795]: I0527 17:04:52.518162 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4344-0-0-0-39ed1690e8" podStartSLOduration=3.518147108 podStartE2EDuration="3.518147108s" podCreationTimestamp="2025-05-27 17:04:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:04:52.500586553 +0000 UTC m=+1.274320002" watchObservedRunningTime="2025-05-27 17:04:52.518147108 +0000 UTC m=+1.291880557" May 27 17:04:52.518696 kubelet[2795]: I0527 17:04:52.518600 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4344-0-0-0-39ed1690e8" podStartSLOduration=4.518591627 podStartE2EDuration="4.518591627s" podCreationTimestamp="2025-05-27 17:04:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:04:52.516169268 +0000 UTC m=+1.289902717" watchObservedRunningTime="2025-05-27 17:04:52.518591627 +0000 UTC m=+1.292325116" May 27 17:04:57.316109 kubelet[2795]: I0527 17:04:57.316063 2795 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 27 17:04:57.317279 containerd[1514]: time="2025-05-27T17:04:57.317152980Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 27 17:04:57.318784 kubelet[2795]: I0527 17:04:57.317529 2795 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 27 17:04:58.286293 systemd[1]: Created slice kubepods-besteffort-pode2e4e350_87c7_4aac_82e6_b4afe0a5b15d.slice - libcontainer container kubepods-besteffort-pode2e4e350_87c7_4aac_82e6_b4afe0a5b15d.slice. 
May 27 17:04:58.297738 kubelet[2795]: I0527 17:04:58.297568 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e2e4e350-87c7-4aac-82e6-b4afe0a5b15d-xtables-lock\") pod \"kube-proxy-xj58t\" (UID: \"e2e4e350-87c7-4aac-82e6-b4afe0a5b15d\") " pod="kube-system/kube-proxy-xj58t" May 27 17:04:58.297738 kubelet[2795]: I0527 17:04:58.297668 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e2e4e350-87c7-4aac-82e6-b4afe0a5b15d-lib-modules\") pod \"kube-proxy-xj58t\" (UID: \"e2e4e350-87c7-4aac-82e6-b4afe0a5b15d\") " pod="kube-system/kube-proxy-xj58t" May 27 17:04:58.297738 kubelet[2795]: I0527 17:04:58.297733 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8knbx\" (UniqueName: \"kubernetes.io/projected/e2e4e350-87c7-4aac-82e6-b4afe0a5b15d-kube-api-access-8knbx\") pod \"kube-proxy-xj58t\" (UID: \"e2e4e350-87c7-4aac-82e6-b4afe0a5b15d\") " pod="kube-system/kube-proxy-xj58t" May 27 17:04:58.297910 kubelet[2795]: I0527 17:04:58.297795 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/e2e4e350-87c7-4aac-82e6-b4afe0a5b15d-kube-proxy\") pod \"kube-proxy-xj58t\" (UID: \"e2e4e350-87c7-4aac-82e6-b4afe0a5b15d\") " pod="kube-system/kube-proxy-xj58t" May 27 17:04:58.589113 systemd[1]: Created slice kubepods-besteffort-pod08d678b4_54b5_485e_b17f_8b9390d84045.slice - libcontainer container kubepods-besteffort-pod08d678b4_54b5_485e_b17f_8b9390d84045.slice. May 27 17:04:58.598659 containerd[1514]: time="2025-05-27T17:04:58.598322488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-xj58t,Uid:e2e4e350-87c7-4aac-82e6-b4afe0a5b15d,Namespace:kube-system,Attempt:0,}" May 27 17:04:58.601046 kubelet[2795]: I0527 17:04:58.600939 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/08d678b4-54b5-485e-b17f-8b9390d84045-var-lib-calico\") pod \"tigera-operator-844669ff44-qr9sv\" (UID: \"08d678b4-54b5-485e-b17f-8b9390d84045\") " pod="tigera-operator/tigera-operator-844669ff44-qr9sv" May 27 17:04:58.602249 kubelet[2795]: I0527 17:04:58.601718 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psv9t\" (UniqueName: \"kubernetes.io/projected/08d678b4-54b5-485e-b17f-8b9390d84045-kube-api-access-psv9t\") pod \"tigera-operator-844669ff44-qr9sv\" (UID: \"08d678b4-54b5-485e-b17f-8b9390d84045\") " pod="tigera-operator/tigera-operator-844669ff44-qr9sv" May 27 17:04:58.623577 containerd[1514]: time="2025-05-27T17:04:58.623480361Z" level=info msg="connecting to shim 4c0a359b0c64ae004c520b6bbf18759b3d9168e16af4c0949721f3e1d414bb7c" address="unix:///run/containerd/s/8e686767ff0d2cb298c1ae8a6b368cf0c536e41d0b5d04f8c3a88b2a201107e5" namespace=k8s.io protocol=ttrpc version=3 May 27 17:04:58.652070 systemd[1]: Started cri-containerd-4c0a359b0c64ae004c520b6bbf18759b3d9168e16af4c0949721f3e1d414bb7c.scope - libcontainer container 4c0a359b0c64ae004c520b6bbf18759b3d9168e16af4c0949721f3e1d414bb7c. 
May 27 17:04:58.702527 containerd[1514]: time="2025-05-27T17:04:58.702449938Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-xj58t,Uid:e2e4e350-87c7-4aac-82e6-b4afe0a5b15d,Namespace:kube-system,Attempt:0,} returns sandbox id \"4c0a359b0c64ae004c520b6bbf18759b3d9168e16af4c0949721f3e1d414bb7c\"" May 27 17:04:58.713379 containerd[1514]: time="2025-05-27T17:04:58.712424575Z" level=info msg="CreateContainer within sandbox \"4c0a359b0c64ae004c520b6bbf18759b3d9168e16af4c0949721f3e1d414bb7c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 27 17:04:58.730174 containerd[1514]: time="2025-05-27T17:04:58.730121090Z" level=info msg="Container ad52d5190a1567bca77ebe056220bb50f25c37977966f4c992d5d8d1eee2c15c: CDI devices from CRI Config.CDIDevices: []" May 27 17:04:58.746005 containerd[1514]: time="2025-05-27T17:04:58.745618486Z" level=info msg="CreateContainer within sandbox \"4c0a359b0c64ae004c520b6bbf18759b3d9168e16af4c0949721f3e1d414bb7c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"ad52d5190a1567bca77ebe056220bb50f25c37977966f4c992d5d8d1eee2c15c\"" May 27 17:04:58.747902 containerd[1514]: time="2025-05-27T17:04:58.746916205Z" level=info msg="StartContainer for \"ad52d5190a1567bca77ebe056220bb50f25c37977966f4c992d5d8d1eee2c15c\"" May 27 17:04:58.750435 containerd[1514]: time="2025-05-27T17:04:58.750352564Z" level=info msg="connecting to shim ad52d5190a1567bca77ebe056220bb50f25c37977966f4c992d5d8d1eee2c15c" address="unix:///run/containerd/s/8e686767ff0d2cb298c1ae8a6b368cf0c536e41d0b5d04f8c3a88b2a201107e5" protocol=ttrpc version=3 May 27 17:04:58.773713 systemd[1]: Started cri-containerd-ad52d5190a1567bca77ebe056220bb50f25c37977966f4c992d5d8d1eee2c15c.scope - libcontainer container ad52d5190a1567bca77ebe056220bb50f25c37977966f4c992d5d8d1eee2c15c. May 27 17:04:58.818093 containerd[1514]: time="2025-05-27T17:04:58.818060465Z" level=info msg="StartContainer for \"ad52d5190a1567bca77ebe056220bb50f25c37977966f4c992d5d8d1eee2c15c\" returns successfully" May 27 17:04:58.895204 containerd[1514]: time="2025-05-27T17:04:58.894459163Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-qr9sv,Uid:08d678b4-54b5-485e-b17f-8b9390d84045,Namespace:tigera-operator,Attempt:0,}" May 27 17:04:58.922644 containerd[1514]: time="2025-05-27T17:04:58.922597875Z" level=info msg="connecting to shim f2027ba5be70fd65e9e733157711c80dc4d3d6745e2d8a6449a78baf0bab0353" address="unix:///run/containerd/s/cdb445b5fafa0fddd39db3ad7b48bfc743e434366f62bbd9dcb074ae039744c7" namespace=k8s.io protocol=ttrpc version=3 May 27 17:04:58.958190 systemd[1]: Started cri-containerd-f2027ba5be70fd65e9e733157711c80dc4d3d6745e2d8a6449a78baf0bab0353.scope - libcontainer container f2027ba5be70fd65e9e733157711c80dc4d3d6745e2d8a6449a78baf0bab0353. May 27 17:04:59.020589 containerd[1514]: time="2025-05-27T17:04:59.020536806Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-qr9sv,Uid:08d678b4-54b5-485e-b17f-8b9390d84045,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"f2027ba5be70fd65e9e733157711c80dc4d3d6745e2d8a6449a78baf0bab0353\"" May 27 17:04:59.024635 containerd[1514]: time="2025-05-27T17:04:59.024578965Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\"" May 27 17:04:59.421391 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3222970591.mount: Deactivated successfully. 
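The containerd entries above trace the CRI call sequence for kube-proxy: RunPodSandbox returns a sandbox id, CreateContainer runs inside that sandbox, and StartContainer reports success once systemd has started the matching cri-containerd-<id>.scope unit. A rough sketch of querying the same CRI endpoint over gRPC follows; the socket path /run/containerd/containerd.sock is containerd's usual default and is assumed here, not read from this log.

// cri-sandboxes.go - rough sketch: talk to the CRI endpoint the kubelet uses and
// list pod sandboxes, mirroring the RunPodSandbox/CreateContainer/StartContainer
// sequence recorded above. Socket path is containerd's default (an assumption).
package main

import (
	"context"
	"fmt"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	rt := runtimeapi.NewRuntimeServiceClient(conn)
	ctx := context.Background()

	ver, err := rt.Version(ctx, &runtimeapi.VersionRequest{})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(ver.RuntimeName, ver.RuntimeVersion) // the log above reports containerd v2.0.4

	pods, err := rt.ListPodSandbox(ctx, &runtimeapi.ListPodSandboxRequest{})
	if err != nil {
		log.Fatal(err)
	}
	for _, p := range pods.Items {
		fmt.Println(p.Metadata.Namespace+"/"+p.Metadata.Name, p.State)
	}
}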
May 27 17:04:59.455259 kubelet[2795]: I0527 17:04:59.455021 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-xj58t" podStartSLOduration=1.454996682 podStartE2EDuration="1.454996682s" podCreationTimestamp="2025-05-27 17:04:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:04:59.453353203 +0000 UTC m=+8.227086652" watchObservedRunningTime="2025-05-27 17:04:59.454996682 +0000 UTC m=+8.228730171" May 27 17:05:01.131131 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2442234420.mount: Deactivated successfully. May 27 17:05:02.080460 containerd[1514]: time="2025-05-27T17:05:02.079641023Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:02.081508 containerd[1514]: time="2025-05-27T17:05:02.081463183Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.0: active requests=0, bytes read=22143480" May 27 17:05:02.082345 containerd[1514]: time="2025-05-27T17:05:02.082316183Z" level=info msg="ImageCreate event name:\"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:02.085815 containerd[1514]: time="2025-05-27T17:05:02.085763622Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:02.086423 containerd[1514]: time="2025-05-27T17:05:02.086358102Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.0\" with image id \"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\", repo tag \"quay.io/tigera/operator:v1.38.0\", repo digest \"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\", size \"22139475\" in 3.061525777s" May 27 17:05:02.086423 containerd[1514]: time="2025-05-27T17:05:02.086422902Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\" returns image reference \"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\"" May 27 17:05:02.092124 containerd[1514]: time="2025-05-27T17:05:02.092051140Z" level=info msg="CreateContainer within sandbox \"f2027ba5be70fd65e9e733157711c80dc4d3d6745e2d8a6449a78baf0bab0353\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 27 17:05:02.104401 containerd[1514]: time="2025-05-27T17:05:02.103028417Z" level=info msg="Container 061fdbc7144543133df746c0a3eb05c92f58f8ced145123cfb4c141890f73bc6: CDI devices from CRI Config.CDIDevices: []" May 27 17:05:02.107303 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3566603100.mount: Deactivated successfully. 
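The ImageCreate and "Pulled image" entries above record containerd fetching quay.io/tigera/operator:v1.38.0 (size 22139475 bytes per the log) in roughly 3.06 s before the operator container is created. A rough sketch of the same pull through containerd's Go client, under the k8s.io namespace the CRI plugin uses; the socket path and namespace are the usual defaults and are assumptions, not values printed here.

// pull-operator.go - rough sketch: pull the tigera-operator image through containerd,
// the operation the PullImage/"Pulled image" entries above record. Uses the
// github.com/containerd/containerd (1.x) client module import path (an assumption;
// the 2.x tree moved the client package).
package main

import (
	"context"
	"fmt"
	"log"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// The CRI plugin keeps Kubernetes images and containers in the "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// Pull and unpack the same tag referenced in the log above.
	img, err := client.Pull(ctx, "quay.io/tigera/operator:v1.38.0", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("pulled", img.Name(), "digest", img.Target().Digest)
}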
May 27 17:05:02.138088 containerd[1514]: time="2025-05-27T17:05:02.138010847Z" level=info msg="CreateContainer within sandbox \"f2027ba5be70fd65e9e733157711c80dc4d3d6745e2d8a6449a78baf0bab0353\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"061fdbc7144543133df746c0a3eb05c92f58f8ced145123cfb4c141890f73bc6\"" May 27 17:05:02.140192 containerd[1514]: time="2025-05-27T17:05:02.138994487Z" level=info msg="StartContainer for \"061fdbc7144543133df746c0a3eb05c92f58f8ced145123cfb4c141890f73bc6\"" May 27 17:05:02.140324 containerd[1514]: time="2025-05-27T17:05:02.140211847Z" level=info msg="connecting to shim 061fdbc7144543133df746c0a3eb05c92f58f8ced145123cfb4c141890f73bc6" address="unix:///run/containerd/s/cdb445b5fafa0fddd39db3ad7b48bfc743e434366f62bbd9dcb074ae039744c7" protocol=ttrpc version=3 May 27 17:05:02.167794 systemd[1]: Started cri-containerd-061fdbc7144543133df746c0a3eb05c92f58f8ced145123cfb4c141890f73bc6.scope - libcontainer container 061fdbc7144543133df746c0a3eb05c92f58f8ced145123cfb4c141890f73bc6. May 27 17:05:02.213254 containerd[1514]: time="2025-05-27T17:05:02.213140867Z" level=info msg="StartContainer for \"061fdbc7144543133df746c0a3eb05c92f58f8ced145123cfb4c141890f73bc6\" returns successfully" May 27 17:05:08.620584 sudo[1913]: pam_unix(sudo:session): session closed for user root May 27 17:05:08.785269 sshd[1912]: Connection closed by 139.178.89.65 port 35706 May 27 17:05:08.785051 sshd-session[1910]: pam_unix(sshd:session): session closed for user core May 27 17:05:08.791873 systemd[1]: session-7.scope: Deactivated successfully. May 27 17:05:08.793495 systemd[1]: session-7.scope: Consumed 7.380s CPU time, 230M memory peak. May 27 17:05:08.794987 systemd[1]: sshd@6-91.99.121.210:22-139.178.89.65:35706.service: Deactivated successfully. May 27 17:05:08.801595 systemd-logind[1494]: Session 7 logged out. Waiting for processes to exit. May 27 17:05:08.803472 systemd-logind[1494]: Removed session 7. May 27 17:05:16.312875 kubelet[2795]: I0527 17:05:16.312797 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-844669ff44-qr9sv" podStartSLOduration=15.248920921 podStartE2EDuration="18.312759376s" podCreationTimestamp="2025-05-27 17:04:58 +0000 UTC" firstStartedPulling="2025-05-27 17:04:59.023555446 +0000 UTC m=+7.797288895" lastFinishedPulling="2025-05-27 17:05:02.087393901 +0000 UTC m=+10.861127350" observedRunningTime="2025-05-27 17:05:02.471441795 +0000 UTC m=+11.245175284" watchObservedRunningTime="2025-05-27 17:05:16.312759376 +0000 UTC m=+25.086492825" May 27 17:05:16.324510 systemd[1]: Created slice kubepods-besteffort-podefe333ff_f045_49e2_aa45_be755dbc7622.slice - libcontainer container kubepods-besteffort-podefe333ff_f045_49e2_aa45_be755dbc7622.slice. 
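As calico-typha and calico-node are scheduled below, the kubelet's plugin prober starts logging repeated FlexVolume failures for nodeagent~uds: the driver binary at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds is not installed yet, so the probe gets empty output where it expects a JSON status object. The flexvol-driver-host host path mounted into calico-node (listed below) is what typically populates that directory once the pod starts. A hypothetical minimal driver illustrating only the init handshake the prober expects, not Calico's actual uds binary, is sketched here.

// uds-stub.go - hypothetical minimal FlexVolume driver showing the "init" handshake
// the kubelet prober expects: a JSON status object on stdout. Illustrates the
// FlexVolume calling convention only; it is not Calico's real nodeagent~uds driver.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) < 2 {
		os.Exit(1)
	}
	switch os.Args[1] {
	case "init":
		// attach=false tells the kubelet not to expect attach/detach calls.
		out, _ := json.Marshal(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		})
		fmt.Println(string(out))
	default:
		// Anything this stub does not implement reports "Not supported".
		out, _ := json.Marshal(driverStatus{Status: "Not supported", Message: os.Args[1]})
		fmt.Println(string(out))
	}
}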
May 27 17:05:16.420454 kubelet[2795]: I0527 17:05:16.420323 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/efe333ff-f045-49e2-aa45-be755dbc7622-typha-certs\") pod \"calico-typha-57556dd858-vlgnd\" (UID: \"efe333ff-f045-49e2-aa45-be755dbc7622\") " pod="calico-system/calico-typha-57556dd858-vlgnd" May 27 17:05:16.420454 kubelet[2795]: I0527 17:05:16.420387 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efe333ff-f045-49e2-aa45-be755dbc7622-tigera-ca-bundle\") pod \"calico-typha-57556dd858-vlgnd\" (UID: \"efe333ff-f045-49e2-aa45-be755dbc7622\") " pod="calico-system/calico-typha-57556dd858-vlgnd" May 27 17:05:16.420454 kubelet[2795]: I0527 17:05:16.420407 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l46zx\" (UniqueName: \"kubernetes.io/projected/efe333ff-f045-49e2-aa45-be755dbc7622-kube-api-access-l46zx\") pod \"calico-typha-57556dd858-vlgnd\" (UID: \"efe333ff-f045-49e2-aa45-be755dbc7622\") " pod="calico-system/calico-typha-57556dd858-vlgnd" May 27 17:05:16.483352 systemd[1]: Created slice kubepods-besteffort-pod345f36e0_06c1_4f93_87c3_91fe856cc7ba.slice - libcontainer container kubepods-besteffort-pod345f36e0_06c1_4f93_87c3_91fe856cc7ba.slice. May 27 17:05:16.521345 kubelet[2795]: I0527 17:05:16.521294 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/345f36e0-06c1-4f93-87c3-91fe856cc7ba-tigera-ca-bundle\") pod \"calico-node-xf245\" (UID: \"345f36e0-06c1-4f93-87c3-91fe856cc7ba\") " pod="calico-system/calico-node-xf245" May 27 17:05:16.521345 kubelet[2795]: I0527 17:05:16.521357 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/345f36e0-06c1-4f93-87c3-91fe856cc7ba-cni-net-dir\") pod \"calico-node-xf245\" (UID: \"345f36e0-06c1-4f93-87c3-91fe856cc7ba\") " pod="calico-system/calico-node-xf245" May 27 17:05:16.521515 kubelet[2795]: I0527 17:05:16.521384 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/345f36e0-06c1-4f93-87c3-91fe856cc7ba-flexvol-driver-host\") pod \"calico-node-xf245\" (UID: \"345f36e0-06c1-4f93-87c3-91fe856cc7ba\") " pod="calico-system/calico-node-xf245" May 27 17:05:16.521515 kubelet[2795]: I0527 17:05:16.521400 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/345f36e0-06c1-4f93-87c3-91fe856cc7ba-cni-bin-dir\") pod \"calico-node-xf245\" (UID: \"345f36e0-06c1-4f93-87c3-91fe856cc7ba\") " pod="calico-system/calico-node-xf245" May 27 17:05:16.521515 kubelet[2795]: I0527 17:05:16.521416 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/345f36e0-06c1-4f93-87c3-91fe856cc7ba-lib-modules\") pod \"calico-node-xf245\" (UID: \"345f36e0-06c1-4f93-87c3-91fe856cc7ba\") " pod="calico-system/calico-node-xf245" May 27 17:05:16.521515 kubelet[2795]: I0527 17:05:16.521430 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/345f36e0-06c1-4f93-87c3-91fe856cc7ba-var-run-calico\") pod \"calico-node-xf245\" (UID: \"345f36e0-06c1-4f93-87c3-91fe856cc7ba\") " pod="calico-system/calico-node-xf245" May 27 17:05:16.521515 kubelet[2795]: I0527 17:05:16.521445 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/345f36e0-06c1-4f93-87c3-91fe856cc7ba-xtables-lock\") pod \"calico-node-xf245\" (UID: \"345f36e0-06c1-4f93-87c3-91fe856cc7ba\") " pod="calico-system/calico-node-xf245" May 27 17:05:16.521685 kubelet[2795]: I0527 17:05:16.521459 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/345f36e0-06c1-4f93-87c3-91fe856cc7ba-var-lib-calico\") pod \"calico-node-xf245\" (UID: \"345f36e0-06c1-4f93-87c3-91fe856cc7ba\") " pod="calico-system/calico-node-xf245" May 27 17:05:16.521685 kubelet[2795]: I0527 17:05:16.521487 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/345f36e0-06c1-4f93-87c3-91fe856cc7ba-policysync\") pod \"calico-node-xf245\" (UID: \"345f36e0-06c1-4f93-87c3-91fe856cc7ba\") " pod="calico-system/calico-node-xf245" May 27 17:05:16.521685 kubelet[2795]: I0527 17:05:16.521500 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtm7j\" (UniqueName: \"kubernetes.io/projected/345f36e0-06c1-4f93-87c3-91fe856cc7ba-kube-api-access-mtm7j\") pod \"calico-node-xf245\" (UID: \"345f36e0-06c1-4f93-87c3-91fe856cc7ba\") " pod="calico-system/calico-node-xf245" May 27 17:05:16.521685 kubelet[2795]: I0527 17:05:16.521537 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/345f36e0-06c1-4f93-87c3-91fe856cc7ba-node-certs\") pod \"calico-node-xf245\" (UID: \"345f36e0-06c1-4f93-87c3-91fe856cc7ba\") " pod="calico-system/calico-node-xf245" May 27 17:05:16.521685 kubelet[2795]: I0527 17:05:16.521554 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/345f36e0-06c1-4f93-87c3-91fe856cc7ba-cni-log-dir\") pod \"calico-node-xf245\" (UID: \"345f36e0-06c1-4f93-87c3-91fe856cc7ba\") " pod="calico-system/calico-node-xf245" May 27 17:05:16.626031 kubelet[2795]: E0527 17:05:16.625577 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.626031 kubelet[2795]: W0527 17:05:16.625668 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.626031 kubelet[2795]: E0527 17:05:16.625695 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:16.626881 kubelet[2795]: E0527 17:05:16.625868 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.626881 kubelet[2795]: W0527 17:05:16.626440 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.626881 kubelet[2795]: E0527 17:05:16.626460 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.626881 kubelet[2795]: E0527 17:05:16.626706 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.626881 kubelet[2795]: W0527 17:05:16.626717 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.626881 kubelet[2795]: E0527 17:05:16.626762 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.629486 kubelet[2795]: E0527 17:05:16.628449 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.629486 kubelet[2795]: W0527 17:05:16.628493 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.629486 kubelet[2795]: E0527 17:05:16.628513 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.633243 kubelet[2795]: E0527 17:05:16.631538 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.633243 kubelet[2795]: W0527 17:05:16.631563 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.633243 kubelet[2795]: E0527 17:05:16.631648 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.634321 kubelet[2795]: E0527 17:05:16.634276 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.634321 kubelet[2795]: W0527 17:05:16.634299 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.634321 kubelet[2795]: E0527 17:05:16.634321 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:16.635563 kubelet[2795]: E0527 17:05:16.634831 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.635563 kubelet[2795]: W0527 17:05:16.634849 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.635563 kubelet[2795]: E0527 17:05:16.634860 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.636000 kubelet[2795]: E0527 17:05:16.635981 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.636000 kubelet[2795]: W0527 17:05:16.635996 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.636199 kubelet[2795]: E0527 17:05:16.636177 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.637543 kubelet[2795]: E0527 17:05:16.637516 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.637543 kubelet[2795]: W0527 17:05:16.637536 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.637543 kubelet[2795]: E0527 17:05:16.637548 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.640957 kubelet[2795]: E0527 17:05:16.640920 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.640957 kubelet[2795]: W0527 17:05:16.640956 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.641083 kubelet[2795]: E0527 17:05:16.640980 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:16.647609 containerd[1514]: time="2025-05-27T17:05:16.647539775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-57556dd858-vlgnd,Uid:efe333ff-f045-49e2-aa45-be755dbc7622,Namespace:calico-system,Attempt:0,}" May 27 17:05:16.655574 kubelet[2795]: E0527 17:05:16.655441 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.656781 kubelet[2795]: W0527 17:05:16.655716 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.656781 kubelet[2795]: E0527 17:05:16.656731 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.685573 containerd[1514]: time="2025-05-27T17:05:16.685527966Z" level=info msg="connecting to shim a0f7b4b06ed0fe2df1368e08cd4bd6cec4a7f17fcba6743bfd3aab3dbdcb4893" address="unix:///run/containerd/s/70df3b1af357f04baf97d6d6535d00d11ed84eccca5bff30b1bb1cf926cfde52" namespace=k8s.io protocol=ttrpc version=3 May 27 17:05:16.720706 systemd[1]: Started cri-containerd-a0f7b4b06ed0fe2df1368e08cd4bd6cec4a7f17fcba6743bfd3aab3dbdcb4893.scope - libcontainer container a0f7b4b06ed0fe2df1368e08cd4bd6cec4a7f17fcba6743bfd3aab3dbdcb4893. May 27 17:05:16.730116 kubelet[2795]: E0527 17:05:16.730052 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sx6tt" podUID="6a4311e4-69eb-4fe5-a407-eaf19f301066" May 27 17:05:16.789400 containerd[1514]: time="2025-05-27T17:05:16.789098981Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-xf245,Uid:345f36e0-06c1-4f93-87c3-91fe856cc7ba,Namespace:calico-system,Attempt:0,}" May 27 17:05:16.799105 kubelet[2795]: E0527 17:05:16.797618 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.799105 kubelet[2795]: W0527 17:05:16.797650 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.799105 kubelet[2795]: E0527 17:05:16.797673 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.799105 kubelet[2795]: E0527 17:05:16.798221 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.799105 kubelet[2795]: W0527 17:05:16.798234 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.799105 kubelet[2795]: E0527 17:05:16.798506 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:16.799105 kubelet[2795]: E0527 17:05:16.799048 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.799105 kubelet[2795]: W0527 17:05:16.799060 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.799105 kubelet[2795]: E0527 17:05:16.799072 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.799934 kubelet[2795]: E0527 17:05:16.799719 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.799934 kubelet[2795]: W0527 17:05:16.799741 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.799934 kubelet[2795]: E0527 17:05:16.799754 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.800348 kubelet[2795]: E0527 17:05:16.800313 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.800430 kubelet[2795]: W0527 17:05:16.800384 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.800430 kubelet[2795]: E0527 17:05:16.800400 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.801395 kubelet[2795]: E0527 17:05:16.800917 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.801395 kubelet[2795]: W0527 17:05:16.800938 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.801395 kubelet[2795]: E0527 17:05:16.800951 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.801395 kubelet[2795]: E0527 17:05:16.801242 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.801395 kubelet[2795]: W0527 17:05:16.801314 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.801395 kubelet[2795]: E0527 17:05:16.801326 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:16.802027 kubelet[2795]: E0527 17:05:16.801900 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.802027 kubelet[2795]: W0527 17:05:16.801933 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.802027 kubelet[2795]: E0527 17:05:16.801945 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.803078 kubelet[2795]: E0527 17:05:16.802993 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.803078 kubelet[2795]: W0527 17:05:16.803026 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.803078 kubelet[2795]: E0527 17:05:16.803038 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.803611 kubelet[2795]: E0527 17:05:16.803543 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.803611 kubelet[2795]: W0527 17:05:16.803564 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.803611 kubelet[2795]: E0527 17:05:16.803577 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.803891 kubelet[2795]: E0527 17:05:16.803718 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.803891 kubelet[2795]: W0527 17:05:16.803734 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.803891 kubelet[2795]: E0527 17:05:16.803743 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.803891 kubelet[2795]: E0527 17:05:16.803874 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.803891 kubelet[2795]: W0527 17:05:16.803880 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.803891 kubelet[2795]: E0527 17:05:16.803887 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:16.804374 kubelet[2795]: E0527 17:05:16.804243 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.804374 kubelet[2795]: W0527 17:05:16.804262 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.804374 kubelet[2795]: E0527 17:05:16.804274 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.805084 kubelet[2795]: E0527 17:05:16.804744 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.805084 kubelet[2795]: W0527 17:05:16.804757 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.805084 kubelet[2795]: E0527 17:05:16.804798 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.805084 kubelet[2795]: E0527 17:05:16.804940 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.805084 kubelet[2795]: W0527 17:05:16.804948 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.805084 kubelet[2795]: E0527 17:05:16.804954 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.805638 kubelet[2795]: E0527 17:05:16.805300 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.805638 kubelet[2795]: W0527 17:05:16.805323 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.805638 kubelet[2795]: E0527 17:05:16.805334 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.805638 kubelet[2795]: E0527 17:05:16.805623 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.805638 kubelet[2795]: W0527 17:05:16.805636 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.805638 kubelet[2795]: E0527 17:05:16.805647 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:16.806689 kubelet[2795]: E0527 17:05:16.806142 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.806689 kubelet[2795]: W0527 17:05:16.806161 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.806689 kubelet[2795]: E0527 17:05:16.806192 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.806689 kubelet[2795]: E0527 17:05:16.806487 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.806689 kubelet[2795]: W0527 17:05:16.806497 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.806689 kubelet[2795]: E0527 17:05:16.806507 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.807680 kubelet[2795]: E0527 17:05:16.807411 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.807680 kubelet[2795]: W0527 17:05:16.807436 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.807680 kubelet[2795]: E0527 17:05:16.807449 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.825570 kubelet[2795]: E0527 17:05:16.825503 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.825570 kubelet[2795]: W0527 17:05:16.825543 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.825570 kubelet[2795]: E0527 17:05:16.825570 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:16.825761 kubelet[2795]: I0527 17:05:16.825629 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/6a4311e4-69eb-4fe5-a407-eaf19f301066-varrun\") pod \"csi-node-driver-sx6tt\" (UID: \"6a4311e4-69eb-4fe5-a407-eaf19f301066\") " pod="calico-system/csi-node-driver-sx6tt" May 27 17:05:16.827348 kubelet[2795]: E0527 17:05:16.825847 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.827348 kubelet[2795]: W0527 17:05:16.825865 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.827348 kubelet[2795]: E0527 17:05:16.825877 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.827348 kubelet[2795]: I0527 17:05:16.826021 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a4311e4-69eb-4fe5-a407-eaf19f301066-kubelet-dir\") pod \"csi-node-driver-sx6tt\" (UID: \"6a4311e4-69eb-4fe5-a407-eaf19f301066\") " pod="calico-system/csi-node-driver-sx6tt" May 27 17:05:16.827348 kubelet[2795]: E0527 17:05:16.826097 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.827348 kubelet[2795]: W0527 17:05:16.826105 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.827348 kubelet[2795]: E0527 17:05:16.826114 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.827348 kubelet[2795]: E0527 17:05:16.826274 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.827348 kubelet[2795]: W0527 17:05:16.826281 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.827749 kubelet[2795]: E0527 17:05:16.826289 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.827749 kubelet[2795]: E0527 17:05:16.826488 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.827749 kubelet[2795]: W0527 17:05:16.826498 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.827749 kubelet[2795]: E0527 17:05:16.826506 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:16.827749 kubelet[2795]: I0527 17:05:16.826636 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6a4311e4-69eb-4fe5-a407-eaf19f301066-registration-dir\") pod \"csi-node-driver-sx6tt\" (UID: \"6a4311e4-69eb-4fe5-a407-eaf19f301066\") " pod="calico-system/csi-node-driver-sx6tt" May 27 17:05:16.827749 kubelet[2795]: E0527 17:05:16.826764 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.827749 kubelet[2795]: W0527 17:05:16.826773 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.827749 kubelet[2795]: E0527 17:05:16.826802 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.827749 kubelet[2795]: E0527 17:05:16.827051 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.827958 kubelet[2795]: W0527 17:05:16.827060 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.827958 kubelet[2795]: E0527 17:05:16.827072 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.827958 kubelet[2795]: E0527 17:05:16.827457 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.827958 kubelet[2795]: W0527 17:05:16.827502 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.827958 kubelet[2795]: E0527 17:05:16.827514 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.827958 kubelet[2795]: I0527 17:05:16.827784 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6a4311e4-69eb-4fe5-a407-eaf19f301066-socket-dir\") pod \"csi-node-driver-sx6tt\" (UID: \"6a4311e4-69eb-4fe5-a407-eaf19f301066\") " pod="calico-system/csi-node-driver-sx6tt" May 27 17:05:16.828083 kubelet[2795]: E0527 17:05:16.827962 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.828083 kubelet[2795]: W0527 17:05:16.827988 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.828083 kubelet[2795]: E0527 17:05:16.827998 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:16.828639 kubelet[2795]: E0527 17:05:16.828462 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.828639 kubelet[2795]: W0527 17:05:16.828480 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.828639 kubelet[2795]: E0527 17:05:16.828491 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.829704 kubelet[2795]: E0527 17:05:16.828845 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.829704 kubelet[2795]: W0527 17:05:16.828864 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.829704 kubelet[2795]: E0527 17:05:16.828924 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.829704 kubelet[2795]: I0527 17:05:16.829063 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh4zw\" (UniqueName: \"kubernetes.io/projected/6a4311e4-69eb-4fe5-a407-eaf19f301066-kube-api-access-mh4zw\") pod \"csi-node-driver-sx6tt\" (UID: \"6a4311e4-69eb-4fe5-a407-eaf19f301066\") " pod="calico-system/csi-node-driver-sx6tt" May 27 17:05:16.829704 kubelet[2795]: E0527 17:05:16.829629 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.829704 kubelet[2795]: W0527 17:05:16.829644 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.829704 kubelet[2795]: E0527 17:05:16.829655 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.829988 kubelet[2795]: E0527 17:05:16.829873 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.829988 kubelet[2795]: W0527 17:05:16.829883 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.829988 kubelet[2795]: E0527 17:05:16.829891 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
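Interleaved with the FlexVolume noise, the reconciler_common.go lines record the kubelet verifying the volumes of pod csi-node-driver-sx6tt before mounting them: four host-path volumes (varrun, kubelet-dir, registration-dir, socket-dir) plus the projected service-account token volume kube-api-access-mh4zw, which the kubelet injects automatically. The sketch below shows how such host-path volumes would be declared with the core/v1 Go types; the volume names are taken from the log, while the host paths are typical values for a Calico CSI node driver and are assumptions, since the log records only the plugin kinds (host-path, projected), not the paths.

    // Sketch of the csi-node-driver-sx6tt host-path volumes using the
    // k8s.io/api core/v1 types. Names come from the log; paths are assumed.
    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        volumes := []corev1.Volume{
            {Name: "varrun", VolumeSource: corev1.VolumeSource{
                HostPath: &corev1.HostPathVolumeSource{Path: "/var/run"}}},
            {Name: "kubelet-dir", VolumeSource: corev1.VolumeSource{
                HostPath: &corev1.HostPathVolumeSource{Path: "/var/lib/kubelet"}}},
            {Name: "registration-dir", VolumeSource: corev1.VolumeSource{
                HostPath: &corev1.HostPathVolumeSource{Path: "/var/lib/kubelet/plugins_registry"}}},
            {Name: "socket-dir", VolumeSource: corev1.VolumeSource{
                HostPath: &corev1.HostPathVolumeSource{Path: "/var/lib/kubelet/plugins/csi.tigera.io"}}},
            // kube-api-access-mh4zw is the projected service-account token
            // volume the kubelet adds on its own; it needs no declaration here.
        }
        for _, v := range volumes {
            fmt.Printf("%-16s -> %s\n", v.Name, v.HostPath.Path)
        }
    }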
Error: unexpected end of JSON input" May 27 17:05:16.830233 kubelet[2795]: E0527 17:05:16.830098 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.830233 kubelet[2795]: W0527 17:05:16.830108 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.830233 kubelet[2795]: E0527 17:05:16.830116 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.830660 kubelet[2795]: E0527 17:05:16.830280 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.830660 kubelet[2795]: W0527 17:05:16.830288 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.830660 kubelet[2795]: E0527 17:05:16.830297 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.834013 containerd[1514]: time="2025-05-27T17:05:16.833762250Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-57556dd858-vlgnd,Uid:efe333ff-f045-49e2-aa45-be755dbc7622,Namespace:calico-system,Attempt:0,} returns sandbox id \"a0f7b4b06ed0fe2df1368e08cd4bd6cec4a7f17fcba6743bfd3aab3dbdcb4893\"" May 27 17:05:16.839387 containerd[1514]: time="2025-05-27T17:05:16.838508369Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\"" May 27 17:05:16.841245 containerd[1514]: time="2025-05-27T17:05:16.841066088Z" level=info msg="connecting to shim 66bdf6f18befaf96b85076e47dc7d6c036fd235152559d3d7c17b2387310e55c" address="unix:///run/containerd/s/9b7cb97e0dab7cb7540ae32cd84242f9070527a884fb1cdc8feea45ec78a6a5d" namespace=k8s.io protocol=ttrpc version=3 May 27 17:05:16.894561 systemd[1]: Started cri-containerd-66bdf6f18befaf96b85076e47dc7d6c036fd235152559d3d7c17b2387310e55c.scope - libcontainer container 66bdf6f18befaf96b85076e47dc7d6c036fd235152559d3d7c17b2387310e55c. May 27 17:05:16.933230 kubelet[2795]: E0527 17:05:16.933130 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.933230 kubelet[2795]: W0527 17:05:16.933214 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.933230 kubelet[2795]: E0527 17:05:16.933241 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
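The containerd lines above show the CRI side of pod startup: RunPodSandbox for calico-typha-57556dd858-vlgnd returns a sandbox ID, a pull of ghcr.io/flatcar/calico/typha:v3.30.0 begins, and containerd connects to a per-container shim over a ttrpc unix socket while systemd tracks the container as a transient cri-containerd-<id>.scope unit. Everything CRI-managed lives in containerd's k8s.io namespace, which is why the shim connection logs namespace=k8s.io. A small sketch, assuming the containerd 1.x Go client, of how one could list those containers directly on the node:

    // Sketch using the containerd 1.x Go client (client version is an
    // assumption) to list the CRI-managed containers referenced above.
    package main

    import (
        "context"
        "fmt"
        "log"

        containerd "github.com/containerd/containerd"
        "github.com/containerd/containerd/namespaces"
    )

    func main() {
        // Same containerd socket the kubelet talks to on this node.
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        // CRI pods and containers live in the "k8s.io" namespace, matching
        // the namespace=k8s.io field on the shim connection message.
        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

        containers, err := client.Containers(ctx)
        if err != nil {
            log.Fatal(err)
        }
        for _, c := range containers {
            info, err := c.Info(ctx)
            if err != nil {
                log.Fatal(err)
            }
            fmt.Printf("%s  image=%s\n", c.ID(), info.Image)
        }
    }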
Error: unexpected end of JSON input" May 27 17:05:16.934856 kubelet[2795]: E0527 17:05:16.934303 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.934856 kubelet[2795]: W0527 17:05:16.934328 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.934856 kubelet[2795]: E0527 17:05:16.934349 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.936082 kubelet[2795]: E0527 17:05:16.935002 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.936082 kubelet[2795]: W0527 17:05:16.935125 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.936082 kubelet[2795]: E0527 17:05:16.935150 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.936255 kubelet[2795]: E0527 17:05:16.936175 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.936255 kubelet[2795]: W0527 17:05:16.936189 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.936255 kubelet[2795]: E0527 17:05:16.936202 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.938631 kubelet[2795]: E0527 17:05:16.938442 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.938631 kubelet[2795]: W0527 17:05:16.938596 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.938631 kubelet[2795]: E0527 17:05:16.938639 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.939737 kubelet[2795]: E0527 17:05:16.939695 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.939737 kubelet[2795]: W0527 17:05:16.939732 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.939908 kubelet[2795]: E0527 17:05:16.939750 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:16.939957 kubelet[2795]: E0527 17:05:16.939938 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.939957 kubelet[2795]: W0527 17:05:16.939946 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.940017 kubelet[2795]: E0527 17:05:16.939969 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.940477 kubelet[2795]: E0527 17:05:16.940139 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.940477 kubelet[2795]: W0527 17:05:16.940155 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.940477 kubelet[2795]: E0527 17:05:16.940163 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.940618 kubelet[2795]: E0527 17:05:16.940515 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.940618 kubelet[2795]: W0527 17:05:16.940525 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.940844 kubelet[2795]: E0527 17:05:16.940799 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.941068 kubelet[2795]: E0527 17:05:16.941051 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.941109 kubelet[2795]: W0527 17:05:16.941068 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.941109 kubelet[2795]: E0527 17:05:16.941083 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.941693 kubelet[2795]: E0527 17:05:16.941673 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.941693 kubelet[2795]: W0527 17:05:16.941688 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.941770 kubelet[2795]: E0527 17:05:16.941700 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:16.943126 kubelet[2795]: E0527 17:05:16.943100 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.943126 kubelet[2795]: W0527 17:05:16.943120 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.943802 kubelet[2795]: E0527 17:05:16.943135 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.943802 kubelet[2795]: E0527 17:05:16.943695 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.943890 kubelet[2795]: W0527 17:05:16.943709 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.943890 kubelet[2795]: E0527 17:05:16.943838 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.945115 kubelet[2795]: E0527 17:05:16.945073 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.945537 kubelet[2795]: W0527 17:05:16.945512 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.945600 kubelet[2795]: E0527 17:05:16.945541 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.946838 kubelet[2795]: E0527 17:05:16.946666 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.946838 kubelet[2795]: W0527 17:05:16.946688 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.946838 kubelet[2795]: E0527 17:05:16.946704 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.947560 kubelet[2795]: E0527 17:05:16.947143 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.947560 kubelet[2795]: W0527 17:05:16.947161 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.947560 kubelet[2795]: E0527 17:05:16.947172 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:16.947560 kubelet[2795]: E0527 17:05:16.947527 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.947560 kubelet[2795]: W0527 17:05:16.947549 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.947560 kubelet[2795]: E0527 17:05:16.947560 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.948109 kubelet[2795]: E0527 17:05:16.948088 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.948109 kubelet[2795]: W0527 17:05:16.948103 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.948435 kubelet[2795]: E0527 17:05:16.948117 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.948852 kubelet[2795]: E0527 17:05:16.948753 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.948852 kubelet[2795]: W0527 17:05:16.948769 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.948852 kubelet[2795]: E0527 17:05:16.948784 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.950172 kubelet[2795]: E0527 17:05:16.950147 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.950256 kubelet[2795]: W0527 17:05:16.950242 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.950324 kubelet[2795]: E0527 17:05:16.950312 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.951106 kubelet[2795]: E0527 17:05:16.951080 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.951356 kubelet[2795]: W0527 17:05:16.951334 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.951655 kubelet[2795]: E0527 17:05:16.951571 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:16.953045 kubelet[2795]: E0527 17:05:16.952709 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.953287 kubelet[2795]: W0527 17:05:16.953186 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.953545 kubelet[2795]: E0527 17:05:16.953510 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.955807 kubelet[2795]: E0527 17:05:16.955699 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.956250 kubelet[2795]: W0527 17:05:16.956074 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.956250 kubelet[2795]: E0527 17:05:16.956109 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.957617 kubelet[2795]: E0527 17:05:16.957593 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.957731 kubelet[2795]: W0527 17:05:16.957717 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.959426 kubelet[2795]: E0527 17:05:16.959398 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:16.960356 containerd[1514]: time="2025-05-27T17:05:16.960301139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-xf245,Uid:345f36e0-06c1-4f93-87c3-91fe856cc7ba,Namespace:calico-system,Attempt:0,} returns sandbox id \"66bdf6f18befaf96b85076e47dc7d6c036fd235152559d3d7c17b2387310e55c\"" May 27 17:05:16.960685 kubelet[2795]: E0527 17:05:16.960667 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.961021 kubelet[2795]: W0527 17:05:16.960969 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.961021 kubelet[2795]: E0527 17:05:16.960992 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:16.976508 kubelet[2795]: E0527 17:05:16.976463 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:16.976835 kubelet[2795]: W0527 17:05:16.976758 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:16.976835 kubelet[2795]: E0527 17:05:16.976793 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:18.370956 kubelet[2795]: E0527 17:05:18.370847 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sx6tt" podUID="6a4311e4-69eb-4fe5-a407-eaf19f301066" May 27 17:05:18.767173 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2058496040.mount: Deactivated successfully. May 27 17:05:19.777017 containerd[1514]: time="2025-05-27T17:05:19.776956225Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:19.778178 containerd[1514]: time="2025-05-27T17:05:19.778132144Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.0: active requests=0, bytes read=33020269" May 27 17:05:19.779071 containerd[1514]: time="2025-05-27T17:05:19.779032064Z" level=info msg="ImageCreate event name:\"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:19.781845 containerd[1514]: time="2025-05-27T17:05:19.781785384Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:19.782938 containerd[1514]: time="2025-05-27T17:05:19.782891783Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.0\" with image id \"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\", size \"33020123\" in 2.944326654s" May 27 17:05:19.782938 containerd[1514]: time="2025-05-27T17:05:19.782926903Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\" returns image reference \"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\"" May 27 17:05:19.784857 containerd[1514]: time="2025-05-27T17:05:19.784644903Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\"" May 27 17:05:19.811949 containerd[1514]: time="2025-05-27T17:05:19.811879176Z" level=info msg="CreateContainer within sandbox \"a0f7b4b06ed0fe2df1368e08cd4bd6cec4a7f17fcba6743bfd3aab3dbdcb4893\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 27 17:05:19.821382 containerd[1514]: time="2025-05-27T17:05:19.820008414Z" level=info msg="Container 8c2e8ea1be64dc33d9d6cb44caff358f5bf7e876ce69b9e2d8e98a2d1db8c956: CDI devices from CRI Config.CDIDevices: []" May 27 17:05:19.833142 containerd[1514]: 
time="2025-05-27T17:05:19.833067971Z" level=info msg="CreateContainer within sandbox \"a0f7b4b06ed0fe2df1368e08cd4bd6cec4a7f17fcba6743bfd3aab3dbdcb4893\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"8c2e8ea1be64dc33d9d6cb44caff358f5bf7e876ce69b9e2d8e98a2d1db8c956\"" May 27 17:05:19.834857 containerd[1514]: time="2025-05-27T17:05:19.834555531Z" level=info msg="StartContainer for \"8c2e8ea1be64dc33d9d6cb44caff358f5bf7e876ce69b9e2d8e98a2d1db8c956\"" May 27 17:05:19.836209 containerd[1514]: time="2025-05-27T17:05:19.836170211Z" level=info msg="connecting to shim 8c2e8ea1be64dc33d9d6cb44caff358f5bf7e876ce69b9e2d8e98a2d1db8c956" address="unix:///run/containerd/s/70df3b1af357f04baf97d6d6535d00d11ed84eccca5bff30b1bb1cf926cfde52" protocol=ttrpc version=3 May 27 17:05:19.863929 systemd[1]: Started cri-containerd-8c2e8ea1be64dc33d9d6cb44caff358f5bf7e876ce69b9e2d8e98a2d1db8c956.scope - libcontainer container 8c2e8ea1be64dc33d9d6cb44caff358f5bf7e876ce69b9e2d8e98a2d1db8c956. May 27 17:05:19.920466 containerd[1514]: time="2025-05-27T17:05:19.920419591Z" level=info msg="StartContainer for \"8c2e8ea1be64dc33d9d6cb44caff358f5bf7e876ce69b9e2d8e98a2d1db8c956\" returns successfully" May 27 17:05:20.370189 kubelet[2795]: E0527 17:05:20.370083 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sx6tt" podUID="6a4311e4-69eb-4fe5-a407-eaf19f301066" May 27 17:05:20.532987 kubelet[2795]: E0527 17:05:20.532779 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.532987 kubelet[2795]: W0527 17:05:20.532818 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.532987 kubelet[2795]: E0527 17:05:20.532846 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.533350 kubelet[2795]: E0527 17:05:20.533328 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.533560 kubelet[2795]: W0527 17:05:20.533482 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.533891 kubelet[2795]: E0527 17:05:20.533731 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:20.534195 kubelet[2795]: E0527 17:05:20.534168 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.534290 kubelet[2795]: W0527 17:05:20.534275 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.534408 kubelet[2795]: E0527 17:05:20.534391 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.535438 kubelet[2795]: E0527 17:05:20.535420 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.535696 kubelet[2795]: W0527 17:05:20.535534 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.535696 kubelet[2795]: E0527 17:05:20.535556 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.535905 kubelet[2795]: E0527 17:05:20.535860 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.535905 kubelet[2795]: W0527 17:05:20.535876 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.536035 kubelet[2795]: E0527 17:05:20.536019 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.536707 kubelet[2795]: E0527 17:05:20.536641 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.536707 kubelet[2795]: W0527 17:05:20.536660 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.536707 kubelet[2795]: E0527 17:05:20.536673 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.537214 kubelet[2795]: E0527 17:05:20.537099 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.537214 kubelet[2795]: W0527 17:05:20.537115 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.537214 kubelet[2795]: E0527 17:05:20.537128 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:20.537426 kubelet[2795]: E0527 17:05:20.537413 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.537492 kubelet[2795]: W0527 17:05:20.537481 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.537555 kubelet[2795]: E0527 17:05:20.537536 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.537988 kubelet[2795]: E0527 17:05:20.537929 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.537988 kubelet[2795]: W0527 17:05:20.537946 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.537988 kubelet[2795]: E0527 17:05:20.537956 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.538276 kubelet[2795]: E0527 17:05:20.538217 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.538276 kubelet[2795]: W0527 17:05:20.538230 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.538276 kubelet[2795]: E0527 17:05:20.538240 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.538596 kubelet[2795]: E0527 17:05:20.538513 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.538596 kubelet[2795]: W0527 17:05:20.538526 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.538596 kubelet[2795]: E0527 17:05:20.538535 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.538907 kubelet[2795]: E0527 17:05:20.538888 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.539131 kubelet[2795]: W0527 17:05:20.538987 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.539131 kubelet[2795]: E0527 17:05:20.539009 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:20.539292 kubelet[2795]: E0527 17:05:20.539277 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.539403 kubelet[2795]: W0527 17:05:20.539353 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.539490 kubelet[2795]: E0527 17:05:20.539475 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.539916 kubelet[2795]: E0527 17:05:20.539844 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.539916 kubelet[2795]: W0527 17:05:20.539860 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.539916 kubelet[2795]: E0527 17:05:20.539874 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.540184 kubelet[2795]: E0527 17:05:20.540172 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.540299 kubelet[2795]: W0527 17:05:20.540233 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.540299 kubelet[2795]: E0527 17:05:20.540248 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.570980 kubelet[2795]: E0527 17:05:20.570871 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.570980 kubelet[2795]: W0527 17:05:20.570900 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.570980 kubelet[2795]: E0527 17:05:20.570923 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.571226 kubelet[2795]: E0527 17:05:20.571161 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.571226 kubelet[2795]: W0527 17:05:20.571171 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.571226 kubelet[2795]: E0527 17:05:20.571183 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:20.571453 kubelet[2795]: E0527 17:05:20.571434 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.571521 kubelet[2795]: W0527 17:05:20.571453 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.571521 kubelet[2795]: E0527 17:05:20.571490 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.571908 kubelet[2795]: E0527 17:05:20.571888 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.571985 kubelet[2795]: W0527 17:05:20.571926 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.571985 kubelet[2795]: E0527 17:05:20.571946 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.572226 kubelet[2795]: E0527 17:05:20.572208 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.572226 kubelet[2795]: W0527 17:05:20.572224 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.572339 kubelet[2795]: E0527 17:05:20.572238 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.572509 kubelet[2795]: E0527 17:05:20.572494 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.572560 kubelet[2795]: W0527 17:05:20.572510 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.572560 kubelet[2795]: E0527 17:05:20.572523 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.572794 kubelet[2795]: E0527 17:05:20.572770 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.572794 kubelet[2795]: W0527 17:05:20.572786 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.573229 kubelet[2795]: E0527 17:05:20.572806 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:20.573229 kubelet[2795]: E0527 17:05:20.573005 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.573229 kubelet[2795]: W0527 17:05:20.573019 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.573229 kubelet[2795]: E0527 17:05:20.573030 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.573229 kubelet[2795]: E0527 17:05:20.573202 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.573229 kubelet[2795]: W0527 17:05:20.573211 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.573229 kubelet[2795]: E0527 17:05:20.573222 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.573635 kubelet[2795]: E0527 17:05:20.573419 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.573635 kubelet[2795]: W0527 17:05:20.573431 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.573635 kubelet[2795]: E0527 17:05:20.573442 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.573635 kubelet[2795]: E0527 17:05:20.573618 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.573635 kubelet[2795]: W0527 17:05:20.573629 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.573635 kubelet[2795]: E0527 17:05:20.573640 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.573879 kubelet[2795]: E0527 17:05:20.573838 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.573879 kubelet[2795]: W0527 17:05:20.573848 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.573879 kubelet[2795]: E0527 17:05:20.573860 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:20.574333 kubelet[2795]: E0527 17:05:20.574293 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.574333 kubelet[2795]: W0527 17:05:20.574309 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.574333 kubelet[2795]: E0527 17:05:20.574322 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.574634 kubelet[2795]: E0527 17:05:20.574534 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.574634 kubelet[2795]: W0527 17:05:20.574545 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.574634 kubelet[2795]: E0527 17:05:20.574557 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.574792 kubelet[2795]: E0527 17:05:20.574762 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.574792 kubelet[2795]: W0527 17:05:20.574773 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.574894 kubelet[2795]: E0527 17:05:20.574819 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.575061 kubelet[2795]: E0527 17:05:20.575041 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.575061 kubelet[2795]: W0527 17:05:20.575058 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.575178 kubelet[2795]: E0527 17:05:20.575072 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:20.575412 kubelet[2795]: E0527 17:05:20.575349 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.575412 kubelet[2795]: W0527 17:05:20.575402 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.575551 kubelet[2795]: E0527 17:05:20.575419 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:05:20.576155 kubelet[2795]: E0527 17:05:20.576103 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:05:20.576155 kubelet[2795]: W0527 17:05:20.576127 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:05:20.576155 kubelet[2795]: E0527 17:05:20.576143 2795 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:05:21.327459 containerd[1514]: time="2025-05-27T17:05:21.326467499Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:21.328000 containerd[1514]: time="2025-05-27T17:05:21.327949499Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0: active requests=0, bytes read=4264304" May 27 17:05:21.328543 containerd[1514]: time="2025-05-27T17:05:21.328508779Z" level=info msg="ImageCreate event name:\"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:21.332093 containerd[1514]: time="2025-05-27T17:05:21.332033538Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:21.332867 containerd[1514]: time="2025-05-27T17:05:21.332823298Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" with image id \"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\", size \"5633505\" in 1.548145235s" May 27 17:05:21.332963 containerd[1514]: time="2025-05-27T17:05:21.332946738Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" returns image reference \"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\"" May 27 17:05:21.338670 containerd[1514]: time="2025-05-27T17:05:21.338616417Z" level=info msg="CreateContainer within sandbox \"66bdf6f18befaf96b85076e47dc7d6c036fd235152559d3d7c17b2387310e55c\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 27 17:05:21.350386 containerd[1514]: time="2025-05-27T17:05:21.346557655Z" level=info msg="Container 58b35d3283e171ec5c89b790e9018338ab87f8ed940fae6b10a6a7fd14df9f67: CDI devices from CRI Config.CDIDevices: []" May 27 17:05:21.360440 containerd[1514]: time="2025-05-27T17:05:21.360378211Z" level=info msg="CreateContainer within sandbox \"66bdf6f18befaf96b85076e47dc7d6c036fd235152559d3d7c17b2387310e55c\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"58b35d3283e171ec5c89b790e9018338ab87f8ed940fae6b10a6a7fd14df9f67\"" May 27 17:05:21.361881 containerd[1514]: time="2025-05-27T17:05:21.361836451Z" level=info msg="StartContainer for \"58b35d3283e171ec5c89b790e9018338ab87f8ed940fae6b10a6a7fd14df9f67\"" May 27 17:05:21.365523 containerd[1514]: time="2025-05-27T17:05:21.365461170Z" level=info msg="connecting to 
shim 58b35d3283e171ec5c89b790e9018338ab87f8ed940fae6b10a6a7fd14df9f67" address="unix:///run/containerd/s/9b7cb97e0dab7cb7540ae32cd84242f9070527a884fb1cdc8feea45ec78a6a5d" protocol=ttrpc version=3 May 27 17:05:21.400723 systemd[1]: Started cri-containerd-58b35d3283e171ec5c89b790e9018338ab87f8ed940fae6b10a6a7fd14df9f67.scope - libcontainer container 58b35d3283e171ec5c89b790e9018338ab87f8ed940fae6b10a6a7fd14df9f67. May 27 17:05:21.451627 containerd[1514]: time="2025-05-27T17:05:21.451239510Z" level=info msg="StartContainer for \"58b35d3283e171ec5c89b790e9018338ab87f8ed940fae6b10a6a7fd14df9f67\" returns successfully" May 27 17:05:21.462818 systemd[1]: cri-containerd-58b35d3283e171ec5c89b790e9018338ab87f8ed940fae6b10a6a7fd14df9f67.scope: Deactivated successfully. May 27 17:05:21.466632 containerd[1514]: time="2025-05-27T17:05:21.466520187Z" level=info msg="received exit event container_id:\"58b35d3283e171ec5c89b790e9018338ab87f8ed940fae6b10a6a7fd14df9f67\" id:\"58b35d3283e171ec5c89b790e9018338ab87f8ed940fae6b10a6a7fd14df9f67\" pid:3479 exited_at:{seconds:1748365521 nanos:465938387}" May 27 17:05:21.466862 containerd[1514]: time="2025-05-27T17:05:21.466585867Z" level=info msg="TaskExit event in podsandbox handler container_id:\"58b35d3283e171ec5c89b790e9018338ab87f8ed940fae6b10a6a7fd14df9f67\" id:\"58b35d3283e171ec5c89b790e9018338ab87f8ed940fae6b10a6a7fd14df9f67\" pid:3479 exited_at:{seconds:1748365521 nanos:465938387}" May 27 17:05:21.497976 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-58b35d3283e171ec5c89b790e9018338ab87f8ed940fae6b10a6a7fd14df9f67-rootfs.mount: Deactivated successfully. May 27 17:05:21.515326 kubelet[2795]: I0527 17:05:21.515294 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:05:21.548222 kubelet[2795]: I0527 17:05:21.545335 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-57556dd858-vlgnd" podStartSLOduration=2.599572394 podStartE2EDuration="5.545318488s" podCreationTimestamp="2025-05-27 17:05:16 +0000 UTC" firstStartedPulling="2025-05-27 17:05:16.838024809 +0000 UTC m=+25.611758258" lastFinishedPulling="2025-05-27 17:05:19.783770903 +0000 UTC m=+28.557504352" observedRunningTime="2025-05-27 17:05:20.525221848 +0000 UTC m=+29.298955297" watchObservedRunningTime="2025-05-27 17:05:21.545318488 +0000 UTC m=+30.319051937" May 27 17:05:22.371247 kubelet[2795]: E0527 17:05:22.371183 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sx6tt" podUID="6a4311e4-69eb-4fe5-a407-eaf19f301066" May 27 17:05:22.525665 containerd[1514]: time="2025-05-27T17:05:22.525619139Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\"" May 27 17:05:24.370639 kubelet[2795]: E0527 17:05:24.370553 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sx6tt" podUID="6a4311e4-69eb-4fe5-a407-eaf19f301066" May 27 17:05:26.174978 containerd[1514]: time="2025-05-27T17:05:26.174617541Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:26.176601 
containerd[1514]: time="2025-05-27T17:05:26.176489380Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.0: active requests=0, bytes read=65748976" May 27 17:05:26.177781 containerd[1514]: time="2025-05-27T17:05:26.177663940Z" level=info msg="ImageCreate event name:\"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:26.182197 containerd[1514]: time="2025-05-27T17:05:26.182116019Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:26.182976 containerd[1514]: time="2025-05-27T17:05:26.182855459Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.0\" with image id \"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\", size \"67118217\" in 3.65719176s" May 27 17:05:26.182976 containerd[1514]: time="2025-05-27T17:05:26.182896019Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\" returns image reference \"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\"" May 27 17:05:26.192057 containerd[1514]: time="2025-05-27T17:05:26.192016737Z" level=info msg="CreateContainer within sandbox \"66bdf6f18befaf96b85076e47dc7d6c036fd235152559d3d7c17b2387310e55c\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 27 17:05:26.206066 containerd[1514]: time="2025-05-27T17:05:26.205584054Z" level=info msg="Container 9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6: CDI devices from CRI Config.CDIDevices: []" May 27 17:05:26.226683 containerd[1514]: time="2025-05-27T17:05:26.226616609Z" level=info msg="CreateContainer within sandbox \"66bdf6f18befaf96b85076e47dc7d6c036fd235152559d3d7c17b2387310e55c\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6\"" May 27 17:05:26.229405 containerd[1514]: time="2025-05-27T17:05:26.227515809Z" level=info msg="StartContainer for \"9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6\"" May 27 17:05:26.231098 containerd[1514]: time="2025-05-27T17:05:26.231057568Z" level=info msg="connecting to shim 9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6" address="unix:///run/containerd/s/9b7cb97e0dab7cb7540ae32cd84242f9070527a884fb1cdc8feea45ec78a6a5d" protocol=ttrpc version=3 May 27 17:05:26.256603 systemd[1]: Started cri-containerd-9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6.scope - libcontainer container 9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6. 
May 27 17:05:26.302494 containerd[1514]: time="2025-05-27T17:05:26.302435392Z" level=info msg="StartContainer for \"9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6\" returns successfully" May 27 17:05:26.372225 kubelet[2795]: E0527 17:05:26.372036 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sx6tt" podUID="6a4311e4-69eb-4fe5-a407-eaf19f301066" May 27 17:05:26.864801 containerd[1514]: time="2025-05-27T17:05:26.864741544Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 27 17:05:26.868462 systemd[1]: cri-containerd-9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6.scope: Deactivated successfully. May 27 17:05:26.871518 systemd[1]: cri-containerd-9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6.scope: Consumed 506ms CPU time, 185.8M memory peak, 165.5M written to disk. May 27 17:05:26.873672 containerd[1514]: time="2025-05-27T17:05:26.873327982Z" level=info msg="received exit event container_id:\"9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6\" id:\"9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6\" pid:3538 exited_at:{seconds:1748365526 nanos:871525303}" May 27 17:05:26.874653 containerd[1514]: time="2025-05-27T17:05:26.874270102Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6\" id:\"9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6\" pid:3538 exited_at:{seconds:1748365526 nanos:871525303}" May 27 17:05:26.900122 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6-rootfs.mount: Deactivated successfully. May 27 17:05:26.924392 kubelet[2795]: I0527 17:05:26.923793 2795 kubelet_node_status.go:501] "Fast updating node status as it just became ready" May 27 17:05:27.019271 systemd[1]: Created slice kubepods-besteffort-pod2fe1988c_87b9_4d51_b169_ff7ad6d492ef.slice - libcontainer container kubepods-besteffort-pod2fe1988c_87b9_4d51_b169_ff7ad6d492ef.slice. May 27 17:05:27.037799 systemd[1]: Created slice kubepods-burstable-pod05f26e54_5384_4d57_8899_41e4fa10bcb5.slice - libcontainer container kubepods-burstable-pod05f26e54_5384_4d57_8899_41e4fa10bcb5.slice. May 27 17:05:27.052860 systemd[1]: Created slice kubepods-besteffort-pod2f1591b1_a33b_436b_97d2_a3ba5959f3d8.slice - libcontainer container kubepods-besteffort-pod2f1591b1_a33b_436b_97d2_a3ba5959f3d8.slice. May 27 17:05:27.063201 systemd[1]: Created slice kubepods-besteffort-pod45d42809_0456_4761_94f0_815274f2dcfd.slice - libcontainer container kubepods-besteffort-pod45d42809_0456_4761_94f0_815274f2dcfd.slice. May 27 17:05:27.074991 systemd[1]: Created slice kubepods-burstable-podb52118ae_5b8b_4a0f_8f30_dd827acc27ac.slice - libcontainer container kubepods-burstable-podb52118ae_5b8b_4a0f_8f30_dd827acc27ac.slice. May 27 17:05:27.086641 systemd[1]: Created slice kubepods-besteffort-podea0a16a8_4944_4683_af93_caab94c7fa78.slice - libcontainer container kubepods-besteffort-podea0a16a8_4944_4683_af93_caab94c7fa78.slice. 
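The "network is not ready ... cni plugin not initialized" messages above and the sandbox setup failures that follow all trace back to two node-local files that only exist once Calico is fully up: a CNI config under /etc/cni/net.d (written by the install-cni container started above) and /var/lib/calico/nodename (written by calico-node itself). A small sketch, using only those two paths taken from the log messages, of how one might poll for Calico readiness on the node; the polling loop and timeout are illustrative assumptions.

```python
#!/usr/bin/env python3
# Hedged sketch: wait for the two node-local artifacts the errors in this log
# complain about. Paths come from the log messages; the polling loop and
# timeout are illustrative assumptions, not part of Calico itself.
import time
from pathlib import Path

CNI_CONF_DIR = Path("/etc/cni/net.d")
CALICO_NODENAME = Path("/var/lib/calico/nodename")

def calico_initialised() -> bool:
    has_cni_conf = CNI_CONF_DIR.is_dir() and any(CNI_CONF_DIR.iterdir())
    return has_cni_conf and CALICO_NODENAME.is_file()

def wait_for_calico(timeout_s: float = 120.0, interval_s: float = 2.0) -> bool:
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if calico_initialised():
            return True
        time.sleep(interval_s)
    return False

if __name__ == "__main__":
    print("calico initialised" if wait_for_calico() else "calico still not ready")
```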
May 27 17:05:27.098037 systemd[1]: Created slice kubepods-besteffort-pod58b197a4_a9f4_4e6c_b4d2_415b559fda41.slice - libcontainer container kubepods-besteffort-pod58b197a4_a9f4_4e6c_b4d2_415b559fda41.slice. May 27 17:05:27.107846 systemd[1]: Created slice kubepods-besteffort-podf87305a5_4340_4e54_9029_d480976de92f.slice - libcontainer container kubepods-besteffort-podf87305a5_4340_4e54_9029_d480976de92f.slice. May 27 17:05:27.124395 kubelet[2795]: I0527 17:05:27.124152 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlnvw\" (UniqueName: \"kubernetes.io/projected/2f1591b1-a33b-436b-97d2-a3ba5959f3d8-kube-api-access-qlnvw\") pod \"calico-apiserver-6c66494f6d-gc4dc\" (UID: \"2f1591b1-a33b-436b-97d2-a3ba5959f3d8\") " pod="calico-apiserver/calico-apiserver-6c66494f6d-gc4dc" May 27 17:05:27.124628 kubelet[2795]: I0527 17:05:27.124558 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-472x8\" (UniqueName: \"kubernetes.io/projected/b52118ae-5b8b-4a0f-8f30-dd827acc27ac-kube-api-access-472x8\") pod \"coredns-674b8bbfcf-gv5xq\" (UID: \"b52118ae-5b8b-4a0f-8f30-dd827acc27ac\") " pod="kube-system/coredns-674b8bbfcf-gv5xq" May 27 17:05:27.124759 kubelet[2795]: I0527 17:05:27.124717 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f87305a5-4340-4e54-9029-d480976de92f-calico-apiserver-certs\") pod \"calico-apiserver-6c66494f6d-72shk\" (UID: \"f87305a5-4340-4e54-9029-d480976de92f\") " pod="calico-apiserver/calico-apiserver-6c66494f6d-72shk" May 27 17:05:27.124825 kubelet[2795]: I0527 17:05:27.124813 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ea0a16a8-4944-4683-af93-caab94c7fa78-calico-apiserver-certs\") pod \"calico-apiserver-5f784fdc78-tq7f7\" (UID: \"ea0a16a8-4944-4683-af93-caab94c7fa78\") " pod="calico-apiserver/calico-apiserver-5f784fdc78-tq7f7" May 27 17:05:27.124991 kubelet[2795]: I0527 17:05:27.124977 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/45d42809-0456-4761-94f0-815274f2dcfd-goldmane-key-pair\") pod \"goldmane-78d55f7ddc-lflfc\" (UID: \"45d42809-0456-4761-94f0-815274f2dcfd\") " pod="calico-system/goldmane-78d55f7ddc-lflfc" May 27 17:05:27.125083 kubelet[2795]: I0527 17:05:27.125068 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2fe1988c-87b9-4d51-b169-ff7ad6d492ef-whisker-backend-key-pair\") pod \"whisker-5dddcc4c9c-b98v7\" (UID: \"2fe1988c-87b9-4d51-b169-ff7ad6d492ef\") " pod="calico-system/whisker-5dddcc4c9c-b98v7" May 27 17:05:27.125255 kubelet[2795]: I0527 17:05:27.125195 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgn2d\" (UniqueName: \"kubernetes.io/projected/f87305a5-4340-4e54-9029-d480976de92f-kube-api-access-bgn2d\") pod \"calico-apiserver-6c66494f6d-72shk\" (UID: \"f87305a5-4340-4e54-9029-d480976de92f\") " pod="calico-apiserver/calico-apiserver-6c66494f6d-72shk" May 27 17:05:27.125255 kubelet[2795]: I0527 17:05:27.125222 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2dkff\" (UniqueName: \"kubernetes.io/projected/ea0a16a8-4944-4683-af93-caab94c7fa78-kube-api-access-2dkff\") pod \"calico-apiserver-5f784fdc78-tq7f7\" (UID: \"ea0a16a8-4944-4683-af93-caab94c7fa78\") " pod="calico-apiserver/calico-apiserver-5f784fdc78-tq7f7" May 27 17:05:27.125391 kubelet[2795]: I0527 17:05:27.125378 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58b197a4-a9f4-4e6c-b4d2-415b559fda41-tigera-ca-bundle\") pod \"calico-kube-controllers-65b8755655-qrgpn\" (UID: \"58b197a4-a9f4-4e6c-b4d2-415b559fda41\") " pod="calico-system/calico-kube-controllers-65b8755655-qrgpn" May 27 17:05:27.125548 kubelet[2795]: I0527 17:05:27.125467 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-299nx\" (UniqueName: \"kubernetes.io/projected/58b197a4-a9f4-4e6c-b4d2-415b559fda41-kube-api-access-299nx\") pod \"calico-kube-controllers-65b8755655-qrgpn\" (UID: \"58b197a4-a9f4-4e6c-b4d2-415b559fda41\") " pod="calico-system/calico-kube-controllers-65b8755655-qrgpn" May 27 17:05:27.125838 kubelet[2795]: I0527 17:05:27.125809 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b52118ae-5b8b-4a0f-8f30-dd827acc27ac-config-volume\") pod \"coredns-674b8bbfcf-gv5xq\" (UID: \"b52118ae-5b8b-4a0f-8f30-dd827acc27ac\") " pod="kube-system/coredns-674b8bbfcf-gv5xq" May 27 17:05:27.126090 kubelet[2795]: I0527 17:05:27.126052 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/2f1591b1-a33b-436b-97d2-a3ba5959f3d8-calico-apiserver-certs\") pod \"calico-apiserver-6c66494f6d-gc4dc\" (UID: \"2f1591b1-a33b-436b-97d2-a3ba5959f3d8\") " pod="calico-apiserver/calico-apiserver-6c66494f6d-gc4dc" May 27 17:05:27.126233 kubelet[2795]: I0527 17:05:27.126178 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05f26e54-5384-4d57-8899-41e4fa10bcb5-config-volume\") pod \"coredns-674b8bbfcf-gsfg7\" (UID: \"05f26e54-5384-4d57-8899-41e4fa10bcb5\") " pod="kube-system/coredns-674b8bbfcf-gsfg7" May 27 17:05:27.126233 kubelet[2795]: I0527 17:05:27.126205 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45d42809-0456-4761-94f0-815274f2dcfd-config\") pod \"goldmane-78d55f7ddc-lflfc\" (UID: \"45d42809-0456-4761-94f0-815274f2dcfd\") " pod="calico-system/goldmane-78d55f7ddc-lflfc" May 27 17:05:27.126389 kubelet[2795]: I0527 17:05:27.126353 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pntql\" (UniqueName: \"kubernetes.io/projected/2fe1988c-87b9-4d51-b169-ff7ad6d492ef-kube-api-access-pntql\") pod \"whisker-5dddcc4c9c-b98v7\" (UID: \"2fe1988c-87b9-4d51-b169-ff7ad6d492ef\") " pod="calico-system/whisker-5dddcc4c9c-b98v7" May 27 17:05:27.126490 kubelet[2795]: I0527 17:05:27.126473 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn2zd\" (UniqueName: \"kubernetes.io/projected/05f26e54-5384-4d57-8899-41e4fa10bcb5-kube-api-access-kn2zd\") pod \"coredns-674b8bbfcf-gsfg7\" (UID: 
\"05f26e54-5384-4d57-8899-41e4fa10bcb5\") " pod="kube-system/coredns-674b8bbfcf-gsfg7" May 27 17:05:27.126676 kubelet[2795]: I0527 17:05:27.126618 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45d42809-0456-4761-94f0-815274f2dcfd-goldmane-ca-bundle\") pod \"goldmane-78d55f7ddc-lflfc\" (UID: \"45d42809-0456-4761-94f0-815274f2dcfd\") " pod="calico-system/goldmane-78d55f7ddc-lflfc" May 27 17:05:27.126676 kubelet[2795]: I0527 17:05:27.126649 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mqjv\" (UniqueName: \"kubernetes.io/projected/45d42809-0456-4761-94f0-815274f2dcfd-kube-api-access-2mqjv\") pod \"goldmane-78d55f7ddc-lflfc\" (UID: \"45d42809-0456-4761-94f0-815274f2dcfd\") " pod="calico-system/goldmane-78d55f7ddc-lflfc" May 27 17:05:27.126837 kubelet[2795]: I0527 17:05:27.126784 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe1988c-87b9-4d51-b169-ff7ad6d492ef-whisker-ca-bundle\") pod \"whisker-5dddcc4c9c-b98v7\" (UID: \"2fe1988c-87b9-4d51-b169-ff7ad6d492ef\") " pod="calico-system/whisker-5dddcc4c9c-b98v7" May 27 17:05:27.332980 containerd[1514]: time="2025-05-27T17:05:27.332920639Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5dddcc4c9c-b98v7,Uid:2fe1988c-87b9-4d51-b169-ff7ad6d492ef,Namespace:calico-system,Attempt:0,}" May 27 17:05:27.348679 containerd[1514]: time="2025-05-27T17:05:27.348587395Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gsfg7,Uid:05f26e54-5384-4d57-8899-41e4fa10bcb5,Namespace:kube-system,Attempt:0,}" May 27 17:05:27.359377 containerd[1514]: time="2025-05-27T17:05:27.359280473Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c66494f6d-gc4dc,Uid:2f1591b1-a33b-436b-97d2-a3ba5959f3d8,Namespace:calico-apiserver,Attempt:0,}" May 27 17:05:27.372317 containerd[1514]: time="2025-05-27T17:05:27.371173550Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-lflfc,Uid:45d42809-0456-4761-94f0-815274f2dcfd,Namespace:calico-system,Attempt:0,}" May 27 17:05:27.380854 containerd[1514]: time="2025-05-27T17:05:27.380748588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gv5xq,Uid:b52118ae-5b8b-4a0f-8f30-dd827acc27ac,Namespace:kube-system,Attempt:0,}" May 27 17:05:27.394253 containerd[1514]: time="2025-05-27T17:05:27.394214025Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f784fdc78-tq7f7,Uid:ea0a16a8-4944-4683-af93-caab94c7fa78,Namespace:calico-apiserver,Attempt:0,}" May 27 17:05:27.407709 containerd[1514]: time="2025-05-27T17:05:27.407659622Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65b8755655-qrgpn,Uid:58b197a4-a9f4-4e6c-b4d2-415b559fda41,Namespace:calico-system,Attempt:0,}" May 27 17:05:27.414409 containerd[1514]: time="2025-05-27T17:05:27.414344220Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c66494f6d-72shk,Uid:f87305a5-4340-4e54-9029-d480976de92f,Namespace:calico-apiserver,Attempt:0,}" May 27 17:05:27.564391 containerd[1514]: time="2025-05-27T17:05:27.561439867Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\"" May 27 17:05:27.575052 containerd[1514]: time="2025-05-27T17:05:27.574914224Z" level=error msg="Failed to 
destroy network for sandbox \"e106b2d2513a819230aa6743fae3fcad4f6a25312cac34ff6f8e08a7e4170d8f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.581686 containerd[1514]: time="2025-05-27T17:05:27.581514903Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gsfg7,Uid:05f26e54-5384-4d57-8899-41e4fa10bcb5,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e106b2d2513a819230aa6743fae3fcad4f6a25312cac34ff6f8e08a7e4170d8f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.587428 kubelet[2795]: E0527 17:05:27.586702 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e106b2d2513a819230aa6743fae3fcad4f6a25312cac34ff6f8e08a7e4170d8f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.587428 kubelet[2795]: E0527 17:05:27.586782 2795 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e106b2d2513a819230aa6743fae3fcad4f6a25312cac34ff6f8e08a7e4170d8f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-gsfg7" May 27 17:05:27.587428 kubelet[2795]: E0527 17:05:27.586806 2795 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e106b2d2513a819230aa6743fae3fcad4f6a25312cac34ff6f8e08a7e4170d8f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-gsfg7" May 27 17:05:27.587897 kubelet[2795]: E0527 17:05:27.586856 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-gsfg7_kube-system(05f26e54-5384-4d57-8899-41e4fa10bcb5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-gsfg7_kube-system(05f26e54-5384-4d57-8899-41e4fa10bcb5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e106b2d2513a819230aa6743fae3fcad4f6a25312cac34ff6f8e08a7e4170d8f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-gsfg7" podUID="05f26e54-5384-4d57-8899-41e4fa10bcb5" May 27 17:05:27.609690 containerd[1514]: time="2025-05-27T17:05:27.609559736Z" level=error msg="Failed to destroy network for sandbox \"21440f1aec988d00ae04f6c6c477fabcaf650b6c77d4e0ed565c5db5b20f548c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.612958 containerd[1514]: time="2025-05-27T17:05:27.612896455Z" 
level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5dddcc4c9c-b98v7,Uid:2fe1988c-87b9-4d51-b169-ff7ad6d492ef,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"21440f1aec988d00ae04f6c6c477fabcaf650b6c77d4e0ed565c5db5b20f548c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.613460 kubelet[2795]: E0527 17:05:27.613307 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21440f1aec988d00ae04f6c6c477fabcaf650b6c77d4e0ed565c5db5b20f548c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.614281 kubelet[2795]: E0527 17:05:27.613865 2795 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21440f1aec988d00ae04f6c6c477fabcaf650b6c77d4e0ed565c5db5b20f548c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5dddcc4c9c-b98v7" May 27 17:05:27.614281 kubelet[2795]: E0527 17:05:27.613897 2795 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21440f1aec988d00ae04f6c6c477fabcaf650b6c77d4e0ed565c5db5b20f548c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5dddcc4c9c-b98v7" May 27 17:05:27.614281 kubelet[2795]: E0527 17:05:27.613945 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5dddcc4c9c-b98v7_calico-system(2fe1988c-87b9-4d51-b169-ff7ad6d492ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5dddcc4c9c-b98v7_calico-system(2fe1988c-87b9-4d51-b169-ff7ad6d492ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"21440f1aec988d00ae04f6c6c477fabcaf650b6c77d4e0ed565c5db5b20f548c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5dddcc4c9c-b98v7" podUID="2fe1988c-87b9-4d51-b169-ff7ad6d492ef" May 27 17:05:27.623860 containerd[1514]: time="2025-05-27T17:05:27.623816093Z" level=error msg="Failed to destroy network for sandbox \"21935be5468c46ceace405386bd4d4b0311b3190041d34162593ea00fd327ba5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.625637 containerd[1514]: time="2025-05-27T17:05:27.625492573Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gv5xq,Uid:b52118ae-5b8b-4a0f-8f30-dd827acc27ac,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"21935be5468c46ceace405386bd4d4b0311b3190041d34162593ea00fd327ba5\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.625810 kubelet[2795]: E0527 17:05:27.625756 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21935be5468c46ceace405386bd4d4b0311b3190041d34162593ea00fd327ba5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.625854 kubelet[2795]: E0527 17:05:27.625813 2795 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21935be5468c46ceace405386bd4d4b0311b3190041d34162593ea00fd327ba5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-gv5xq" May 27 17:05:27.625854 kubelet[2795]: E0527 17:05:27.625832 2795 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21935be5468c46ceace405386bd4d4b0311b3190041d34162593ea00fd327ba5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-gv5xq" May 27 17:05:27.626028 kubelet[2795]: E0527 17:05:27.625885 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-gv5xq_kube-system(b52118ae-5b8b-4a0f-8f30-dd827acc27ac)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-gv5xq_kube-system(b52118ae-5b8b-4a0f-8f30-dd827acc27ac)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"21935be5468c46ceace405386bd4d4b0311b3190041d34162593ea00fd327ba5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-gv5xq" podUID="b52118ae-5b8b-4a0f-8f30-dd827acc27ac" May 27 17:05:27.647865 containerd[1514]: time="2025-05-27T17:05:27.646703928Z" level=error msg="Failed to destroy network for sandbox \"ed0922c7ea3779f0e66664f3d427f42b0bce26886c4fa46118baf2c21656acad\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.653211 containerd[1514]: time="2025-05-27T17:05:27.653155326Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f784fdc78-tq7f7,Uid:ea0a16a8-4944-4683-af93-caab94c7fa78,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed0922c7ea3779f0e66664f3d427f42b0bce26886c4fa46118baf2c21656acad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.653995 kubelet[2795]: E0527 17:05:27.653929 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"ed0922c7ea3779f0e66664f3d427f42b0bce26886c4fa46118baf2c21656acad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.653995 kubelet[2795]: E0527 17:05:27.653993 2795 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed0922c7ea3779f0e66664f3d427f42b0bce26886c4fa46118baf2c21656acad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5f784fdc78-tq7f7" May 27 17:05:27.654216 kubelet[2795]: E0527 17:05:27.654013 2795 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed0922c7ea3779f0e66664f3d427f42b0bce26886c4fa46118baf2c21656acad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5f784fdc78-tq7f7" May 27 17:05:27.654216 kubelet[2795]: E0527 17:05:27.654065 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5f784fdc78-tq7f7_calico-apiserver(ea0a16a8-4944-4683-af93-caab94c7fa78)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5f784fdc78-tq7f7_calico-apiserver(ea0a16a8-4944-4683-af93-caab94c7fa78)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ed0922c7ea3779f0e66664f3d427f42b0bce26886c4fa46118baf2c21656acad\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5f784fdc78-tq7f7" podUID="ea0a16a8-4944-4683-af93-caab94c7fa78" May 27 17:05:27.655475 containerd[1514]: time="2025-05-27T17:05:27.655124926Z" level=error msg="Failed to destroy network for sandbox \"9ca2152fad3ddc3fe477bb71612de45e807f2128dcbaccc0194214bc11b175a4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.656131 containerd[1514]: time="2025-05-27T17:05:27.656019766Z" level=error msg="Failed to destroy network for sandbox \"c97573a4c73ed18e8b80711fff54bf6e159d4b7becaeae24c2b3db2d1aebae10\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.657988 containerd[1514]: time="2025-05-27T17:05:27.656279606Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-lflfc,Uid:45d42809-0456-4761-94f0-815274f2dcfd,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ca2152fad3ddc3fe477bb71612de45e807f2128dcbaccc0194214bc11b175a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.658391 kubelet[2795]: E0527 17:05:27.658200 2795 log.go:32] "RunPodSandbox from 
runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ca2152fad3ddc3fe477bb71612de45e807f2128dcbaccc0194214bc11b175a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.658391 kubelet[2795]: E0527 17:05:27.658250 2795 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ca2152fad3ddc3fe477bb71612de45e807f2128dcbaccc0194214bc11b175a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-lflfc" May 27 17:05:27.658391 kubelet[2795]: E0527 17:05:27.658271 2795 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ca2152fad3ddc3fe477bb71612de45e807f2128dcbaccc0194214bc11b175a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-lflfc" May 27 17:05:27.658503 kubelet[2795]: E0527 17:05:27.658323 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-78d55f7ddc-lflfc_calico-system(45d42809-0456-4761-94f0-815274f2dcfd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-78d55f7ddc-lflfc_calico-system(45d42809-0456-4761-94f0-815274f2dcfd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9ca2152fad3ddc3fe477bb71612de45e807f2128dcbaccc0194214bc11b175a4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:05:27.659250 containerd[1514]: time="2025-05-27T17:05:27.659073085Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65b8755655-qrgpn,Uid:58b197a4-a9f4-4e6c-b4d2-415b559fda41,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c97573a4c73ed18e8b80711fff54bf6e159d4b7becaeae24c2b3db2d1aebae10\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.659621 kubelet[2795]: E0527 17:05:27.659469 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c97573a4c73ed18e8b80711fff54bf6e159d4b7becaeae24c2b3db2d1aebae10\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.659621 kubelet[2795]: E0527 17:05:27.659512 2795 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c97573a4c73ed18e8b80711fff54bf6e159d4b7becaeae24c2b3db2d1aebae10\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-65b8755655-qrgpn" May 27 17:05:27.659621 kubelet[2795]: E0527 17:05:27.659568 2795 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c97573a4c73ed18e8b80711fff54bf6e159d4b7becaeae24c2b3db2d1aebae10\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-65b8755655-qrgpn" May 27 17:05:27.659759 kubelet[2795]: E0527 17:05:27.659620 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-65b8755655-qrgpn_calico-system(58b197a4-a9f4-4e6c-b4d2-415b559fda41)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-65b8755655-qrgpn_calico-system(58b197a4-a9f4-4e6c-b4d2-415b559fda41)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c97573a4c73ed18e8b80711fff54bf6e159d4b7becaeae24c2b3db2d1aebae10\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-65b8755655-qrgpn" podUID="58b197a4-a9f4-4e6c-b4d2-415b559fda41" May 27 17:05:27.666681 containerd[1514]: time="2025-05-27T17:05:27.666632283Z" level=error msg="Failed to destroy network for sandbox \"ff059dbf704d0bd75fddce867cdc7465b74193c47468b35b8dab807e2df4a36d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.669072 containerd[1514]: time="2025-05-27T17:05:27.668830803Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c66494f6d-gc4dc,Uid:2f1591b1-a33b-436b-97d2-a3ba5959f3d8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff059dbf704d0bd75fddce867cdc7465b74193c47468b35b8dab807e2df4a36d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.669398 kubelet[2795]: E0527 17:05:27.669317 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff059dbf704d0bd75fddce867cdc7465b74193c47468b35b8dab807e2df4a36d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.669768 kubelet[2795]: E0527 17:05:27.669742 2795 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff059dbf704d0bd75fddce867cdc7465b74193c47468b35b8dab807e2df4a36d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c66494f6d-gc4dc" May 27 17:05:27.669961 kubelet[2795]: E0527 17:05:27.669855 2795 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff059dbf704d0bd75fddce867cdc7465b74193c47468b35b8dab807e2df4a36d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c66494f6d-gc4dc" May 27 17:05:27.669961 kubelet[2795]: E0527 17:05:27.669918 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6c66494f6d-gc4dc_calico-apiserver(2f1591b1-a33b-436b-97d2-a3ba5959f3d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6c66494f6d-gc4dc_calico-apiserver(2f1591b1-a33b-436b-97d2-a3ba5959f3d8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ff059dbf704d0bd75fddce867cdc7465b74193c47468b35b8dab807e2df4a36d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6c66494f6d-gc4dc" podUID="2f1591b1-a33b-436b-97d2-a3ba5959f3d8" May 27 17:05:27.680425 containerd[1514]: time="2025-05-27T17:05:27.680350280Z" level=error msg="Failed to destroy network for sandbox \"eb3a5bb979a8b2114a8a9342a635c3e6888bdede502da01a0ec612837899f183\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.681965 containerd[1514]: time="2025-05-27T17:05:27.681917880Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c66494f6d-72shk,Uid:f87305a5-4340-4e54-9029-d480976de92f,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb3a5bb979a8b2114a8a9342a635c3e6888bdede502da01a0ec612837899f183\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.682444 kubelet[2795]: E0527 17:05:27.682347 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb3a5bb979a8b2114a8a9342a635c3e6888bdede502da01a0ec612837899f183\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:27.682584 kubelet[2795]: E0527 17:05:27.682464 2795 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb3a5bb979a8b2114a8a9342a635c3e6888bdede502da01a0ec612837899f183\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c66494f6d-72shk" May 27 17:05:27.682584 kubelet[2795]: E0527 17:05:27.682492 2795 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb3a5bb979a8b2114a8a9342a635c3e6888bdede502da01a0ec612837899f183\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c66494f6d-72shk" May 27 17:05:27.682657 kubelet[2795]: E0527 17:05:27.682606 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6c66494f6d-72shk_calico-apiserver(f87305a5-4340-4e54-9029-d480976de92f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6c66494f6d-72shk_calico-apiserver(f87305a5-4340-4e54-9029-d480976de92f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"eb3a5bb979a8b2114a8a9342a635c3e6888bdede502da01a0ec612837899f183\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6c66494f6d-72shk" podUID="f87305a5-4340-4e54-9029-d480976de92f" May 27 17:05:28.379588 systemd[1]: Created slice kubepods-besteffort-pod6a4311e4_69eb_4fe5_a407_eaf19f301066.slice - libcontainer container kubepods-besteffort-pod6a4311e4_69eb_4fe5_a407_eaf19f301066.slice. May 27 17:05:28.382954 containerd[1514]: time="2025-05-27T17:05:28.382920482Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-sx6tt,Uid:6a4311e4-69eb-4fe5-a407-eaf19f301066,Namespace:calico-system,Attempt:0,}" May 27 17:05:28.441899 containerd[1514]: time="2025-05-27T17:05:28.441800589Z" level=error msg="Failed to destroy network for sandbox \"ac1a5626fb0955ae43bf04413a745dfa7ac1f4e128c5a7c7c9e4693c4320edfb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:28.444381 containerd[1514]: time="2025-05-27T17:05:28.443880189Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-sx6tt,Uid:6a4311e4-69eb-4fe5-a407-eaf19f301066,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac1a5626fb0955ae43bf04413a745dfa7ac1f4e128c5a7c7c9e4693c4320edfb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:28.445356 kubelet[2795]: E0527 17:05:28.444765 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac1a5626fb0955ae43bf04413a745dfa7ac1f4e128c5a7c7c9e4693c4320edfb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:05:28.445356 kubelet[2795]: E0527 17:05:28.445451 2795 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac1a5626fb0955ae43bf04413a745dfa7ac1f4e128c5a7c7c9e4693c4320edfb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-sx6tt" May 27 17:05:28.445356 kubelet[2795]: E0527 17:05:28.445482 2795 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac1a5626fb0955ae43bf04413a745dfa7ac1f4e128c5a7c7c9e4693c4320edfb\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-sx6tt" May 27 17:05:28.445143 systemd[1]: run-netns-cni\x2d8d3e5f2b\x2d2418\x2d6467\x2d0d2a\x2db65e2297f430.mount: Deactivated successfully. May 27 17:05:28.448087 kubelet[2795]: E0527 17:05:28.447492 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-sx6tt_calico-system(6a4311e4-69eb-4fe5-a407-eaf19f301066)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-sx6tt_calico-system(6a4311e4-69eb-4fe5-a407-eaf19f301066)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ac1a5626fb0955ae43bf04413a745dfa7ac1f4e128c5a7c7c9e4693c4320edfb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-sx6tt" podUID="6a4311e4-69eb-4fe5-a407-eaf19f301066" May 27 17:05:28.947415 kubelet[2795]: I0527 17:05:28.947194 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:05:34.496947 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount433856625.mount: Deactivated successfully. May 27 17:05:34.520841 containerd[1514]: time="2025-05-27T17:05:34.520732409Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:34.522597 containerd[1514]: time="2025-05-27T17:05:34.522534449Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.0: active requests=0, bytes read=150465379" May 27 17:05:34.523644 containerd[1514]: time="2025-05-27T17:05:34.523572968Z" level=info msg="ImageCreate event name:\"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:34.526975 containerd[1514]: time="2025-05-27T17:05:34.526909328Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:34.527935 containerd[1514]: time="2025-05-27T17:05:34.527877927Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.0\" with image id \"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\", size \"150465241\" in 6.96586318s" May 27 17:05:34.527935 containerd[1514]: time="2025-05-27T17:05:34.527917327Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\" returns image reference \"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\"" May 27 17:05:34.550904 containerd[1514]: time="2025-05-27T17:05:34.550829923Z" level=info msg="CreateContainer within sandbox \"66bdf6f18befaf96b85076e47dc7d6c036fd235152559d3d7c17b2387310e55c\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 27 17:05:34.564811 containerd[1514]: time="2025-05-27T17:05:34.564756879Z" level=info msg="Container cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b: CDI devices from CRI Config.CDIDevices: []" May 27 17:05:34.576897 
containerd[1514]: time="2025-05-27T17:05:34.576834077Z" level=info msg="CreateContainer within sandbox \"66bdf6f18befaf96b85076e47dc7d6c036fd235152559d3d7c17b2387310e55c\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b\"" May 27 17:05:34.580401 containerd[1514]: time="2025-05-27T17:05:34.579548316Z" level=info msg="StartContainer for \"cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b\"" May 27 17:05:34.581334 containerd[1514]: time="2025-05-27T17:05:34.581199756Z" level=info msg="connecting to shim cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b" address="unix:///run/containerd/s/9b7cb97e0dab7cb7540ae32cd84242f9070527a884fb1cdc8feea45ec78a6a5d" protocol=ttrpc version=3 May 27 17:05:34.640726 systemd[1]: Started cri-containerd-cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b.scope - libcontainer container cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b. May 27 17:05:34.695038 containerd[1514]: time="2025-05-27T17:05:34.694915691Z" level=info msg="StartContainer for \"cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b\" returns successfully" May 27 17:05:34.835588 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 27 17:05:34.835728 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 27 17:05:35.087761 kubelet[2795]: I0527 17:05:35.087635 2795 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pntql\" (UniqueName: \"kubernetes.io/projected/2fe1988c-87b9-4d51-b169-ff7ad6d492ef-kube-api-access-pntql\") pod \"2fe1988c-87b9-4d51-b169-ff7ad6d492ef\" (UID: \"2fe1988c-87b9-4d51-b169-ff7ad6d492ef\") " May 27 17:05:35.087761 kubelet[2795]: I0527 17:05:35.087693 2795 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2fe1988c-87b9-4d51-b169-ff7ad6d492ef-whisker-backend-key-pair\") pod \"2fe1988c-87b9-4d51-b169-ff7ad6d492ef\" (UID: \"2fe1988c-87b9-4d51-b169-ff7ad6d492ef\") " May 27 17:05:35.087761 kubelet[2795]: I0527 17:05:35.087748 2795 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe1988c-87b9-4d51-b169-ff7ad6d492ef-whisker-ca-bundle\") pod \"2fe1988c-87b9-4d51-b169-ff7ad6d492ef\" (UID: \"2fe1988c-87b9-4d51-b169-ff7ad6d492ef\") " May 27 17:05:35.098880 kubelet[2795]: I0527 17:05:35.097455 2795 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fe1988c-87b9-4d51-b169-ff7ad6d492ef-kube-api-access-pntql" (OuterVolumeSpecName: "kube-api-access-pntql") pod "2fe1988c-87b9-4d51-b169-ff7ad6d492ef" (UID: "2fe1988c-87b9-4d51-b169-ff7ad6d492ef"). InnerVolumeSpecName "kube-api-access-pntql". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 27 17:05:35.098880 kubelet[2795]: I0527 17:05:35.098806 2795 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fe1988c-87b9-4d51-b169-ff7ad6d492ef-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "2fe1988c-87b9-4d51-b169-ff7ad6d492ef" (UID: "2fe1988c-87b9-4d51-b169-ff7ad6d492ef"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 27 17:05:35.101572 kubelet[2795]: I0527 17:05:35.101525 2795 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe1988c-87b9-4d51-b169-ff7ad6d492ef-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "2fe1988c-87b9-4d51-b169-ff7ad6d492ef" (UID: "2fe1988c-87b9-4d51-b169-ff7ad6d492ef"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 27 17:05:35.188872 kubelet[2795]: I0527 17:05:35.188823 2795 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pntql\" (UniqueName: \"kubernetes.io/projected/2fe1988c-87b9-4d51-b169-ff7ad6d492ef-kube-api-access-pntql\") on node \"ci-4344-0-0-0-39ed1690e8\" DevicePath \"\"" May 27 17:05:35.188872 kubelet[2795]: I0527 17:05:35.188869 2795 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2fe1988c-87b9-4d51-b169-ff7ad6d492ef-whisker-backend-key-pair\") on node \"ci-4344-0-0-0-39ed1690e8\" DevicePath \"\"" May 27 17:05:35.189032 kubelet[2795]: I0527 17:05:35.188884 2795 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fe1988c-87b9-4d51-b169-ff7ad6d492ef-whisker-ca-bundle\") on node \"ci-4344-0-0-0-39ed1690e8\" DevicePath \"\"" May 27 17:05:35.379655 systemd[1]: Removed slice kubepods-besteffort-pod2fe1988c_87b9_4d51_b169_ff7ad6d492ef.slice - libcontainer container kubepods-besteffort-pod2fe1988c_87b9_4d51_b169_ff7ad6d492ef.slice. May 27 17:05:35.497985 systemd[1]: var-lib-kubelet-pods-2fe1988c\x2d87b9\x2d4d51\x2db169\x2dff7ad6d492ef-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dpntql.mount: Deactivated successfully. May 27 17:05:35.498088 systemd[1]: var-lib-kubelet-pods-2fe1988c\x2d87b9\x2d4d51\x2db169\x2dff7ad6d492ef-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. May 27 17:05:35.622096 kubelet[2795]: I0527 17:05:35.622001 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-xf245" podStartSLOduration=2.055756263 podStartE2EDuration="19.621979091s" podCreationTimestamp="2025-05-27 17:05:16 +0000 UTC" firstStartedPulling="2025-05-27 17:05:16.962554819 +0000 UTC m=+25.736288268" lastFinishedPulling="2025-05-27 17:05:34.528777647 +0000 UTC m=+43.302511096" observedRunningTime="2025-05-27 17:05:35.619424371 +0000 UTC m=+44.393157820" watchObservedRunningTime="2025-05-27 17:05:35.621979091 +0000 UTC m=+44.395712540" May 27 17:05:35.709915 systemd[1]: Created slice kubepods-besteffort-pod432a2b34_eaf5_4f72_a2b6_f15f78b36b83.slice - libcontainer container kubepods-besteffort-pod432a2b34_eaf5_4f72_a2b6_f15f78b36b83.slice. 
May 27 17:05:35.793391 kubelet[2795]: I0527 17:05:35.793202 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/432a2b34-eaf5-4f72-a2b6-f15f78b36b83-whisker-backend-key-pair\") pod \"whisker-78c6556c4f-rjlzx\" (UID: \"432a2b34-eaf5-4f72-a2b6-f15f78b36b83\") " pod="calico-system/whisker-78c6556c4f-rjlzx" May 27 17:05:35.793391 kubelet[2795]: I0527 17:05:35.793259 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7px9\" (UniqueName: \"kubernetes.io/projected/432a2b34-eaf5-4f72-a2b6-f15f78b36b83-kube-api-access-w7px9\") pod \"whisker-78c6556c4f-rjlzx\" (UID: \"432a2b34-eaf5-4f72-a2b6-f15f78b36b83\") " pod="calico-system/whisker-78c6556c4f-rjlzx" May 27 17:05:35.793391 kubelet[2795]: I0527 17:05:35.793287 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/432a2b34-eaf5-4f72-a2b6-f15f78b36b83-whisker-ca-bundle\") pod \"whisker-78c6556c4f-rjlzx\" (UID: \"432a2b34-eaf5-4f72-a2b6-f15f78b36b83\") " pod="calico-system/whisker-78c6556c4f-rjlzx" May 27 17:05:36.015716 containerd[1514]: time="2025-05-27T17:05:36.015567006Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78c6556c4f-rjlzx,Uid:432a2b34-eaf5-4f72-a2b6-f15f78b36b83,Namespace:calico-system,Attempt:0,}" May 27 17:05:36.232855 systemd-networkd[1427]: calibbaa3799a6f: Link UP May 27 17:05:36.235023 systemd-networkd[1427]: calibbaa3799a6f: Gained carrier May 27 17:05:36.262300 containerd[1514]: 2025-05-27 17:05:36.048 [INFO][3889] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 17:05:36.262300 containerd[1514]: 2025-05-27 17:05:36.094 [INFO][3889] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--0--39ed1690e8-k8s-whisker--78c6556c4f--rjlzx-eth0 whisker-78c6556c4f- calico-system 432a2b34-eaf5-4f72-a2b6-f15f78b36b83 916 0 2025-05-27 17:05:35 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:78c6556c4f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4344-0-0-0-39ed1690e8 whisker-78c6556c4f-rjlzx eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calibbaa3799a6f [] [] }} ContainerID="62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" Namespace="calico-system" Pod="whisker-78c6556c4f-rjlzx" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-whisker--78c6556c4f--rjlzx-" May 27 17:05:36.262300 containerd[1514]: 2025-05-27 17:05:36.094 [INFO][3889] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" Namespace="calico-system" Pod="whisker-78c6556c4f-rjlzx" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-whisker--78c6556c4f--rjlzx-eth0" May 27 17:05:36.262300 containerd[1514]: 2025-05-27 17:05:36.150 [INFO][3900] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" HandleID="k8s-pod-network.62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" Workload="ci--4344--0--0--0--39ed1690e8-k8s-whisker--78c6556c4f--rjlzx-eth0" May 27 17:05:36.262611 containerd[1514]: 2025-05-27 17:05:36.150 [INFO][3900] 
ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" HandleID="k8s-pod-network.62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" Workload="ci--4344--0--0--0--39ed1690e8-k8s-whisker--78c6556c4f--rjlzx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002a83f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-0-0-0-39ed1690e8", "pod":"whisker-78c6556c4f-rjlzx", "timestamp":"2025-05-27 17:05:36.150337617 +0000 UTC"}, Hostname:"ci-4344-0-0-0-39ed1690e8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:05:36.262611 containerd[1514]: 2025-05-27 17:05:36.150 [INFO][3900] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:05:36.262611 containerd[1514]: 2025-05-27 17:05:36.150 [INFO][3900] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:05:36.262611 containerd[1514]: 2025-05-27 17:05:36.151 [INFO][3900] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-0-39ed1690e8' May 27 17:05:36.262611 containerd[1514]: 2025-05-27 17:05:36.163 [INFO][3900] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:36.262611 containerd[1514]: 2025-05-27 17:05:36.172 [INFO][3900] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:36.262611 containerd[1514]: 2025-05-27 17:05:36.181 [INFO][3900] ipam/ipam.go 511: Trying affinity for 192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:36.262611 containerd[1514]: 2025-05-27 17:05:36.187 [INFO][3900] ipam/ipam.go 158: Attempting to load block cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:36.262611 containerd[1514]: 2025-05-27 17:05:36.192 [INFO][3900] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:36.263218 containerd[1514]: 2025-05-27 17:05:36.192 [INFO][3900] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.108.64/26 handle="k8s-pod-network.62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:36.263218 containerd[1514]: 2025-05-27 17:05:36.197 [INFO][3900] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c May 27 17:05:36.263218 containerd[1514]: 2025-05-27 17:05:36.207 [INFO][3900] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.108.64/26 handle="k8s-pod-network.62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:36.263218 containerd[1514]: 2025-05-27 17:05:36.217 [INFO][3900] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.108.65/26] block=192.168.108.64/26 handle="k8s-pod-network.62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:36.263218 containerd[1514]: 2025-05-27 17:05:36.217 [INFO][3900] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.108.65/26] handle="k8s-pod-network.62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:36.263218 
containerd[1514]: 2025-05-27 17:05:36.217 [INFO][3900] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:05:36.263218 containerd[1514]: 2025-05-27 17:05:36.217 [INFO][3900] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.108.65/26] IPv6=[] ContainerID="62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" HandleID="k8s-pod-network.62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" Workload="ci--4344--0--0--0--39ed1690e8-k8s-whisker--78c6556c4f--rjlzx-eth0" May 27 17:05:36.263650 containerd[1514]: 2025-05-27 17:05:36.222 [INFO][3889] cni-plugin/k8s.go 418: Populated endpoint ContainerID="62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" Namespace="calico-system" Pod="whisker-78c6556c4f-rjlzx" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-whisker--78c6556c4f--rjlzx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-whisker--78c6556c4f--rjlzx-eth0", GenerateName:"whisker-78c6556c4f-", Namespace:"calico-system", SelfLink:"", UID:"432a2b34-eaf5-4f72-a2b6-f15f78b36b83", ResourceVersion:"916", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 5, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"78c6556c4f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"", Pod:"whisker-78c6556c4f-rjlzx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.108.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calibbaa3799a6f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:36.263650 containerd[1514]: 2025-05-27 17:05:36.222 [INFO][3889] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.108.65/32] ContainerID="62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" Namespace="calico-system" Pod="whisker-78c6556c4f-rjlzx" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-whisker--78c6556c4f--rjlzx-eth0" May 27 17:05:36.263872 containerd[1514]: 2025-05-27 17:05:36.222 [INFO][3889] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibbaa3799a6f ContainerID="62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" Namespace="calico-system" Pod="whisker-78c6556c4f-rjlzx" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-whisker--78c6556c4f--rjlzx-eth0" May 27 17:05:36.263872 containerd[1514]: 2025-05-27 17:05:36.236 [INFO][3889] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" Namespace="calico-system" Pod="whisker-78c6556c4f-rjlzx" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-whisker--78c6556c4f--rjlzx-eth0" May 27 17:05:36.263964 containerd[1514]: 2025-05-27 17:05:36.238 [INFO][3889] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" Namespace="calico-system" Pod="whisker-78c6556c4f-rjlzx" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-whisker--78c6556c4f--rjlzx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-whisker--78c6556c4f--rjlzx-eth0", GenerateName:"whisker-78c6556c4f-", Namespace:"calico-system", SelfLink:"", UID:"432a2b34-eaf5-4f72-a2b6-f15f78b36b83", ResourceVersion:"916", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 5, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"78c6556c4f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c", Pod:"whisker-78c6556c4f-rjlzx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.108.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calibbaa3799a6f", MAC:"3a:c7:72:77:73:5f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:36.264113 containerd[1514]: 2025-05-27 17:05:36.257 [INFO][3889] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" Namespace="calico-system" Pod="whisker-78c6556c4f-rjlzx" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-whisker--78c6556c4f--rjlzx-eth0" May 27 17:05:36.345796 containerd[1514]: time="2025-05-27T17:05:36.345500895Z" level=info msg="connecting to shim 62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c" address="unix:///run/containerd/s/32f294296eda6b895121d09211d470b7c06a2c872cda2ce63df9717445967b6c" namespace=k8s.io protocol=ttrpc version=3 May 27 17:05:36.410558 systemd[1]: Started cri-containerd-62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c.scope - libcontainer container 62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c. 
May 27 17:05:36.485719 containerd[1514]: time="2025-05-27T17:05:36.485670025Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78c6556c4f-rjlzx,Uid:432a2b34-eaf5-4f72-a2b6-f15f78b36b83,Namespace:calico-system,Attempt:0,} returns sandbox id \"62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c\"" May 27 17:05:36.491723 containerd[1514]: time="2025-05-27T17:05:36.491671063Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:05:36.601353 kubelet[2795]: I0527 17:05:36.601096 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:05:36.715139 containerd[1514]: time="2025-05-27T17:05:36.714979295Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:05:36.716522 containerd[1514]: time="2025-05-27T17:05:36.716354295Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:05:36.716522 containerd[1514]: time="2025-05-27T17:05:36.716490535Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:05:36.720205 kubelet[2795]: E0527 17:05:36.720057 2795 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:05:36.720624 kubelet[2795]: E0527 17:05:36.720431 2795 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:05:36.726203 kubelet[2795]: E0527 17:05:36.726018 2795 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:46cc42382a97413a9376060056363be4,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w7px9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78c6556c4f-rjlzx_calico-system(432a2b34-eaf5-4f72-a2b6-f15f78b36b83): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:05:36.728874 containerd[1514]: time="2025-05-27T17:05:36.728826412Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:05:36.950289 containerd[1514]: time="2025-05-27T17:05:36.950151685Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:05:36.952378 containerd[1514]: time="2025-05-27T17:05:36.952311484Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:05:36.952576 containerd[1514]: time="2025-05-27T17:05:36.952350244Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:05:36.952852 kubelet[2795]: E0527 17:05:36.952774 2795 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:05:36.952852 kubelet[2795]: E0527 17:05:36.952833 2795 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:05:36.953440 kubelet[2795]: E0527 17:05:36.953061 2795 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7px9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78c6556c4f-rjlzx_calico-system(432a2b34-eaf5-4f72-a2b6-f15f78b36b83): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:05:36.954731 kubelet[2795]: E0527 17:05:36.954687 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:05:37.015007 systemd-networkd[1427]: vxlan.calico: Link UP May 27 17:05:37.015021 systemd-networkd[1427]: vxlan.calico: Gained carrier May 27 17:05:37.313851 systemd-networkd[1427]: calibbaa3799a6f: Gained IPv6LL May 27 17:05:37.376393 kubelet[2795]: I0527 17:05:37.376325 2795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fe1988c-87b9-4d51-b169-ff7ad6d492ef" path="/var/lib/kubelet/pods/2fe1988c-87b9-4d51-b169-ff7ad6d492ef/volumes" May 27 17:05:37.607038 kubelet[2795]: I0527 17:05:37.606813 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:05:37.611237 kubelet[2795]: E0527 17:05:37.611181 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:05:37.752079 containerd[1514]: time="2025-05-27T17:05:37.752038953Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b\" id:\"06fdfb6b8ffad45818c69f9f67d73efd98d06c17071db367919f45d0ba806b05\" pid:4166 exit_status:1 exited_at:{seconds:1748365537 nanos:751338473}" May 27 17:05:37.836754 containerd[1514]: time="2025-05-27T17:05:37.836708335Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b\" id:\"89a85ecfbfe1cd254bdc679b93613117e6702d4acf87c8f3237d098f04bc2af4\" pid:4189 exit_status:1 exited_at:{seconds:1748365537 nanos:836191695}" May 27 17:05:38.371607 containerd[1514]: time="2025-05-27T17:05:38.371547421Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-5f784fdc78-tq7f7,Uid:ea0a16a8-4944-4683-af93-caab94c7fa78,Namespace:calico-apiserver,Attempt:0,}" May 27 17:05:38.403316 systemd-networkd[1427]: vxlan.calico: Gained IPv6LL May 27 17:05:38.523554 systemd-networkd[1427]: caliad93e7cd672: Link UP May 27 17:05:38.525031 systemd-networkd[1427]: caliad93e7cd672: Gained carrier May 27 17:05:38.545180 containerd[1514]: 2025-05-27 17:05:38.425 [INFO][4202] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tq7f7-eth0 calico-apiserver-5f784fdc78- calico-apiserver ea0a16a8-4944-4683-af93-caab94c7fa78 843 0 2025-05-27 17:05:13 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5f784fdc78 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-0-0-0-39ed1690e8 calico-apiserver-5f784fdc78-tq7f7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliad93e7cd672 [] [] }} ContainerID="55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" Namespace="calico-apiserver" Pod="calico-apiserver-5f784fdc78-tq7f7" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tq7f7-" May 27 17:05:38.545180 containerd[1514]: 2025-05-27 17:05:38.426 [INFO][4202] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" Namespace="calico-apiserver" Pod="calico-apiserver-5f784fdc78-tq7f7" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tq7f7-eth0" May 27 17:05:38.545180 containerd[1514]: 2025-05-27 17:05:38.452 [INFO][4214] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" HandleID="k8s-pod-network.55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tq7f7-eth0" May 27 17:05:38.546139 containerd[1514]: 2025-05-27 17:05:38.452 [INFO][4214] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" HandleID="k8s-pod-network.55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tq7f7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400022f010), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-0-0-0-39ed1690e8", "pod":"calico-apiserver-5f784fdc78-tq7f7", "timestamp":"2025-05-27 17:05:38.452154404 +0000 UTC"}, Hostname:"ci-4344-0-0-0-39ed1690e8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:05:38.546139 containerd[1514]: 2025-05-27 17:05:38.452 [INFO][4214] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:05:38.546139 containerd[1514]: 2025-05-27 17:05:38.452 [INFO][4214] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:05:38.546139 containerd[1514]: 2025-05-27 17:05:38.452 [INFO][4214] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-0-39ed1690e8' May 27 17:05:38.546139 containerd[1514]: 2025-05-27 17:05:38.469 [INFO][4214] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:38.546139 containerd[1514]: 2025-05-27 17:05:38.481 [INFO][4214] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:38.546139 containerd[1514]: 2025-05-27 17:05:38.489 [INFO][4214] ipam/ipam.go 511: Trying affinity for 192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:38.546139 containerd[1514]: 2025-05-27 17:05:38.492 [INFO][4214] ipam/ipam.go 158: Attempting to load block cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:38.546139 containerd[1514]: 2025-05-27 17:05:38.495 [INFO][4214] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:38.546344 containerd[1514]: 2025-05-27 17:05:38.495 [INFO][4214] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.108.64/26 handle="k8s-pod-network.55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:38.546344 containerd[1514]: 2025-05-27 17:05:38.498 [INFO][4214] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a May 27 17:05:38.546344 containerd[1514]: 2025-05-27 17:05:38.503 [INFO][4214] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.108.64/26 handle="k8s-pod-network.55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:38.546344 containerd[1514]: 2025-05-27 17:05:38.511 [INFO][4214] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.108.66/26] block=192.168.108.64/26 handle="k8s-pod-network.55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:38.546344 containerd[1514]: 2025-05-27 17:05:38.512 [INFO][4214] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.108.66/26] handle="k8s-pod-network.55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:38.546344 containerd[1514]: 2025-05-27 17:05:38.512 [INFO][4214] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:05:38.546344 containerd[1514]: 2025-05-27 17:05:38.512 [INFO][4214] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.108.66/26] IPv6=[] ContainerID="55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" HandleID="k8s-pod-network.55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tq7f7-eth0" May 27 17:05:38.546998 containerd[1514]: 2025-05-27 17:05:38.515 [INFO][4202] cni-plugin/k8s.go 418: Populated endpoint ContainerID="55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" Namespace="calico-apiserver" Pod="calico-apiserver-5f784fdc78-tq7f7" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tq7f7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tq7f7-eth0", GenerateName:"calico-apiserver-5f784fdc78-", Namespace:"calico-apiserver", SelfLink:"", UID:"ea0a16a8-4944-4683-af93-caab94c7fa78", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 5, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f784fdc78", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"", Pod:"calico-apiserver-5f784fdc78-tq7f7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.108.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliad93e7cd672", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:38.547567 containerd[1514]: 2025-05-27 17:05:38.516 [INFO][4202] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.108.66/32] ContainerID="55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" Namespace="calico-apiserver" Pod="calico-apiserver-5f784fdc78-tq7f7" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tq7f7-eth0" May 27 17:05:38.547567 containerd[1514]: 2025-05-27 17:05:38.516 [INFO][4202] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliad93e7cd672 ContainerID="55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" Namespace="calico-apiserver" Pod="calico-apiserver-5f784fdc78-tq7f7" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tq7f7-eth0" May 27 17:05:38.547567 containerd[1514]: 2025-05-27 17:05:38.524 [INFO][4202] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" Namespace="calico-apiserver" Pod="calico-apiserver-5f784fdc78-tq7f7" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tq7f7-eth0" May 27 17:05:38.547650 containerd[1514]: 2025-05-27 17:05:38.525 
[INFO][4202] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" Namespace="calico-apiserver" Pod="calico-apiserver-5f784fdc78-tq7f7" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tq7f7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tq7f7-eth0", GenerateName:"calico-apiserver-5f784fdc78-", Namespace:"calico-apiserver", SelfLink:"", UID:"ea0a16a8-4944-4683-af93-caab94c7fa78", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 5, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f784fdc78", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a", Pod:"calico-apiserver-5f784fdc78-tq7f7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.108.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliad93e7cd672", MAC:"3e:2c:ff:89:8e:0c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:38.547703 containerd[1514]: 2025-05-27 17:05:38.542 [INFO][4202] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" Namespace="calico-apiserver" Pod="calico-apiserver-5f784fdc78-tq7f7" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tq7f7-eth0" May 27 17:05:38.580397 containerd[1514]: time="2025-05-27T17:05:38.579985497Z" level=info msg="connecting to shim 55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a" address="unix:///run/containerd/s/a2ecb6a5bc99cfbb11a744fa88b15b47d9db451e11b6083923278b65d42b8024" namespace=k8s.io protocol=ttrpc version=3 May 27 17:05:38.623797 systemd[1]: Started cri-containerd-55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a.scope - libcontainer container 55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a. 
May 27 17:05:38.745209 containerd[1514]: time="2025-05-27T17:05:38.745167941Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f784fdc78-tq7f7,Uid:ea0a16a8-4944-4683-af93-caab94c7fa78,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a\"" May 27 17:05:38.749533 containerd[1514]: time="2025-05-27T17:05:38.749441380Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 17:05:39.372717 containerd[1514]: time="2025-05-27T17:05:39.372657808Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gsfg7,Uid:05f26e54-5384-4d57-8899-41e4fa10bcb5,Namespace:kube-system,Attempt:0,}" May 27 17:05:39.373893 containerd[1514]: time="2025-05-27T17:05:39.372759488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65b8755655-qrgpn,Uid:58b197a4-a9f4-4e6c-b4d2-415b559fda41,Namespace:calico-system,Attempt:0,}" May 27 17:05:39.374860 containerd[1514]: time="2025-05-27T17:05:39.374719888Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-sx6tt,Uid:6a4311e4-69eb-4fe5-a407-eaf19f301066,Namespace:calico-system,Attempt:0,}" May 27 17:05:39.593741 systemd-networkd[1427]: cali399ea6b371a: Link UP May 27 17:05:39.595073 systemd-networkd[1427]: cali399ea6b371a: Gained carrier May 27 17:05:39.623785 containerd[1514]: 2025-05-27 17:05:39.455 [INFO][4277] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--0--39ed1690e8-k8s-calico--kube--controllers--65b8755655--qrgpn-eth0 calico-kube-controllers-65b8755655- calico-system 58b197a4-a9f4-4e6c-b4d2-415b559fda41 844 0 2025-05-27 17:05:16 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:65b8755655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4344-0-0-0-39ed1690e8 calico-kube-controllers-65b8755655-qrgpn eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali399ea6b371a [] [] }} ContainerID="fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" Namespace="calico-system" Pod="calico-kube-controllers-65b8755655-qrgpn" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--kube--controllers--65b8755655--qrgpn-" May 27 17:05:39.623785 containerd[1514]: 2025-05-27 17:05:39.456 [INFO][4277] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" Namespace="calico-system" Pod="calico-kube-controllers-65b8755655-qrgpn" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--kube--controllers--65b8755655--qrgpn-eth0" May 27 17:05:39.623785 containerd[1514]: 2025-05-27 17:05:39.514 [INFO][4314] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" HandleID="k8s-pod-network.fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--kube--controllers--65b8755655--qrgpn-eth0" May 27 17:05:39.624009 containerd[1514]: 2025-05-27 17:05:39.518 [INFO][4314] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" 
HandleID="k8s-pod-network.fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--kube--controllers--65b8755655--qrgpn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400022f680), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-0-0-0-39ed1690e8", "pod":"calico-kube-controllers-65b8755655-qrgpn", "timestamp":"2025-05-27 17:05:39.514007698 +0000 UTC"}, Hostname:"ci-4344-0-0-0-39ed1690e8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:05:39.624009 containerd[1514]: 2025-05-27 17:05:39.518 [INFO][4314] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:05:39.624009 containerd[1514]: 2025-05-27 17:05:39.518 [INFO][4314] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:05:39.624009 containerd[1514]: 2025-05-27 17:05:39.518 [INFO][4314] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-0-39ed1690e8' May 27 17:05:39.624009 containerd[1514]: 2025-05-27 17:05:39.538 [INFO][4314] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.624009 containerd[1514]: 2025-05-27 17:05:39.548 [INFO][4314] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.624009 containerd[1514]: 2025-05-27 17:05:39.557 [INFO][4314] ipam/ipam.go 511: Trying affinity for 192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.624009 containerd[1514]: 2025-05-27 17:05:39.560 [INFO][4314] ipam/ipam.go 158: Attempting to load block cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.624009 containerd[1514]: 2025-05-27 17:05:39.563 [INFO][4314] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.624204 containerd[1514]: 2025-05-27 17:05:39.563 [INFO][4314] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.108.64/26 handle="k8s-pod-network.fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.624204 containerd[1514]: 2025-05-27 17:05:39.566 [INFO][4314] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc May 27 17:05:39.624204 containerd[1514]: 2025-05-27 17:05:39.573 [INFO][4314] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.108.64/26 handle="k8s-pod-network.fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.624204 containerd[1514]: 2025-05-27 17:05:39.582 [INFO][4314] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.108.67/26] block=192.168.108.64/26 handle="k8s-pod-network.fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.624204 containerd[1514]: 2025-05-27 17:05:39.582 [INFO][4314] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.108.67/26] handle="k8s-pod-network.fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.624204 containerd[1514]: 2025-05-27 17:05:39.582 [INFO][4314] ipam/ipam_plugin.go 374: Released host-wide 
IPAM lock. May 27 17:05:39.624204 containerd[1514]: 2025-05-27 17:05:39.582 [INFO][4314] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.108.67/26] IPv6=[] ContainerID="fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" HandleID="k8s-pod-network.fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--kube--controllers--65b8755655--qrgpn-eth0" May 27 17:05:39.624342 containerd[1514]: 2025-05-27 17:05:39.587 [INFO][4277] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" Namespace="calico-system" Pod="calico-kube-controllers-65b8755655-qrgpn" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--kube--controllers--65b8755655--qrgpn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-calico--kube--controllers--65b8755655--qrgpn-eth0", GenerateName:"calico-kube-controllers-65b8755655-", Namespace:"calico-system", SelfLink:"", UID:"58b197a4-a9f4-4e6c-b4d2-415b559fda41", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 5, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"65b8755655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"", Pod:"calico-kube-controllers-65b8755655-qrgpn", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.108.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali399ea6b371a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:39.624426 containerd[1514]: 2025-05-27 17:05:39.587 [INFO][4277] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.108.67/32] ContainerID="fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" Namespace="calico-system" Pod="calico-kube-controllers-65b8755655-qrgpn" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--kube--controllers--65b8755655--qrgpn-eth0" May 27 17:05:39.624426 containerd[1514]: 2025-05-27 17:05:39.588 [INFO][4277] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali399ea6b371a ContainerID="fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" Namespace="calico-system" Pod="calico-kube-controllers-65b8755655-qrgpn" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--kube--controllers--65b8755655--qrgpn-eth0" May 27 17:05:39.624426 containerd[1514]: 2025-05-27 17:05:39.595 [INFO][4277] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" Namespace="calico-system" Pod="calico-kube-controllers-65b8755655-qrgpn" 
WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--kube--controllers--65b8755655--qrgpn-eth0" May 27 17:05:39.624518 containerd[1514]: 2025-05-27 17:05:39.598 [INFO][4277] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" Namespace="calico-system" Pod="calico-kube-controllers-65b8755655-qrgpn" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--kube--controllers--65b8755655--qrgpn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-calico--kube--controllers--65b8755655--qrgpn-eth0", GenerateName:"calico-kube-controllers-65b8755655-", Namespace:"calico-system", SelfLink:"", UID:"58b197a4-a9f4-4e6c-b4d2-415b559fda41", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 5, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"65b8755655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc", Pod:"calico-kube-controllers-65b8755655-qrgpn", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.108.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali399ea6b371a", MAC:"1a:35:82:ce:70:38", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:39.624570 containerd[1514]: 2025-05-27 17:05:39.618 [INFO][4277] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" Namespace="calico-system" Pod="calico-kube-controllers-65b8755655-qrgpn" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--kube--controllers--65b8755655--qrgpn-eth0" May 27 17:05:39.681971 containerd[1514]: time="2025-05-27T17:05:39.681924062Z" level=info msg="connecting to shim fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc" address="unix:///run/containerd/s/605bb2cf6f6833940c1c875ad2d37501fd44e0fc972a9b6d901278aa8c5e3b4c" namespace=k8s.io protocol=ttrpc version=3 May 27 17:05:39.736425 systemd-networkd[1427]: cali35b32aad246: Link UP May 27 17:05:39.738735 systemd-networkd[1427]: cali35b32aad246: Gained carrier May 27 17:05:39.743628 systemd[1]: Started cri-containerd-fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc.scope - libcontainer container fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc. 
May 27 17:05:39.765007 containerd[1514]: 2025-05-27 17:05:39.482 [INFO][4281] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gsfg7-eth0 coredns-674b8bbfcf- kube-system 05f26e54-5384-4d57-8899-41e4fa10bcb5 841 0 2025-05-27 17:04:58 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344-0-0-0-39ed1690e8 coredns-674b8bbfcf-gsfg7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali35b32aad246 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" Namespace="kube-system" Pod="coredns-674b8bbfcf-gsfg7" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gsfg7-" May 27 17:05:39.765007 containerd[1514]: 2025-05-27 17:05:39.482 [INFO][4281] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" Namespace="kube-system" Pod="coredns-674b8bbfcf-gsfg7" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gsfg7-eth0" May 27 17:05:39.765007 containerd[1514]: 2025-05-27 17:05:39.534 [INFO][4321] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" HandleID="k8s-pod-network.32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" Workload="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gsfg7-eth0" May 27 17:05:39.766552 containerd[1514]: 2025-05-27 17:05:39.534 [INFO][4321] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" HandleID="k8s-pod-network.32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" Workload="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gsfg7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d7630), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344-0-0-0-39ed1690e8", "pod":"coredns-674b8bbfcf-gsfg7", "timestamp":"2025-05-27 17:05:39.534770854 +0000 UTC"}, Hostname:"ci-4344-0-0-0-39ed1690e8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:05:39.766552 containerd[1514]: 2025-05-27 17:05:39.535 [INFO][4321] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:05:39.766552 containerd[1514]: 2025-05-27 17:05:39.583 [INFO][4321] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:05:39.766552 containerd[1514]: 2025-05-27 17:05:39.583 [INFO][4321] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-0-39ed1690e8' May 27 17:05:39.766552 containerd[1514]: 2025-05-27 17:05:39.639 [INFO][4321] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.766552 containerd[1514]: 2025-05-27 17:05:39.649 [INFO][4321] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.766552 containerd[1514]: 2025-05-27 17:05:39.662 [INFO][4321] ipam/ipam.go 511: Trying affinity for 192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.766552 containerd[1514]: 2025-05-27 17:05:39.666 [INFO][4321] ipam/ipam.go 158: Attempting to load block cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.766552 containerd[1514]: 2025-05-27 17:05:39.675 [INFO][4321] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.766952 containerd[1514]: 2025-05-27 17:05:39.675 [INFO][4321] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.108.64/26 handle="k8s-pod-network.32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.766952 containerd[1514]: 2025-05-27 17:05:39.681 [INFO][4321] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c May 27 17:05:39.766952 containerd[1514]: 2025-05-27 17:05:39.699 [INFO][4321] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.108.64/26 handle="k8s-pod-network.32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.766952 containerd[1514]: 2025-05-27 17:05:39.712 [INFO][4321] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.108.68/26] block=192.168.108.64/26 handle="k8s-pod-network.32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.766952 containerd[1514]: 2025-05-27 17:05:39.712 [INFO][4321] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.108.68/26] handle="k8s-pod-network.32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.766952 containerd[1514]: 2025-05-27 17:05:39.712 [INFO][4321] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
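Editor's note: the IPAM sequence above claims 192.168.108.68 out of the block 192.168.108.64/26 for which this host holds an affinity. A minimal, self-contained Go sketch (not Calico code; the block and the pod addresses that appear across this section of the log are the only inputs) to sanity-check that the assigned IPs fall inside that block:

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// Block affinity confirmed for host ci-4344-0-0-0-39ed1690e8 in the log above.
	block := netip.MustParsePrefix("192.168.108.64/26")

	// Pod addresses assigned in this section of the log (.67 through .72).
	assigned := []string{
		"192.168.108.67", "192.168.108.68", "192.168.108.69",
		"192.168.108.70", "192.168.108.71", "192.168.108.72",
	}

	for _, s := range assigned {
		addr := netip.MustParseAddr(s)
		fmt.Printf("%s in %s: %v\n", addr, block, block.Contains(addr))
	}
}
```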
May 27 17:05:39.766952 containerd[1514]: 2025-05-27 17:05:39.713 [INFO][4321] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.108.68/26] IPv6=[] ContainerID="32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" HandleID="k8s-pod-network.32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" Workload="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gsfg7-eth0" May 27 17:05:39.767086 containerd[1514]: 2025-05-27 17:05:39.720 [INFO][4281] cni-plugin/k8s.go 418: Populated endpoint ContainerID="32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" Namespace="kube-system" Pod="coredns-674b8bbfcf-gsfg7" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gsfg7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gsfg7-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"05f26e54-5384-4d57-8899-41e4fa10bcb5", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 4, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"", Pod:"coredns-674b8bbfcf-gsfg7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.108.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali35b32aad246", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:39.767086 containerd[1514]: 2025-05-27 17:05:39.720 [INFO][4281] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.108.68/32] ContainerID="32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" Namespace="kube-system" Pod="coredns-674b8bbfcf-gsfg7" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gsfg7-eth0" May 27 17:05:39.767086 containerd[1514]: 2025-05-27 17:05:39.720 [INFO][4281] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali35b32aad246 ContainerID="32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" Namespace="kube-system" Pod="coredns-674b8bbfcf-gsfg7" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gsfg7-eth0" May 27 17:05:39.767086 containerd[1514]: 2025-05-27 17:05:39.740 [INFO][4281] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-gsfg7" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gsfg7-eth0" May 27 17:05:39.767086 containerd[1514]: 2025-05-27 17:05:39.741 [INFO][4281] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" Namespace="kube-system" Pod="coredns-674b8bbfcf-gsfg7" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gsfg7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gsfg7-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"05f26e54-5384-4d57-8899-41e4fa10bcb5", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 4, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c", Pod:"coredns-674b8bbfcf-gsfg7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.108.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali35b32aad246", MAC:"8e:7a:d5:b5:83:f3", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:39.767086 containerd[1514]: 2025-05-27 17:05:39.763 [INFO][4281] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" Namespace="kube-system" Pod="coredns-674b8bbfcf-gsfg7" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gsfg7-eth0" May 27 17:05:39.806317 containerd[1514]: time="2025-05-27T17:05:39.806264996Z" level=info msg="connecting to shim 32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" address="unix:///run/containerd/s/01f4ad83e86bc2ec75d1f70d7d67ddcf6c421346287a8da0d8a913e38a403988" namespace=k8s.io protocol=ttrpc version=3 May 27 17:05:39.825204 systemd-networkd[1427]: califb63e0e20ed: Link UP May 27 17:05:39.830322 systemd-networkd[1427]: califb63e0e20ed: Gained carrier May 27 17:05:39.862213 systemd[1]: Started cri-containerd-32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c.scope - libcontainer container 32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c. 
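Editor's note: the endpoint dump above records the coredns ports in hex (Port:0x35, Port:0x23c1) while the earlier CNI line lists the same ports in decimal ({dns UDP 53}, {dns-tcp TCP 53}, {metrics TCP 9153}). A trivial Go check, using only values taken from the log, that the two notations agree:

```go
package main

import "fmt"

func main() {
	// Hex values from the WorkloadEndpointPort dump above.
	fmt.Println(0x35)   // 53   -> dns (UDP) and dns-tcp (TCP)
	fmt.Println(0x23c1) // 9153 -> metrics (TCP)
}
```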
May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.488 [INFO][4295] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--0--39ed1690e8-k8s-csi--node--driver--sx6tt-eth0 csi-node-driver- calico-system 6a4311e4-69eb-4fe5-a407-eaf19f301066 710 0 2025-05-27 17:05:16 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:78f6f74485 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4344-0-0-0-39ed1690e8 csi-node-driver-sx6tt eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] califb63e0e20ed [] [] }} ContainerID="becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" Namespace="calico-system" Pod="csi-node-driver-sx6tt" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-csi--node--driver--sx6tt-" May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.488 [INFO][4295] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" Namespace="calico-system" Pod="csi-node-driver-sx6tt" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-csi--node--driver--sx6tt-eth0" May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.550 [INFO][4326] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" HandleID="k8s-pod-network.becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" Workload="ci--4344--0--0--0--39ed1690e8-k8s-csi--node--driver--sx6tt-eth0" May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.550 [INFO][4326] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" HandleID="k8s-pod-network.becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" Workload="ci--4344--0--0--0--39ed1690e8-k8s-csi--node--driver--sx6tt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c5020), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-0-0-0-39ed1690e8", "pod":"csi-node-driver-sx6tt", "timestamp":"2025-05-27 17:05:39.55029297 +0000 UTC"}, Hostname:"ci-4344-0-0-0-39ed1690e8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.551 [INFO][4326] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.713 [INFO][4326] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.713 [INFO][4326] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-0-39ed1690e8' May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.743 [INFO][4326] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.760 [INFO][4326] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.772 [INFO][4326] ipam/ipam.go 511: Trying affinity for 192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.776 [INFO][4326] ipam/ipam.go 158: Attempting to load block cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.780 [INFO][4326] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.780 [INFO][4326] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.108.64/26 handle="k8s-pod-network.becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.784 [INFO][4326] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.795 [INFO][4326] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.108.64/26 handle="k8s-pod-network.becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.810 [INFO][4326] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.108.69/26] block=192.168.108.64/26 handle="k8s-pod-network.becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.810 [INFO][4326] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.108.69/26] handle="k8s-pod-network.becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.811 [INFO][4326] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
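Editor's note: each CNI ADD above serializes on the host-wide IPAM lock; the csi-node-driver request (counted at 17:05:39.550) only acquires the lock at 17:05:39.713, immediately after the coredns-gsfg7 assignment releases it. A toy Go sketch of that acquire/assign/release pattern, illustrative only and not Calico's implementation; the block and starting address are taken from the log:

```go
package main

import (
	"fmt"
	"net/netip"
	"sync"
)

// blockAllocator hands out successive addresses from one block while
// holding a host-wide lock, mirroring the pattern seen in the log.
type blockAllocator struct {
	mu   sync.Mutex
	blk  netip.Prefix
	next netip.Addr
}

func (a *blockAllocator) assign() (netip.Addr, bool) {
	a.mu.Lock()         // "Acquired host-wide IPAM lock."
	defer a.mu.Unlock() // "Released host-wide IPAM lock."
	if !a.blk.Contains(a.next) {
		return netip.Addr{}, false
	}
	ip := a.next
	a.next = ip.Next()
	return ip, true
}

func main() {
	a := &blockAllocator{
		blk:  netip.MustParsePrefix("192.168.108.64/26"),
		next: netip.MustParseAddr("192.168.108.67"),
	}
	for i := 0; i < 3; i++ {
		if ip, ok := a.assign(); ok {
			fmt.Println("assigned", ip) // .67, .68, .69 in order
		}
	}
}
```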
May 27 17:05:39.868054 containerd[1514]: 2025-05-27 17:05:39.811 [INFO][4326] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.108.69/26] IPv6=[] ContainerID="becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" HandleID="k8s-pod-network.becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" Workload="ci--4344--0--0--0--39ed1690e8-k8s-csi--node--driver--sx6tt-eth0" May 27 17:05:39.869803 containerd[1514]: 2025-05-27 17:05:39.817 [INFO][4295] cni-plugin/k8s.go 418: Populated endpoint ContainerID="becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" Namespace="calico-system" Pod="csi-node-driver-sx6tt" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-csi--node--driver--sx6tt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-csi--node--driver--sx6tt-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6a4311e4-69eb-4fe5-a407-eaf19f301066", ResourceVersion:"710", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 5, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"", Pod:"csi-node-driver-sx6tt", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.108.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"califb63e0e20ed", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:39.869803 containerd[1514]: 2025-05-27 17:05:39.817 [INFO][4295] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.108.69/32] ContainerID="becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" Namespace="calico-system" Pod="csi-node-driver-sx6tt" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-csi--node--driver--sx6tt-eth0" May 27 17:05:39.869803 containerd[1514]: 2025-05-27 17:05:39.817 [INFO][4295] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califb63e0e20ed ContainerID="becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" Namespace="calico-system" Pod="csi-node-driver-sx6tt" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-csi--node--driver--sx6tt-eth0" May 27 17:05:39.869803 containerd[1514]: 2025-05-27 17:05:39.838 [INFO][4295] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" Namespace="calico-system" Pod="csi-node-driver-sx6tt" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-csi--node--driver--sx6tt-eth0" May 27 17:05:39.869803 containerd[1514]: 2025-05-27 17:05:39.839 [INFO][4295] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" Namespace="calico-system" Pod="csi-node-driver-sx6tt" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-csi--node--driver--sx6tt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-csi--node--driver--sx6tt-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6a4311e4-69eb-4fe5-a407-eaf19f301066", ResourceVersion:"710", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 5, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e", Pod:"csi-node-driver-sx6tt", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.108.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"califb63e0e20ed", MAC:"0e:37:31:ea:cc:1c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:39.869803 containerd[1514]: 2025-05-27 17:05:39.859 [INFO][4295] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" Namespace="calico-system" Pod="csi-node-driver-sx6tt" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-csi--node--driver--sx6tt-eth0" May 27 17:05:39.907783 containerd[1514]: time="2025-05-27T17:05:39.907030695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65b8755655-qrgpn,Uid:58b197a4-a9f4-4e6c-b4d2-415b559fda41,Namespace:calico-system,Attempt:0,} returns sandbox id \"fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc\"" May 27 17:05:39.924398 containerd[1514]: time="2025-05-27T17:05:39.923942851Z" level=info msg="connecting to shim becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e" address="unix:///run/containerd/s/a57bc0442eb34101f399260daa537eeba327f6921f02372db676a2bfdb06a5af" namespace=k8s.io protocol=ttrpc version=3 May 27 17:05:39.951397 containerd[1514]: time="2025-05-27T17:05:39.949683606Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gsfg7,Uid:05f26e54-5384-4d57-8899-41e4fa10bcb5,Namespace:kube-system,Attempt:0,} returns sandbox id \"32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c\"" May 27 17:05:39.962037 containerd[1514]: time="2025-05-27T17:05:39.961977443Z" level=info msg="CreateContainer within sandbox \"32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 17:05:39.971590 systemd[1]: Started cri-containerd-becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e.scope - 
libcontainer container becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e. May 27 17:05:39.981758 containerd[1514]: time="2025-05-27T17:05:39.981699039Z" level=info msg="Container 66060a88ec6c20c6e074fcf17f9bec4a27db5d1c65b04ab5d71acc527fab0442: CDI devices from CRI Config.CDIDevices: []" May 27 17:05:40.014342 containerd[1514]: time="2025-05-27T17:05:40.014262552Z" level=info msg="CreateContainer within sandbox \"32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"66060a88ec6c20c6e074fcf17f9bec4a27db5d1c65b04ab5d71acc527fab0442\"" May 27 17:05:40.018217 containerd[1514]: time="2025-05-27T17:05:40.018177431Z" level=info msg="StartContainer for \"66060a88ec6c20c6e074fcf17f9bec4a27db5d1c65b04ab5d71acc527fab0442\"" May 27 17:05:40.023100 containerd[1514]: time="2025-05-27T17:05:40.023045670Z" level=info msg="connecting to shim 66060a88ec6c20c6e074fcf17f9bec4a27db5d1c65b04ab5d71acc527fab0442" address="unix:///run/containerd/s/01f4ad83e86bc2ec75d1f70d7d67ddcf6c421346287a8da0d8a913e38a403988" protocol=ttrpc version=3 May 27 17:05:40.036624 containerd[1514]: time="2025-05-27T17:05:40.036575667Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-sx6tt,Uid:6a4311e4-69eb-4fe5-a407-eaf19f301066,Namespace:calico-system,Attempt:0,} returns sandbox id \"becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e\"" May 27 17:05:40.055669 systemd[1]: Started cri-containerd-66060a88ec6c20c6e074fcf17f9bec4a27db5d1c65b04ab5d71acc527fab0442.scope - libcontainer container 66060a88ec6c20c6e074fcf17f9bec4a27db5d1c65b04ab5d71acc527fab0442. May 27 17:05:40.092357 containerd[1514]: time="2025-05-27T17:05:40.092314575Z" level=info msg="StartContainer for \"66060a88ec6c20c6e074fcf17f9bec4a27db5d1c65b04ab5d71acc527fab0442\" returns successfully" May 27 17:05:40.371761 containerd[1514]: time="2025-05-27T17:05:40.371649556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gv5xq,Uid:b52118ae-5b8b-4a0f-8f30-dd827acc27ac,Namespace:kube-system,Attempt:0,}" May 27 17:05:40.372316 containerd[1514]: time="2025-05-27T17:05:40.371719596Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c66494f6d-gc4dc,Uid:2f1591b1-a33b-436b-97d2-a3ba5959f3d8,Namespace:calico-apiserver,Attempt:0,}" May 27 17:05:40.513596 systemd-networkd[1427]: caliad93e7cd672: Gained IPv6LL May 27 17:05:40.561625 systemd-networkd[1427]: cali536512e8580: Link UP May 27 17:05:40.561832 systemd-networkd[1427]: cali536512e8580: Gained carrier May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.443 [INFO][4535] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0 calico-apiserver-6c66494f6d- calico-apiserver 2f1591b1-a33b-436b-97d2-a3ba5959f3d8 845 0 2025-05-27 17:05:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6c66494f6d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-0-0-0-39ed1690e8 calico-apiserver-6c66494f6d-gc4dc eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali536512e8580 [] [] }} ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Namespace="calico-apiserver" 
Pod="calico-apiserver-6c66494f6d-gc4dc" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-" May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.444 [INFO][4535] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Namespace="calico-apiserver" Pod="calico-apiserver-6c66494f6d-gc4dc" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.485 [INFO][4558] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" HandleID="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.485 [INFO][4558] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" HandleID="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400022f240), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-0-0-0-39ed1690e8", "pod":"calico-apiserver-6c66494f6d-gc4dc", "timestamp":"2025-05-27 17:05:40.485445692 +0000 UTC"}, Hostname:"ci-4344-0-0-0-39ed1690e8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.485 [INFO][4558] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.486 [INFO][4558] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.486 [INFO][4558] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-0-39ed1690e8' May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.500 [INFO][4558] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.510 [INFO][4558] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.520 [INFO][4558] ipam/ipam.go 511: Trying affinity for 192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.522 [INFO][4558] ipam/ipam.go 158: Attempting to load block cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.526 [INFO][4558] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.526 [INFO][4558] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.108.64/26 handle="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.530 [INFO][4558] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.537 [INFO][4558] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.108.64/26 handle="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.549 [INFO][4558] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.108.70/26] block=192.168.108.64/26 handle="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.549 [INFO][4558] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.108.70/26] handle="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.549 [INFO][4558] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
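Editor's note: these CNI lines embed their identifiers as key="value" pairs, which makes them easy to post-process when auditing a node. A small, self-contained Go sketch (a hypothetical helper, not part of any tool shown in this log) that pulls ContainerID/Namespace/Pod out of such a line; the sample line is taken from the coredns-gsfg7 entry earlier in this section:

```go
package main

import (
	"fmt"
	"regexp"
)

// Matches key="value" pairs as they appear in the Calico CNI log lines above.
var kv = regexp.MustCompile(`(\w+)="([^"]*)"`)

func main() {
	line := `cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ` +
		`ContainerID="32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c" ` +
		`Namespace="kube-system" Pod="coredns-674b8bbfcf-gsfg7"`

	for _, m := range kv.FindAllStringSubmatch(line, -1) {
		fmt.Printf("%-12s %s\n", m[1], m[2])
	}
}
```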
May 27 17:05:40.584800 containerd[1514]: 2025-05-27 17:05:40.549 [INFO][4558] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.108.70/26] IPv6=[] ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" HandleID="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:40.585945 containerd[1514]: 2025-05-27 17:05:40.554 [INFO][4535] cni-plugin/k8s.go 418: Populated endpoint ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Namespace="calico-apiserver" Pod="calico-apiserver-6c66494f6d-gc4dc" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0", GenerateName:"calico-apiserver-6c66494f6d-", Namespace:"calico-apiserver", SelfLink:"", UID:"2f1591b1-a33b-436b-97d2-a3ba5959f3d8", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 5, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c66494f6d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"", Pod:"calico-apiserver-6c66494f6d-gc4dc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.108.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali536512e8580", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:40.585945 containerd[1514]: 2025-05-27 17:05:40.554 [INFO][4535] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.108.70/32] ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Namespace="calico-apiserver" Pod="calico-apiserver-6c66494f6d-gc4dc" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:40.585945 containerd[1514]: 2025-05-27 17:05:40.554 [INFO][4535] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali536512e8580 ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Namespace="calico-apiserver" Pod="calico-apiserver-6c66494f6d-gc4dc" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:40.585945 containerd[1514]: 2025-05-27 17:05:40.562 [INFO][4535] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Namespace="calico-apiserver" Pod="calico-apiserver-6c66494f6d-gc4dc" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:40.585945 containerd[1514]: 2025-05-27 17:05:40.563 
[INFO][4535] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Namespace="calico-apiserver" Pod="calico-apiserver-6c66494f6d-gc4dc" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0", GenerateName:"calico-apiserver-6c66494f6d-", Namespace:"calico-apiserver", SelfLink:"", UID:"2f1591b1-a33b-436b-97d2-a3ba5959f3d8", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 5, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c66494f6d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d", Pod:"calico-apiserver-6c66494f6d-gc4dc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.108.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali536512e8580", MAC:"ae:7c:3b:9a:bf:9e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:40.585945 containerd[1514]: 2025-05-27 17:05:40.583 [INFO][4535] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Namespace="calico-apiserver" Pod="calico-apiserver-6c66494f6d-gc4dc" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:40.632015 containerd[1514]: time="2025-05-27T17:05:40.631891742Z" level=info msg="connecting to shim 37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" address="unix:///run/containerd/s/ec7575854fc711e3ec577923a1ec42e75398a048e86190f2fe612835d3fc1cad" namespace=k8s.io protocol=ttrpc version=3 May 27 17:05:40.678928 kubelet[2795]: I0527 17:05:40.678851 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-gsfg7" podStartSLOduration=42.678831972 podStartE2EDuration="42.678831972s" podCreationTimestamp="2025-05-27 17:04:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:05:40.677087532 +0000 UTC m=+49.450821021" watchObservedRunningTime="2025-05-27 17:05:40.678831972 +0000 UTC m=+49.452565421" May 27 17:05:40.695288 systemd-networkd[1427]: cali2c29565e1d3: Link UP May 27 17:05:40.697500 systemd-networkd[1427]: cali2c29565e1d3: Gained carrier May 27 17:05:40.708614 systemd[1]: Started cri-containerd-37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d.scope - libcontainer container 
37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d. May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.447 [INFO][4533] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gv5xq-eth0 coredns-674b8bbfcf- kube-system b52118ae-5b8b-4a0f-8f30-dd827acc27ac 846 0 2025-05-27 17:04:58 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344-0-0-0-39ed1690e8 coredns-674b8bbfcf-gv5xq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali2c29565e1d3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" Namespace="kube-system" Pod="coredns-674b8bbfcf-gv5xq" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gv5xq-" May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.447 [INFO][4533] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" Namespace="kube-system" Pod="coredns-674b8bbfcf-gv5xq" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gv5xq-eth0" May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.500 [INFO][4560] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" HandleID="k8s-pod-network.0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" Workload="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gv5xq-eth0" May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.500 [INFO][4560] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" HandleID="k8s-pod-network.0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" Workload="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gv5xq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400037fee0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344-0-0-0-39ed1690e8", "pod":"coredns-674b8bbfcf-gv5xq", "timestamp":"2025-05-27 17:05:40.500497369 +0000 UTC"}, Hostname:"ci-4344-0-0-0-39ed1690e8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.500 [INFO][4560] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.549 [INFO][4560] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.549 [INFO][4560] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-0-39ed1690e8' May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.601 [INFO][4560] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.621 [INFO][4560] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.638 [INFO][4560] ipam/ipam.go 511: Trying affinity for 192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.641 [INFO][4560] ipam/ipam.go 158: Attempting to load block cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.651 [INFO][4560] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.652 [INFO][4560] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.108.64/26 handle="k8s-pod-network.0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.657 [INFO][4560] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448 May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.666 [INFO][4560] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.108.64/26 handle="k8s-pod-network.0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.681 [INFO][4560] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.108.71/26] block=192.168.108.64/26 handle="k8s-pod-network.0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.682 [INFO][4560] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.108.71/26] handle="k8s-pod-network.0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.682 [INFO][4560] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:05:40.729629 containerd[1514]: 2025-05-27 17:05:40.682 [INFO][4560] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.108.71/26] IPv6=[] ContainerID="0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" HandleID="k8s-pod-network.0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" Workload="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gv5xq-eth0" May 27 17:05:40.730169 containerd[1514]: 2025-05-27 17:05:40.687 [INFO][4533] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" Namespace="kube-system" Pod="coredns-674b8bbfcf-gv5xq" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gv5xq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gv5xq-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"b52118ae-5b8b-4a0f-8f30-dd827acc27ac", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 4, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"", Pod:"coredns-674b8bbfcf-gv5xq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.108.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2c29565e1d3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:40.730169 containerd[1514]: 2025-05-27 17:05:40.687 [INFO][4533] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.108.71/32] ContainerID="0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" Namespace="kube-system" Pod="coredns-674b8bbfcf-gv5xq" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gv5xq-eth0" May 27 17:05:40.730169 containerd[1514]: 2025-05-27 17:05:40.687 [INFO][4533] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2c29565e1d3 ContainerID="0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" Namespace="kube-system" Pod="coredns-674b8bbfcf-gv5xq" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gv5xq-eth0" May 27 17:05:40.730169 containerd[1514]: 2025-05-27 17:05:40.699 [INFO][4533] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-gv5xq" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gv5xq-eth0" May 27 17:05:40.730169 containerd[1514]: 2025-05-27 17:05:40.703 [INFO][4533] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" Namespace="kube-system" Pod="coredns-674b8bbfcf-gv5xq" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gv5xq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gv5xq-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"b52118ae-5b8b-4a0f-8f30-dd827acc27ac", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 4, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448", Pod:"coredns-674b8bbfcf-gv5xq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.108.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2c29565e1d3", MAC:"a2:ec:4e:49:58:1b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:40.730169 containerd[1514]: 2025-05-27 17:05:40.726 [INFO][4533] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" Namespace="kube-system" Pod="coredns-674b8bbfcf-gv5xq" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-coredns--674b8bbfcf--gv5xq-eth0" May 27 17:05:40.762171 containerd[1514]: time="2025-05-27T17:05:40.761738954Z" level=info msg="connecting to shim 0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448" address="unix:///run/containerd/s/459fd1bd66da72b52a5901b1f58733448fe30c332ecb8a7dd41ac178fa9f4d2c" namespace=k8s.io protocol=ttrpc version=3 May 27 17:05:40.796601 systemd[1]: Started cri-containerd-0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448.scope - libcontainer container 0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448. 
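Editor's note: the kubelet pod_startup_latency_tracker line above reports podStartSLOduration=42.678831972s for coredns-674b8bbfcf-gsfg7, which is consistent with watchObservedRunningTime (17:05:40.678831972) minus podCreationTimestamp (17:04:58). A quick Go check of that arithmetic using the timestamps from the log:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Go accepts a fractional second on parse even when the layout omits it.
	const layout = "2006-01-02 15:04:05 -0700 MST"

	created, err := time.Parse(layout, "2025-05-27 17:04:58 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-05-27 17:05:40.678831972 +0000 UTC")
	if err != nil {
		panic(err)
	}

	// Matches podStartSLOduration=42.678831972 reported by the kubelet.
	fmt.Println(observed.Sub(created)) // 42.678831972s
}
```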
May 27 17:05:40.850220 containerd[1514]: time="2025-05-27T17:05:40.850177775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c66494f6d-gc4dc,Uid:2f1591b1-a33b-436b-97d2-a3ba5959f3d8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d\"" May 27 17:05:40.865535 containerd[1514]: time="2025-05-27T17:05:40.865446372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gv5xq,Uid:b52118ae-5b8b-4a0f-8f30-dd827acc27ac,Namespace:kube-system,Attempt:0,} returns sandbox id \"0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448\"" May 27 17:05:40.872319 containerd[1514]: time="2025-05-27T17:05:40.871967891Z" level=info msg="CreateContainer within sandbox \"0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 17:05:40.887126 containerd[1514]: time="2025-05-27T17:05:40.887026888Z" level=info msg="Container 29a443f4ebca27f974b30a9d470e56fadbbdc220b17c9e4b81e25bc89b0b975c: CDI devices from CRI Config.CDIDevices: []" May 27 17:05:40.896422 containerd[1514]: time="2025-05-27T17:05:40.896349526Z" level=info msg="CreateContainer within sandbox \"0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"29a443f4ebca27f974b30a9d470e56fadbbdc220b17c9e4b81e25bc89b0b975c\"" May 27 17:05:40.898217 containerd[1514]: time="2025-05-27T17:05:40.898162925Z" level=info msg="StartContainer for \"29a443f4ebca27f974b30a9d470e56fadbbdc220b17c9e4b81e25bc89b0b975c\"" May 27 17:05:40.900778 containerd[1514]: time="2025-05-27T17:05:40.900712725Z" level=info msg="connecting to shim 29a443f4ebca27f974b30a9d470e56fadbbdc220b17c9e4b81e25bc89b0b975c" address="unix:///run/containerd/s/459fd1bd66da72b52a5901b1f58733448fe30c332ecb8a7dd41ac178fa9f4d2c" protocol=ttrpc version=3 May 27 17:05:40.921625 systemd[1]: Started cri-containerd-29a443f4ebca27f974b30a9d470e56fadbbdc220b17c9e4b81e25bc89b0b975c.scope - libcontainer container 29a443f4ebca27f974b30a9d470e56fadbbdc220b17c9e4b81e25bc89b0b975c. 
May 27 17:05:40.956700 containerd[1514]: time="2025-05-27T17:05:40.956651113Z" level=info msg="StartContainer for \"29a443f4ebca27f974b30a9d470e56fadbbdc220b17c9e4b81e25bc89b0b975c\" returns successfully" May 27 17:05:41.090586 systemd-networkd[1427]: cali35b32aad246: Gained IPv6LL May 27 17:05:41.282277 systemd-networkd[1427]: cali399ea6b371a: Gained IPv6LL May 27 17:05:41.346232 systemd-networkd[1427]: califb63e0e20ed: Gained IPv6LL May 27 17:05:41.373447 containerd[1514]: time="2025-05-27T17:05:41.373390305Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c66494f6d-72shk,Uid:f87305a5-4340-4e54-9029-d480976de92f,Namespace:calico-apiserver,Attempt:0,}" May 27 17:05:41.592422 systemd-networkd[1427]: cali447812e9c91: Link UP May 27 17:05:41.595638 systemd-networkd[1427]: cali447812e9c91: Gained carrier May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.424 [INFO][4719] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0 calico-apiserver-6c66494f6d- calico-apiserver f87305a5-4340-4e54-9029-d480976de92f 840 0 2025-05-27 17:05:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6c66494f6d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-0-0-0-39ed1690e8 calico-apiserver-6c66494f6d-72shk eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali447812e9c91 [] [] }} ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Namespace="calico-apiserver" Pod="calico-apiserver-6c66494f6d-72shk" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-" May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.424 [INFO][4719] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Namespace="calico-apiserver" Pod="calico-apiserver-6c66494f6d-72shk" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.478 [INFO][4731] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" HandleID="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.478 [INFO][4731] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" HandleID="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400022f290), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-0-0-0-39ed1690e8", "pod":"calico-apiserver-6c66494f6d-72shk", "timestamp":"2025-05-27 17:05:41.478391243 +0000 UTC"}, Hostname:"ci-4344-0-0-0-39ed1690e8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.478 [INFO][4731] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.478 [INFO][4731] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.478 [INFO][4731] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-0-39ed1690e8' May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.502 [INFO][4731] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.519 [INFO][4731] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.529 [INFO][4731] ipam/ipam.go 511: Trying affinity for 192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.534 [INFO][4731] ipam/ipam.go 158: Attempting to load block cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.540 [INFO][4731] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.540 [INFO][4731] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.108.64/26 handle="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.543 [INFO][4731] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.551 [INFO][4731] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.108.64/26 handle="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.579 [INFO][4731] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.108.72/26] block=192.168.108.64/26 handle="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.579 [INFO][4731] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.108.72/26] handle="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.579 [INFO][4731] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:05:41.631416 containerd[1514]: 2025-05-27 17:05:41.579 [INFO][4731] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.108.72/26] IPv6=[] ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" HandleID="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:05:41.632186 containerd[1514]: 2025-05-27 17:05:41.584 [INFO][4719] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Namespace="calico-apiserver" Pod="calico-apiserver-6c66494f6d-72shk" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0", GenerateName:"calico-apiserver-6c66494f6d-", Namespace:"calico-apiserver", SelfLink:"", UID:"f87305a5-4340-4e54-9029-d480976de92f", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 5, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c66494f6d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"", Pod:"calico-apiserver-6c66494f6d-72shk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.108.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali447812e9c91", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:41.632186 containerd[1514]: 2025-05-27 17:05:41.584 [INFO][4719] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.108.72/32] ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Namespace="calico-apiserver" Pod="calico-apiserver-6c66494f6d-72shk" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:05:41.632186 containerd[1514]: 2025-05-27 17:05:41.584 [INFO][4719] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali447812e9c91 ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Namespace="calico-apiserver" Pod="calico-apiserver-6c66494f6d-72shk" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:05:41.632186 containerd[1514]: 2025-05-27 17:05:41.600 [INFO][4719] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Namespace="calico-apiserver" Pod="calico-apiserver-6c66494f6d-72shk" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:05:41.632186 containerd[1514]: 2025-05-27 17:05:41.603 
[INFO][4719] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Namespace="calico-apiserver" Pod="calico-apiserver-6c66494f6d-72shk" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0", GenerateName:"calico-apiserver-6c66494f6d-", Namespace:"calico-apiserver", SelfLink:"", UID:"f87305a5-4340-4e54-9029-d480976de92f", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 5, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c66494f6d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa", Pod:"calico-apiserver-6c66494f6d-72shk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.108.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali447812e9c91", MAC:"9e:c0:26:80:71:a1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:41.632186 containerd[1514]: 2025-05-27 17:05:41.622 [INFO][4719] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Namespace="calico-apiserver" Pod="calico-apiserver-6c66494f6d-72shk" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:05:41.691413 containerd[1514]: time="2025-05-27T17:05:41.691346718Z" level=info msg="connecting to shim 0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" address="unix:///run/containerd/s/948439090525e03963f407129c6f66ea01062fe2186381f0b8f784b0c50458de" namespace=k8s.io protocol=ttrpc version=3 May 27 17:05:41.703291 kubelet[2795]: I0527 17:05:41.703043 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-gv5xq" podStartSLOduration=43.703023756 podStartE2EDuration="43.703023756s" podCreationTimestamp="2025-05-27 17:04:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:05:41.701983876 +0000 UTC m=+50.475717325" watchObservedRunningTime="2025-05-27 17:05:41.703023756 +0000 UTC m=+50.476757205" May 27 17:05:41.762701 systemd[1]: Started cri-containerd-0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa.scope - libcontainer container 0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa. 
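
The pod_startup_latency_tracker entry above for coredns-674b8bbfcf-gv5xq reports identical SLO and E2E durations because both pull timestamps are the zero time. The logged fields are consistent with the E2E figure being watchObservedRunningTime minus podCreationTimestamp, with the SLO figure additionally excluding any image-pull window; the sketch below reproduces that arithmetic from the timestamps in the entry (this interpretation of the fields is an inference from the log values, not kubelet source).

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
    	mustParse := func(s string) time.Time {
    		t, err := time.Parse(layout, s)
    		if err != nil {
    			panic(err)
    		}
    		return t
    	}

    	// Timestamps copied from the kubelet entry for coredns-674b8bbfcf-gv5xq.
    	created := mustParse("2025-05-27 17:04:58 +0000 UTC")
    	running := mustParse("2025-05-27 17:05:41.703023756 +0000 UTC")
    	pullStart := mustParse("0001-01-01 00:00:00 +0000 UTC") // zero time: image never pulled
    	pullEnd := mustParse("0001-01-01 00:00:00 +0000 UTC")

    	e2e := running.Sub(created)
    	pull := time.Duration(0)
    	if !pullStart.IsZero() {
    		pull = pullEnd.Sub(pullStart)
    	}
    	fmt.Println("podStartE2EDuration:", e2e)      // 43.703023756s, as logged
    	fmt.Println("podStartSLOduration:", e2e-pull) // equal here: no pull window to subtract
    }

The same relation holds for the later calico-apiserver entries, where the SLO duration is the E2E duration minus the firstStartedPulling-to-lastFinishedPulling window.
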
May 27 17:05:41.903804 containerd[1514]: time="2025-05-27T17:05:41.903630674Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c66494f6d-72shk,Uid:f87305a5-4340-4e54-9029-d480976de92f,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa\"" May 27 17:05:42.178156 systemd-networkd[1427]: cali2c29565e1d3: Gained IPv6LL May 27 17:05:42.305563 systemd-networkd[1427]: cali536512e8580: Gained IPv6LL May 27 17:05:42.381228 containerd[1514]: time="2025-05-27T17:05:42.380875725Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-lflfc,Uid:45d42809-0456-4761-94f0-815274f2dcfd,Namespace:calico-system,Attempt:0,}" May 27 17:05:42.649234 containerd[1514]: time="2025-05-27T17:05:42.649176886Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:42.651422 containerd[1514]: time="2025-05-27T17:05:42.651357653Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=44453213" May 27 17:05:42.653565 containerd[1514]: time="2025-05-27T17:05:42.653518301Z" level=info msg="ImageCreate event name:\"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:42.657186 containerd[1514]: time="2025-05-27T17:05:42.657069193Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:42.657814 containerd[1514]: time="2025-05-27T17:05:42.657781275Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"45822470\" in 3.908268295s" May 27 17:05:42.657814 containerd[1514]: time="2025-05-27T17:05:42.657815675Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\"" May 27 17:05:42.661781 containerd[1514]: time="2025-05-27T17:05:42.661701769Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\"" May 27 17:05:42.665982 containerd[1514]: time="2025-05-27T17:05:42.665013340Z" level=info msg="CreateContainer within sandbox \"55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 17:05:42.696347 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1613454105.mount: Deactivated successfully. 
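
The apiserver image pull above records both the bytes read and the elapsed time ("bytes read=44453213" and "in 3.908268295s"), which is enough to estimate average pull throughput. A short sketch using those two figures; the throughput number itself is a derived illustration, containerd does not log it.

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	// Figures copied from the containerd log entries above.
    	bytesRead := 44453213                              // "bytes read=44453213"
    	elapsed, err := time.ParseDuration("3.908268295s") // "... in 3.908268295s"
    	if err != nil {
    		panic(err)
    	}
    	mbps := float64(bytesRead) / elapsed.Seconds() / 1e6
    	fmt.Printf("pulled %d bytes in %s (~%.1f MB/s)\n", bytesRead, elapsed, mbps)
    }
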
May 27 17:05:42.698838 containerd[1514]: time="2025-05-27T17:05:42.698656215Z" level=info msg="Container 111be39bb411d19978efecf2178d33ea1690144ed1176e97d6d76dd95117aacb: CDI devices from CRI Config.CDIDevices: []" May 27 17:05:42.717736 containerd[1514]: time="2025-05-27T17:05:42.717406880Z" level=info msg="CreateContainer within sandbox \"55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"111be39bb411d19978efecf2178d33ea1690144ed1176e97d6d76dd95117aacb\"" May 27 17:05:42.718969 containerd[1514]: time="2025-05-27T17:05:42.718930125Z" level=info msg="StartContainer for \"111be39bb411d19978efecf2178d33ea1690144ed1176e97d6d76dd95117aacb\"" May 27 17:05:42.722055 containerd[1514]: time="2025-05-27T17:05:42.721986775Z" level=info msg="connecting to shim 111be39bb411d19978efecf2178d33ea1690144ed1176e97d6d76dd95117aacb" address="unix:///run/containerd/s/a2ecb6a5bc99cfbb11a744fa88b15b47d9db451e11b6083923278b65d42b8024" protocol=ttrpc version=3 May 27 17:05:42.764995 systemd-networkd[1427]: cali63ffe118a67: Link UP May 27 17:05:42.770906 systemd[1]: Started cri-containerd-111be39bb411d19978efecf2178d33ea1690144ed1176e97d6d76dd95117aacb.scope - libcontainer container 111be39bb411d19978efecf2178d33ea1690144ed1176e97d6d76dd95117aacb. May 27 17:05:42.776329 systemd-networkd[1427]: cali63ffe118a67: Gained carrier May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.494 [INFO][4812] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--0--39ed1690e8-k8s-goldmane--78d55f7ddc--lflfc-eth0 goldmane-78d55f7ddc- calico-system 45d42809-0456-4761-94f0-815274f2dcfd 842 0 2025-05-27 17:05:17 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:78d55f7ddc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4344-0-0-0-39ed1690e8 goldmane-78d55f7ddc-lflfc eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali63ffe118a67 [] [] }} ContainerID="c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" Namespace="calico-system" Pod="goldmane-78d55f7ddc-lflfc" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-goldmane--78d55f7ddc--lflfc-" May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.495 [INFO][4812] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" Namespace="calico-system" Pod="goldmane-78d55f7ddc-lflfc" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-goldmane--78d55f7ddc--lflfc-eth0" May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.611 [INFO][4826] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" HandleID="k8s-pod-network.c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" Workload="ci--4344--0--0--0--39ed1690e8-k8s-goldmane--78d55f7ddc--lflfc-eth0" May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.611 [INFO][4826] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" HandleID="k8s-pod-network.c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" Workload="ci--4344--0--0--0--39ed1690e8-k8s-goldmane--78d55f7ddc--lflfc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0x400004df00), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-0-0-0-39ed1690e8", "pod":"goldmane-78d55f7ddc-lflfc", "timestamp":"2025-05-27 17:05:42.611720477 +0000 UTC"}, Hostname:"ci-4344-0-0-0-39ed1690e8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.612 [INFO][4826] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.612 [INFO][4826] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.612 [INFO][4826] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-0-39ed1690e8' May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.639 [INFO][4826] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.649 [INFO][4826] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.668 [INFO][4826] ipam/ipam.go 511: Trying affinity for 192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.673 [INFO][4826] ipam/ipam.go 158: Attempting to load block cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.680 [INFO][4826] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.681 [INFO][4826] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.108.64/26 handle="k8s-pod-network.c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.689 [INFO][4826] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89 May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.714 [INFO][4826] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.108.64/26 handle="k8s-pod-network.c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.746 [INFO][4826] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.108.73/26] block=192.168.108.64/26 handle="k8s-pod-network.c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.746 [INFO][4826] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.108.73/26] handle="k8s-pod-network.c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.746 [INFO][4826] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:05:42.806988 containerd[1514]: 2025-05-27 17:05:42.746 [INFO][4826] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.108.73/26] IPv6=[] ContainerID="c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" HandleID="k8s-pod-network.c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" Workload="ci--4344--0--0--0--39ed1690e8-k8s-goldmane--78d55f7ddc--lflfc-eth0" May 27 17:05:42.808785 containerd[1514]: 2025-05-27 17:05:42.751 [INFO][4812] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" Namespace="calico-system" Pod="goldmane-78d55f7ddc-lflfc" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-goldmane--78d55f7ddc--lflfc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-goldmane--78d55f7ddc--lflfc-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"45d42809-0456-4761-94f0-815274f2dcfd", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 5, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"", Pod:"goldmane-78d55f7ddc-lflfc", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.108.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali63ffe118a67", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:42.808785 containerd[1514]: 2025-05-27 17:05:42.752 [INFO][4812] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.108.73/32] ContainerID="c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" Namespace="calico-system" Pod="goldmane-78d55f7ddc-lflfc" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-goldmane--78d55f7ddc--lflfc-eth0" May 27 17:05:42.808785 containerd[1514]: 2025-05-27 17:05:42.752 [INFO][4812] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali63ffe118a67 ContainerID="c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" Namespace="calico-system" Pod="goldmane-78d55f7ddc-lflfc" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-goldmane--78d55f7ddc--lflfc-eth0" May 27 17:05:42.808785 containerd[1514]: 2025-05-27 17:05:42.782 [INFO][4812] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" Namespace="calico-system" Pod="goldmane-78d55f7ddc-lflfc" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-goldmane--78d55f7ddc--lflfc-eth0" May 27 17:05:42.808785 containerd[1514]: 2025-05-27 17:05:42.784 [INFO][4812] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" 
Namespace="calico-system" Pod="goldmane-78d55f7ddc-lflfc" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-goldmane--78d55f7ddc--lflfc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-goldmane--78d55f7ddc--lflfc-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"45d42809-0456-4761-94f0-815274f2dcfd", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 5, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89", Pod:"goldmane-78d55f7ddc-lflfc", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.108.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali63ffe118a67", MAC:"7a:77:6e:d6:01:37", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:42.808785 containerd[1514]: 2025-05-27 17:05:42.802 [INFO][4812] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" Namespace="calico-system" Pod="goldmane-78d55f7ddc-lflfc" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-goldmane--78d55f7ddc--lflfc-eth0" May 27 17:05:42.851116 containerd[1514]: time="2025-05-27T17:05:42.851045538Z" level=info msg="connecting to shim c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89" address="unix:///run/containerd/s/790c0176a61ff810ad5050ec5c56ed750f681bcd4e4167bd00fde7c30fbecc88" namespace=k8s.io protocol=ttrpc version=3 May 27 17:05:42.893729 containerd[1514]: time="2025-05-27T17:05:42.891648678Z" level=info msg="StartContainer for \"111be39bb411d19978efecf2178d33ea1690144ed1176e97d6d76dd95117aacb\" returns successfully" May 27 17:05:42.908901 systemd[1]: Started cri-containerd-c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89.scope - libcontainer container c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89. 
May 27 17:05:43.013325 containerd[1514]: time="2025-05-27T17:05:43.013276191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-lflfc,Uid:45d42809-0456-4761-94f0-815274f2dcfd,Namespace:calico-system,Attempt:0,} returns sandbox id \"c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89\"" May 27 17:05:43.201771 systemd-networkd[1427]: cali447812e9c91: Gained IPv6LL May 27 17:05:43.737672 kubelet[2795]: I0527 17:05:43.737433 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5f784fdc78-tq7f7" podStartSLOduration=26.82491256 podStartE2EDuration="30.737416421s" podCreationTimestamp="2025-05-27 17:05:13 +0000 UTC" firstStartedPulling="2025-05-27 17:05:38.747169741 +0000 UTC m=+47.520903190" lastFinishedPulling="2025-05-27 17:05:42.659673602 +0000 UTC m=+51.433407051" observedRunningTime="2025-05-27 17:05:43.736979166 +0000 UTC m=+52.510712655" watchObservedRunningTime="2025-05-27 17:05:43.737416421 +0000 UTC m=+52.511149870" May 27 17:05:44.225574 systemd-networkd[1427]: cali63ffe118a67: Gained IPv6LL May 27 17:05:44.713913 kubelet[2795]: I0527 17:05:44.713807 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:05:47.347293 containerd[1514]: time="2025-05-27T17:05:47.346778682Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:47.348933 containerd[1514]: time="2025-05-27T17:05:47.348897468Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.0: active requests=0, bytes read=48045219" May 27 17:05:47.350703 containerd[1514]: time="2025-05-27T17:05:47.350671882Z" level=info msg="ImageCreate event name:\"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:47.353276 containerd[1514]: time="2025-05-27T17:05:47.353217601Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:47.354205 containerd[1514]: time="2025-05-27T17:05:47.354171470Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" with image id \"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\", size \"49414428\" in 4.692426621s" May 27 17:05:47.354275 containerd[1514]: time="2025-05-27T17:05:47.354207431Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" returns image reference \"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\"" May 27 17:05:47.357741 containerd[1514]: time="2025-05-27T17:05:47.357695059Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\"" May 27 17:05:47.373927 containerd[1514]: time="2025-05-27T17:05:47.373888957Z" level=info msg="CreateContainer within sandbox \"fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 27 17:05:47.383389 containerd[1514]: time="2025-05-27T17:05:47.382526623Z" level=info msg="Container 6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a: 
CDI devices from CRI Config.CDIDevices: []" May 27 17:05:47.405328 containerd[1514]: time="2025-05-27T17:05:47.405266564Z" level=info msg="CreateContainer within sandbox \"fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\"" May 27 17:05:47.407111 containerd[1514]: time="2025-05-27T17:05:47.406064348Z" level=info msg="StartContainer for \"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\"" May 27 17:05:47.407419 containerd[1514]: time="2025-05-27T17:05:47.407350788Z" level=info msg="connecting to shim 6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a" address="unix:///run/containerd/s/605bb2cf6f6833940c1c875ad2d37501fd44e0fc972a9b6d901278aa8c5e3b4c" protocol=ttrpc version=3 May 27 17:05:47.430621 systemd[1]: Started cri-containerd-6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a.scope - libcontainer container 6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a. May 27 17:05:47.485409 containerd[1514]: time="2025-05-27T17:05:47.485295389Z" level=info msg="StartContainer for \"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" returns successfully" May 27 17:05:47.787443 containerd[1514]: time="2025-05-27T17:05:47.787153925Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" id:\"9b2756c03b0629e5434cd791ab6a12632481b2072836220dcfbb38b4dd0a0467\" pid:4992 exited_at:{seconds:1748365547 nanos:779992185}" May 27 17:05:47.829158 kubelet[2795]: I0527 17:05:47.828709 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-65b8755655-qrgpn" podStartSLOduration=24.374781004 podStartE2EDuration="31.820030498s" podCreationTimestamp="2025-05-27 17:05:16 +0000 UTC" firstStartedPulling="2025-05-27 17:05:39.910152734 +0000 UTC m=+48.683886183" lastFinishedPulling="2025-05-27 17:05:47.355402228 +0000 UTC m=+56.129135677" observedRunningTime="2025-05-27 17:05:47.751424465 +0000 UTC m=+56.525157914" watchObservedRunningTime="2025-05-27 17:05:47.820030498 +0000 UTC m=+56.593763907" May 27 17:05:48.972254 containerd[1514]: time="2025-05-27T17:05:48.971690592Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:48.973250 containerd[1514]: time="2025-05-27T17:05:48.973217839Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.0: active requests=0, bytes read=8226240" May 27 17:05:48.973931 containerd[1514]: time="2025-05-27T17:05:48.973901859Z" level=info msg="ImageCreate event name:\"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:48.977022 containerd[1514]: time="2025-05-27T17:05:48.976958232Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:48.977540 containerd[1514]: time="2025-05-27T17:05:48.977510409Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.0\" with image id \"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.0\", repo digest 
\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\", size \"9595481\" in 1.619768309s" May 27 17:05:48.977611 containerd[1514]: time="2025-05-27T17:05:48.977549490Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\" returns image reference \"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\"" May 27 17:05:48.980352 containerd[1514]: time="2025-05-27T17:05:48.980277133Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 17:05:48.987246 containerd[1514]: time="2025-05-27T17:05:48.987205583Z" level=info msg="CreateContainer within sandbox \"becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 27 17:05:49.019838 containerd[1514]: time="2025-05-27T17:05:49.017025202Z" level=info msg="Container 006734d9f66075082cb4e7a2a1a95f105d3a8ce39643488ff52ceaa8b38b2072: CDI devices from CRI Config.CDIDevices: []" May 27 17:05:49.021810 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount883702521.mount: Deactivated successfully. May 27 17:05:49.041565 containerd[1514]: time="2025-05-27T17:05:49.041207206Z" level=info msg="CreateContainer within sandbox \"becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"006734d9f66075082cb4e7a2a1a95f105d3a8ce39643488ff52ceaa8b38b2072\"" May 27 17:05:49.043167 containerd[1514]: time="2025-05-27T17:05:49.042970859Z" level=info msg="StartContainer for \"006734d9f66075082cb4e7a2a1a95f105d3a8ce39643488ff52ceaa8b38b2072\"" May 27 17:05:49.047816 containerd[1514]: time="2025-05-27T17:05:49.047708681Z" level=info msg="connecting to shim 006734d9f66075082cb4e7a2a1a95f105d3a8ce39643488ff52ceaa8b38b2072" address="unix:///run/containerd/s/a57bc0442eb34101f399260daa537eeba327f6921f02372db676a2bfdb06a5af" protocol=ttrpc version=3 May 27 17:05:49.071650 systemd[1]: Started cri-containerd-006734d9f66075082cb4e7a2a1a95f105d3a8ce39643488ff52ceaa8b38b2072.scope - libcontainer container 006734d9f66075082cb4e7a2a1a95f105d3a8ce39643488ff52ceaa8b38b2072. May 27 17:05:49.153421 kubelet[2795]: I0527 17:05:49.153308 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:05:49.164399 containerd[1514]: time="2025-05-27T17:05:49.164335054Z" level=info msg="StartContainer for \"006734d9f66075082cb4e7a2a1a95f105d3a8ce39643488ff52ceaa8b38b2072\" returns successfully" May 27 17:05:49.315513 systemd[1]: Created slice kubepods-besteffort-podc1937d37_9228_41a5_ae38_e286e2e97c48.slice - libcontainer container kubepods-besteffort-podc1937d37_9228_41a5_ae38_e286e2e97c48.slice. 
May 27 17:05:49.375183 containerd[1514]: time="2025-05-27T17:05:49.375141008Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:49.376940 containerd[1514]: time="2025-05-27T17:05:49.376882060Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=77" May 27 17:05:49.379537 containerd[1514]: time="2025-05-27T17:05:49.379492258Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"45822470\" in 399.179604ms" May 27 17:05:49.379802 containerd[1514]: time="2025-05-27T17:05:49.379690504Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\"" May 27 17:05:49.381549 containerd[1514]: time="2025-05-27T17:05:49.381472038Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 17:05:49.386077 containerd[1514]: time="2025-05-27T17:05:49.386039734Z" level=info msg="CreateContainer within sandbox \"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 17:05:49.402372 containerd[1514]: time="2025-05-27T17:05:49.399018723Z" level=info msg="Container 695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba: CDI devices from CRI Config.CDIDevices: []" May 27 17:05:49.412721 containerd[1514]: time="2025-05-27T17:05:49.412671932Z" level=info msg="CreateContainer within sandbox \"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\"" May 27 17:05:49.413571 containerd[1514]: time="2025-05-27T17:05:49.413542958Z" level=info msg="StartContainer for \"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\"" May 27 17:05:49.415053 kubelet[2795]: I0527 17:05:49.415004 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c1937d37-9228-41a5-ae38-e286e2e97c48-calico-apiserver-certs\") pod \"calico-apiserver-5f784fdc78-tldmp\" (UID: \"c1937d37-9228-41a5-ae38-e286e2e97c48\") " pod="calico-apiserver/calico-apiserver-5f784fdc78-tldmp" May 27 17:05:49.415053 kubelet[2795]: I0527 17:05:49.415055 2795 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lb8k\" (UniqueName: \"kubernetes.io/projected/c1937d37-9228-41a5-ae38-e286e2e97c48-kube-api-access-6lb8k\") pod \"calico-apiserver-5f784fdc78-tldmp\" (UID: \"c1937d37-9228-41a5-ae38-e286e2e97c48\") " pod="calico-apiserver/calico-apiserver-5f784fdc78-tldmp" May 27 17:05:49.417049 containerd[1514]: time="2025-05-27T17:05:49.416330002Z" level=info msg="connecting to shim 695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba" address="unix:///run/containerd/s/ec7575854fc711e3ec577923a1ec42e75398a048e86190f2fe612835d3fc1cad" protocol=ttrpc version=3 May 27 17:05:49.464562 systemd[1]: Started 
cri-containerd-695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba.scope - libcontainer container 695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba. May 27 17:05:49.533257 containerd[1514]: time="2025-05-27T17:05:49.533167061Z" level=info msg="StartContainer for \"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\" returns successfully" May 27 17:05:49.623467 containerd[1514]: time="2025-05-27T17:05:49.623062113Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f784fdc78-tldmp,Uid:c1937d37-9228-41a5-ae38-e286e2e97c48,Namespace:calico-apiserver,Attempt:0,}" May 27 17:05:49.744996 containerd[1514]: time="2025-05-27T17:05:49.744785959Z" level=info msg="StopContainer for \"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\" with timeout 30 (s)" May 27 17:05:49.746833 containerd[1514]: time="2025-05-27T17:05:49.746040717Z" level=info msg="Stop container \"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\" with signal terminated" May 27 17:05:49.763824 kubelet[2795]: I0527 17:05:49.763514 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6c66494f6d-gc4dc" podStartSLOduration=30.235798475 podStartE2EDuration="38.763492279s" podCreationTimestamp="2025-05-27 17:05:11 +0000 UTC" firstStartedPulling="2025-05-27 17:05:40.853159935 +0000 UTC m=+49.626893384" lastFinishedPulling="2025-05-27 17:05:49.380853779 +0000 UTC m=+58.154587188" observedRunningTime="2025-05-27 17:05:49.761145209 +0000 UTC m=+58.534878658" watchObservedRunningTime="2025-05-27 17:05:49.763492279 +0000 UTC m=+58.537225728" May 27 17:05:49.783611 systemd[1]: cri-containerd-695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba.scope: Deactivated successfully. 
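
The StopContainer sequence above uses a 30-second timeout and the terminated signal, after which the container exits with status 1 and its cri-containerd scope is deactivated. Below is a generic sketch of that stop-with-timeout pattern against a stand-in child process; it mirrors the flow visible in the log, not containerd's task service.

    package main

    import (
    	"fmt"
    	"os/exec"
    	"syscall"
    	"time"
    )

    // stopWithTimeout mirrors the logged flow: deliver SIGTERM ("signal
    // terminated"), wait up to the timeout, and escalate to SIGKILL if the
    // process has not exited by then.
    func stopWithTimeout(cmd *exec.Cmd, timeout time.Duration) error {
    	_ = cmd.Process.Signal(syscall.SIGTERM)

    	done := make(chan error, 1)
    	go func() { done <- cmd.Wait() }()

    	select {
    	case err := <-done:
    		return err // exited on its own, as with exit_status:1 above
    	case <-time.After(timeout):
    		_ = cmd.Process.Kill()
    		return <-done
    	}
    }

    func main() {
    	cmd := exec.Command("sleep", "300") // stand-in workload
    	if err := cmd.Start(); err != nil {
    		panic(err)
    	}
    	fmt.Println("stop result:", stopWithTimeout(cmd, 30*time.Second))
    }
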
May 27 17:05:49.792706 containerd[1514]: time="2025-05-27T17:05:49.792661073Z" level=info msg="received exit event container_id:\"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\" id:\"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\" pid:5054 exit_status:1 exited_at:{seconds:1748365549 nanos:792026814}" May 27 17:05:49.794172 containerd[1514]: time="2025-05-27T17:05:49.794096236Z" level=info msg="TaskExit event in podsandbox handler container_id:\"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\" id:\"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\" pid:5054 exit_status:1 exited_at:{seconds:1748365549 nanos:792026814}" May 27 17:05:49.807341 containerd[1514]: time="2025-05-27T17:05:49.807244110Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:49.809806 containerd[1514]: time="2025-05-27T17:05:49.808710834Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=77" May 27 17:05:49.816382 containerd[1514]: time="2025-05-27T17:05:49.816319982Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"45822470\" in 434.771462ms" May 27 17:05:49.816382 containerd[1514]: time="2025-05-27T17:05:49.816384703Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\"" May 27 17:05:49.821304 containerd[1514]: time="2025-05-27T17:05:49.821160766Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:05:49.828057 containerd[1514]: time="2025-05-27T17:05:49.827954770Z" level=info msg="CreateContainer within sandbox \"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 17:05:49.836659 systemd-networkd[1427]: cali067952100d5: Link UP May 27 17:05:49.839021 systemd-networkd[1427]: cali067952100d5: Gained carrier May 27 17:05:49.862610 containerd[1514]: time="2025-05-27T17:05:49.862564087Z" level=info msg="Container 1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122: CDI devices from CRI Config.CDIDevices: []" May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.675 [INFO][5080] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tldmp-eth0 calico-apiserver-5f784fdc78- calico-apiserver c1937d37-9228-41a5-ae38-e286e2e97c48 1055 0 2025-05-27 17:05:49 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5f784fdc78 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-0-0-0-39ed1690e8 calico-apiserver-5f784fdc78-tldmp eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali067952100d5 [] [] }} ContainerID="5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" Namespace="calico-apiserver" 
Pod="calico-apiserver-5f784fdc78-tldmp" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tldmp-" May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.675 [INFO][5080] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" Namespace="calico-apiserver" Pod="calico-apiserver-5f784fdc78-tldmp" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tldmp-eth0" May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.711 [INFO][5091] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" HandleID="k8s-pod-network.5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tldmp-eth0" May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.711 [INFO][5091] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" HandleID="k8s-pod-network.5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tldmp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002a8540), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-0-0-0-39ed1690e8", "pod":"calico-apiserver-5f784fdc78-tldmp", "timestamp":"2025-05-27 17:05:49.711157472 +0000 UTC"}, Hostname:"ci-4344-0-0-0-39ed1690e8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.711 [INFO][5091] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.711 [INFO][5091] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.711 [INFO][5091] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-0-39ed1690e8' May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.728 [INFO][5091] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.740 [INFO][5091] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.760 [INFO][5091] ipam/ipam.go 511: Trying affinity for 192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.772 [INFO][5091] ipam/ipam.go 158: Attempting to load block cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.779 [INFO][5091] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.108.64/26 host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.779 [INFO][5091] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.108.64/26 handle="k8s-pod-network.5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.782 [INFO][5091] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11 May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.797 [INFO][5091] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.108.64/26 handle="k8s-pod-network.5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.823 [INFO][5091] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.108.74/26] block=192.168.108.64/26 handle="k8s-pod-network.5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.823 [INFO][5091] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.108.74/26] handle="k8s-pod-network.5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" host="ci-4344-0-0-0-39ed1690e8" May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.823 [INFO][5091] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:05:49.867298 containerd[1514]: 2025-05-27 17:05:49.823 [INFO][5091] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.108.74/26] IPv6=[] ContainerID="5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" HandleID="k8s-pod-network.5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tldmp-eth0" May 27 17:05:49.869444 containerd[1514]: 2025-05-27 17:05:49.832 [INFO][5080] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" Namespace="calico-apiserver" Pod="calico-apiserver-5f784fdc78-tldmp" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tldmp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tldmp-eth0", GenerateName:"calico-apiserver-5f784fdc78-", Namespace:"calico-apiserver", SelfLink:"", UID:"c1937d37-9228-41a5-ae38-e286e2e97c48", ResourceVersion:"1055", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 5, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f784fdc78", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"", Pod:"calico-apiserver-5f784fdc78-tldmp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.108.74/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali067952100d5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:49.869444 containerd[1514]: 2025-05-27 17:05:49.832 [INFO][5080] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.108.74/32] ContainerID="5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" Namespace="calico-apiserver" Pod="calico-apiserver-5f784fdc78-tldmp" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tldmp-eth0" May 27 17:05:49.869444 containerd[1514]: 2025-05-27 17:05:49.832 [INFO][5080] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali067952100d5 ContainerID="5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" Namespace="calico-apiserver" Pod="calico-apiserver-5f784fdc78-tldmp" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tldmp-eth0" May 27 17:05:49.869444 containerd[1514]: 2025-05-27 17:05:49.838 [INFO][5080] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" Namespace="calico-apiserver" Pod="calico-apiserver-5f784fdc78-tldmp" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tldmp-eth0" May 27 17:05:49.869444 containerd[1514]: 2025-05-27 
17:05:49.839 [INFO][5080] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" Namespace="calico-apiserver" Pod="calico-apiserver-5f784fdc78-tldmp" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tldmp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tldmp-eth0", GenerateName:"calico-apiserver-5f784fdc78-", Namespace:"calico-apiserver", SelfLink:"", UID:"c1937d37-9228-41a5-ae38-e286e2e97c48", ResourceVersion:"1055", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 5, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f784fdc78", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-0-39ed1690e8", ContainerID:"5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11", Pod:"calico-apiserver-5f784fdc78-tldmp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.108.74/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali067952100d5", MAC:"12:08:01:9b:be:55", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:05:49.869444 containerd[1514]: 2025-05-27 17:05:49.861 [INFO][5080] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" Namespace="calico-apiserver" Pod="calico-apiserver-5f784fdc78-tldmp" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--5f784fdc78--tldmp-eth0" May 27 17:05:49.919240 containerd[1514]: time="2025-05-27T17:05:49.919068379Z" level=info msg="CreateContainer within sandbox \"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\"" May 27 17:05:49.924714 containerd[1514]: time="2025-05-27T17:05:49.924657386Z" level=info msg="StartContainer for \"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\"" May 27 17:05:49.927873 containerd[1514]: time="2025-05-27T17:05:49.927766079Z" level=info msg="connecting to shim 1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122" address="unix:///run/containerd/s/948439090525e03963f407129c6f66ea01062fe2186381f0b8f784b0c50458de" protocol=ttrpc version=3 May 27 17:05:49.956617 systemd[1]: Started cri-containerd-1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122.scope - libcontainer container 1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122. 
May 27 17:05:50.045604 containerd[1514]: time="2025-05-27T17:05:50.045443185Z" level=info msg="StopContainer for \"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\" returns successfully" May 27 17:05:50.047297 containerd[1514]: time="2025-05-27T17:05:50.047208677Z" level=info msg="StopPodSandbox for \"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d\"" May 27 17:05:50.047297 containerd[1514]: time="2025-05-27T17:05:50.047307680Z" level=info msg="Container to stop \"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 27 17:05:50.050412 containerd[1514]: time="2025-05-27T17:05:50.048181946Z" level=info msg="StartContainer for \"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\" returns successfully" May 27 17:05:50.065717 containerd[1514]: time="2025-05-27T17:05:50.065655902Z" level=info msg="connecting to shim 5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11" address="unix:///run/containerd/s/398927837c9cc053d228dc5d3b4b878169cb078150d9d8e6af264939aac36c4a" namespace=k8s.io protocol=ttrpc version=3 May 27 17:05:50.073276 containerd[1514]: time="2025-05-27T17:05:50.073232366Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:05:50.075396 containerd[1514]: time="2025-05-27T17:05:50.075326228Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:05:50.075710 containerd[1514]: time="2025-05-27T17:05:50.075684718Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:05:50.077154 kubelet[2795]: E0527 17:05:50.076436 2795 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:05:50.077154 kubelet[2795]: E0527 17:05:50.076502 2795 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:05:50.077154 kubelet[2795]: E0527 17:05:50.076724 2795 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2mqjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-lflfc_calico-system(45d42809-0456-4761-94f0-815274f2dcfd): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:05:50.078052 containerd[1514]: time="2025-05-27T17:05:50.077814181Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:05:50.078133 kubelet[2795]: E0527 17:05:50.077912 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:05:50.101694 systemd[1]: Started cri-containerd-5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11.scope - libcontainer container 5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11. May 27 17:05:50.105781 systemd[1]: cri-containerd-37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d.scope: Deactivated successfully. May 27 17:05:50.111978 systemd[1]: cri-containerd-37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d.scope: Consumed 31ms CPU time, 4.4M memory peak, 1.5M read from disk. May 27 17:05:50.114376 containerd[1514]: time="2025-05-27T17:05:50.114320780Z" level=info msg="TaskExit event in podsandbox handler container_id:\"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d\" id:\"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d\" pid:4617 exit_status:137 exited_at:{seconds:1748365550 nanos:112529567}" May 27 17:05:50.182406 containerd[1514]: time="2025-05-27T17:05:50.182190464Z" level=info msg="shim disconnected" id=37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d namespace=k8s.io May 27 17:05:50.184069 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d-rootfs.mount: Deactivated successfully. May 27 17:05:50.186201 containerd[1514]: time="2025-05-27T17:05:50.184890544Z" level=warning msg="cleaning up after shim disconnected" id=37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d namespace=k8s.io May 27 17:05:50.186201 containerd[1514]: time="2025-05-27T17:05:50.185899094Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 27 17:05:50.189596 containerd[1514]: time="2025-05-27T17:05:50.189477760Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f784fdc78-tldmp,Uid:c1937d37-9228-41a5-ae38-e286e2e97c48,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11\"" May 27 17:05:50.202674 containerd[1514]: time="2025-05-27T17:05:50.202446703Z" level=info msg="CreateContainer within sandbox \"5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 17:05:50.236257 containerd[1514]: time="2025-05-27T17:05:50.236062616Z" level=info msg="Container 7abfe26d7cbf26332ce145533b269156b54846461a143fa7687d6dd0b3126632: CDI devices from CRI Config.CDIDevices: []" May 27 17:05:50.279746 containerd[1514]: time="2025-05-27T17:05:50.279614062Z" level=info msg="CreateContainer within sandbox \"5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"7abfe26d7cbf26332ce145533b269156b54846461a143fa7687d6dd0b3126632\"" May 27 17:05:50.281892 containerd[1514]: time="2025-05-27T17:05:50.281684043Z" level=info msg="StartContainer for \"7abfe26d7cbf26332ce145533b269156b54846461a143fa7687d6dd0b3126632\"" May 27 17:05:50.284381 containerd[1514]: time="2025-05-27T17:05:50.283238969Z" level=info msg="connecting to shim 7abfe26d7cbf26332ce145533b269156b54846461a143fa7687d6dd0b3126632" 
address="unix:///run/containerd/s/398927837c9cc053d228dc5d3b4b878169cb078150d9d8e6af264939aac36c4a" protocol=ttrpc version=3 May 27 17:05:50.291178 containerd[1514]: time="2025-05-27T17:05:50.290955757Z" level=info msg="received exit event sandbox_id:\"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d\" exit_status:137 exited_at:{seconds:1748365550 nanos:112529567}" May 27 17:05:50.354626 systemd[1]: Started cri-containerd-7abfe26d7cbf26332ce145533b269156b54846461a143fa7687d6dd0b3126632.scope - libcontainer container 7abfe26d7cbf26332ce145533b269156b54846461a143fa7687d6dd0b3126632. May 27 17:05:50.408692 systemd-networkd[1427]: cali536512e8580: Link DOWN May 27 17:05:50.408699 systemd-networkd[1427]: cali536512e8580: Lost carrier May 27 17:05:50.412395 containerd[1514]: time="2025-05-27T17:05:50.412174457Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:05:50.415068 containerd[1514]: time="2025-05-27T17:05:50.414955979Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:05:50.415296 containerd[1514]: time="2025-05-27T17:05:50.415189746Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:05:50.415943 kubelet[2795]: E0527 17:05:50.415883 2795 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:05:50.417353 kubelet[2795]: E0527 17:05:50.416331 2795 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:05:50.417353 kubelet[2795]: E0527 17:05:50.417047 2795 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:46cc42382a97413a9376060056363be4,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w7px9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78c6556c4f-rjlzx_calico-system(432a2b34-eaf5-4f72-a2b6-f15f78b36b83): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:05:50.417519 containerd[1514]: time="2025-05-27T17:05:50.416770353Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\"" May 27 17:05:50.485524 containerd[1514]: time="2025-05-27T17:05:50.484663238Z" level=info msg="StartContainer for \"7abfe26d7cbf26332ce145533b269156b54846461a143fa7687d6dd0b3126632\" returns successfully" May 27 17:05:50.551463 containerd[1514]: 2025-05-27 17:05:50.404 [INFO][5260] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" May 27 17:05:50.551463 containerd[1514]: 2025-05-27 17:05:50.404 [INFO][5260] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" iface="eth0" netns="/var/run/netns/cni-28c0dc88-27b7-8e32-a284-93e98c2183d7" May 27 17:05:50.551463 containerd[1514]: 2025-05-27 17:05:50.405 [INFO][5260] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" iface="eth0" netns="/var/run/netns/cni-28c0dc88-27b7-8e32-a284-93e98c2183d7" May 27 17:05:50.551463 containerd[1514]: 2025-05-27 17:05:50.415 [INFO][5260] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" after=10.858521ms iface="eth0" netns="/var/run/netns/cni-28c0dc88-27b7-8e32-a284-93e98c2183d7" May 27 17:05:50.551463 containerd[1514]: 2025-05-27 17:05:50.415 [INFO][5260] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" May 27 17:05:50.551463 containerd[1514]: 2025-05-27 17:05:50.415 [INFO][5260] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" May 27 17:05:50.551463 containerd[1514]: 2025-05-27 17:05:50.473 [INFO][5290] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" HandleID="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:50.551463 containerd[1514]: 2025-05-27 17:05:50.474 [INFO][5290] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:05:50.551463 containerd[1514]: 2025-05-27 17:05:50.474 [INFO][5290] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:05:50.551463 containerd[1514]: 2025-05-27 17:05:50.539 [INFO][5290] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" HandleID="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:50.551463 containerd[1514]: 2025-05-27 17:05:50.540 [INFO][5290] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" HandleID="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:50.551463 containerd[1514]: 2025-05-27 17:05:50.544 [INFO][5290] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:05:50.551463 containerd[1514]: 2025-05-27 17:05:50.547 [INFO][5260] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" May 27 17:05:50.552253 containerd[1514]: time="2025-05-27T17:05:50.552157832Z" level=info msg="TearDown network for sandbox \"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d\" successfully" May 27 17:05:50.552253 containerd[1514]: time="2025-05-27T17:05:50.552198273Z" level=info msg="StopPodSandbox for \"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d\" returns successfully" May 27 17:05:50.624683 kubelet[2795]: I0527 17:05:50.624592 2795 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlnvw\" (UniqueName: \"kubernetes.io/projected/2f1591b1-a33b-436b-97d2-a3ba5959f3d8-kube-api-access-qlnvw\") pod \"2f1591b1-a33b-436b-97d2-a3ba5959f3d8\" (UID: \"2f1591b1-a33b-436b-97d2-a3ba5959f3d8\") " May 27 17:05:50.624683 kubelet[2795]: I0527 17:05:50.624660 2795 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/2f1591b1-a33b-436b-97d2-a3ba5959f3d8-calico-apiserver-certs\") pod \"2f1591b1-a33b-436b-97d2-a3ba5959f3d8\" (UID: \"2f1591b1-a33b-436b-97d2-a3ba5959f3d8\") " May 27 17:05:50.630474 kubelet[2795]: I0527 17:05:50.630419 2795 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1591b1-a33b-436b-97d2-a3ba5959f3d8-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "2f1591b1-a33b-436b-97d2-a3ba5959f3d8" (UID: "2f1591b1-a33b-436b-97d2-a3ba5959f3d8"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 27 17:05:50.630474 kubelet[2795]: I0527 17:05:50.630420 2795 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1591b1-a33b-436b-97d2-a3ba5959f3d8-kube-api-access-qlnvw" (OuterVolumeSpecName: "kube-api-access-qlnvw") pod "2f1591b1-a33b-436b-97d2-a3ba5959f3d8" (UID: "2f1591b1-a33b-436b-97d2-a3ba5959f3d8"). InnerVolumeSpecName "kube-api-access-qlnvw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" May 27 17:05:50.726041 kubelet[2795]: I0527 17:05:50.725972 2795 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qlnvw\" (UniqueName: \"kubernetes.io/projected/2f1591b1-a33b-436b-97d2-a3ba5959f3d8-kube-api-access-qlnvw\") on node \"ci-4344-0-0-0-39ed1690e8\" DevicePath \"\"" May 27 17:05:50.726041 kubelet[2795]: I0527 17:05:50.726014 2795 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/2f1591b1-a33b-436b-97d2-a3ba5959f3d8-calico-apiserver-certs\") on node \"ci-4344-0-0-0-39ed1690e8\" DevicePath \"\"" May 27 17:05:50.789073 kubelet[2795]: E0527 17:05:50.788075 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:05:50.796888 systemd[1]: Removed slice kubepods-besteffort-pod2f1591b1_a33b_436b_97d2_a3ba5959f3d8.slice - libcontainer container kubepods-besteffort-pod2f1591b1_a33b_436b_97d2_a3ba5959f3d8.slice. May 27 17:05:50.797014 systemd[1]: kubepods-besteffort-pod2f1591b1_a33b_436b_97d2_a3ba5959f3d8.slice: Consumed 233ms CPU time, 14.7M memory peak, 1.5M read from disk. May 27 17:05:50.805890 kubelet[2795]: I0527 17:05:50.805630 2795 scope.go:117] "RemoveContainer" containerID="695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba" May 27 17:05:50.811518 containerd[1514]: time="2025-05-27T17:05:50.811478131Z" level=info msg="RemoveContainer for \"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\"" May 27 17:05:50.824542 kubelet[2795]: I0527 17:05:50.823409 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5f784fdc78-tldmp" podStartSLOduration=1.8233900429999998 podStartE2EDuration="1.823390043s" podCreationTimestamp="2025-05-27 17:05:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:05:50.793417678 +0000 UTC m=+59.567151127" watchObservedRunningTime="2025-05-27 17:05:50.823390043 +0000 UTC m=+59.597123492" May 27 17:05:50.842612 containerd[1514]: time="2025-05-27T17:05:50.842574690Z" level=info msg="RemoveContainer for \"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\" returns successfully" May 27 17:05:50.847209 kubelet[2795]: I0527 17:05:50.847174 2795 scope.go:117] "RemoveContainer" containerID="695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba" May 27 17:05:50.847891 containerd[1514]: time="2025-05-27T17:05:50.847849126Z" level=error msg="ContainerStatus for \"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\": not found" May 27 17:05:50.848379 kubelet[2795]: E0527 17:05:50.848292 2795 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\": not found" containerID="695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba" May 27 17:05:50.848379 kubelet[2795]: I0527 17:05:50.848337 2795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba"} err="failed to get container status \"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\": rpc error: code = NotFound desc = an error occurred when try to find container \"695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba\": not found" May 27 17:05:50.853738 kubelet[2795]: I0527 17:05:50.853670 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6c66494f6d-72shk" podStartSLOduration=31.942071584 podStartE2EDuration="39.853637857s" podCreationTimestamp="2025-05-27 17:05:11 +0000 UTC" firstStartedPulling="2025-05-27 17:05:41.907562033 +0000 UTC m=+50.681295442" lastFinishedPulling="2025-05-27 17:05:49.819128266 +0000 UTC m=+58.592861715" observedRunningTime="2025-05-27 17:05:50.826025641 +0000 UTC m=+59.599759090" watchObservedRunningTime="2025-05-27 17:05:50.853637857 +0000 UTC m=+59.627371306" May 27 17:05:51.022523 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d-shm.mount: Deactivated successfully. May 27 17:05:51.022646 systemd[1]: run-netns-cni\x2d28c0dc88\x2d27b7\x2d8e32\x2da284\x2d93e98c2183d7.mount: Deactivated successfully. May 27 17:05:51.022722 systemd[1]: var-lib-kubelet-pods-2f1591b1\x2da33b\x2d436b\x2d97d2\x2da3ba5959f3d8-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dqlnvw.mount: Deactivated successfully. May 27 17:05:51.022788 systemd[1]: var-lib-kubelet-pods-2f1591b1\x2da33b\x2d436b\x2d97d2\x2da3ba5959f3d8-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. May 27 17:05:51.381444 kubelet[2795]: I0527 17:05:51.380546 2795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1591b1-a33b-436b-97d2-a3ba5959f3d8" path="/var/lib/kubelet/pods/2f1591b1-a33b-436b-97d2-a3ba5959f3d8/volumes" May 27 17:05:51.383567 containerd[1514]: time="2025-05-27T17:05:51.383493231Z" level=info msg="StopPodSandbox for \"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d\"" May 27 17:05:51.562005 containerd[1514]: 2025-05-27 17:05:51.489 [WARNING][5328] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:51.562005 containerd[1514]: 2025-05-27 17:05:51.489 [INFO][5328] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" May 27 17:05:51.562005 containerd[1514]: 2025-05-27 17:05:51.489 [INFO][5328] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" iface="eth0" netns="" May 27 17:05:51.562005 containerd[1514]: 2025-05-27 17:05:51.489 [INFO][5328] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" May 27 17:05:51.562005 containerd[1514]: 2025-05-27 17:05:51.489 [INFO][5328] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" May 27 17:05:51.562005 containerd[1514]: 2025-05-27 17:05:51.537 [INFO][5335] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" HandleID="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:51.562005 containerd[1514]: 2025-05-27 17:05:51.537 [INFO][5335] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:05:51.562005 containerd[1514]: 2025-05-27 17:05:51.537 [INFO][5335] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:05:51.562005 containerd[1514]: 2025-05-27 17:05:51.552 [WARNING][5335] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" HandleID="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:51.562005 containerd[1514]: 2025-05-27 17:05:51.553 [INFO][5335] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" HandleID="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:51.562005 containerd[1514]: 2025-05-27 17:05:51.555 [INFO][5335] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:05:51.562005 containerd[1514]: 2025-05-27 17:05:51.559 [INFO][5328] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" May 27 17:05:51.563107 containerd[1514]: time="2025-05-27T17:05:51.562654930Z" level=info msg="TearDown network for sandbox \"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d\" successfully" May 27 17:05:51.563107 containerd[1514]: time="2025-05-27T17:05:51.562728612Z" level=info msg="StopPodSandbox for \"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d\" returns successfully" May 27 17:05:51.564401 containerd[1514]: time="2025-05-27T17:05:51.564287497Z" level=info msg="RemovePodSandbox for \"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d\"" May 27 17:05:51.564401 containerd[1514]: time="2025-05-27T17:05:51.564330298Z" level=info msg="Forcibly stopping sandbox \"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d\"" May 27 17:05:51.649504 systemd-networkd[1427]: cali067952100d5: Gained IPv6LL May 27 17:05:51.692061 containerd[1514]: 2025-05-27 17:05:51.623 [WARNING][5349] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:51.692061 containerd[1514]: 2025-05-27 17:05:51.623 [INFO][5349] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" May 27 17:05:51.692061 containerd[1514]: 2025-05-27 17:05:51.623 [INFO][5349] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" iface="eth0" netns="" May 27 17:05:51.692061 containerd[1514]: 2025-05-27 17:05:51.623 [INFO][5349] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" May 27 17:05:51.692061 containerd[1514]: 2025-05-27 17:05:51.623 [INFO][5349] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" May 27 17:05:51.692061 containerd[1514]: 2025-05-27 17:05:51.658 [INFO][5357] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" HandleID="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:51.692061 containerd[1514]: 2025-05-27 17:05:51.658 [INFO][5357] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:05:51.692061 containerd[1514]: 2025-05-27 17:05:51.659 [INFO][5357] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:05:51.692061 containerd[1514]: 2025-05-27 17:05:51.670 [WARNING][5357] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" HandleID="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:51.692061 containerd[1514]: 2025-05-27 17:05:51.670 [INFO][5357] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" HandleID="k8s-pod-network.37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--gc4dc-eth0" May 27 17:05:51.692061 containerd[1514]: 2025-05-27 17:05:51.674 [INFO][5357] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:05:51.692061 containerd[1514]: 2025-05-27 17:05:51.685 [INFO][5349] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d" May 27 17:05:51.693839 containerd[1514]: time="2025-05-27T17:05:51.693426779Z" level=info msg="TearDown network for sandbox \"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d\" successfully" May 27 17:05:51.698286 containerd[1514]: time="2025-05-27T17:05:51.698210878Z" level=info msg="Ensure that sandbox 37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d in task-service has been cleanup successfully" May 27 17:05:51.702453 containerd[1514]: time="2025-05-27T17:05:51.702334518Z" level=info msg="RemovePodSandbox \"37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d\" returns successfully" May 27 17:05:51.780219 kubelet[2795]: I0527 17:05:51.779924 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:05:52.424763 containerd[1514]: time="2025-05-27T17:05:52.424186815Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:52.428241 containerd[1514]: time="2025-05-27T17:05:52.428192370Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0: active requests=0, bytes read=13749925" May 27 17:05:52.432322 containerd[1514]: time="2025-05-27T17:05:52.432250927Z" level=info msg="ImageCreate event name:\"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:52.449400 containerd[1514]: time="2025-05-27T17:05:52.448971727Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:05:52.449996 containerd[1514]: time="2025-05-27T17:05:52.449655507Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" with image id \"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\", size \"15119118\" in 2.032851393s" May 27 17:05:52.449996 containerd[1514]: time="2025-05-27T17:05:52.449733029Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" returns image reference \"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\"" May 27 
17:05:52.452514 containerd[1514]: time="2025-05-27T17:05:52.452471068Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:05:52.460441 containerd[1514]: time="2025-05-27T17:05:52.460239171Z" level=info msg="CreateContainer within sandbox \"becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 27 17:05:52.476604 containerd[1514]: time="2025-05-27T17:05:52.476559160Z" level=info msg="Container cfe34f59449361685a0a77e9b68f936745f36ae5c7172717a4ecf528d636880f: CDI devices from CRI Config.CDIDevices: []" May 27 17:05:52.497129 containerd[1514]: time="2025-05-27T17:05:52.497009067Z" level=info msg="CreateContainer within sandbox \"becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"cfe34f59449361685a0a77e9b68f936745f36ae5c7172717a4ecf528d636880f\"" May 27 17:05:52.498392 containerd[1514]: time="2025-05-27T17:05:52.497844851Z" level=info msg="StartContainer for \"cfe34f59449361685a0a77e9b68f936745f36ae5c7172717a4ecf528d636880f\"" May 27 17:05:52.501733 containerd[1514]: time="2025-05-27T17:05:52.501106505Z" level=info msg="connecting to shim cfe34f59449361685a0a77e9b68f936745f36ae5c7172717a4ecf528d636880f" address="unix:///run/containerd/s/a57bc0442eb34101f399260daa537eeba327f6921f02372db676a2bfdb06a5af" protocol=ttrpc version=3 May 27 17:05:52.525560 systemd[1]: Started cri-containerd-cfe34f59449361685a0a77e9b68f936745f36ae5c7172717a4ecf528d636880f.scope - libcontainer container cfe34f59449361685a0a77e9b68f936745f36ae5c7172717a4ecf528d636880f. May 27 17:05:52.626672 containerd[1514]: time="2025-05-27T17:05:52.626628991Z" level=info msg="StartContainer for \"cfe34f59449361685a0a77e9b68f936745f36ae5c7172717a4ecf528d636880f\" returns successfully" May 27 17:05:52.707749 containerd[1514]: time="2025-05-27T17:05:52.707597357Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:05:52.723625 containerd[1514]: time="2025-05-27T17:05:52.723551056Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:05:52.724158 containerd[1514]: time="2025-05-27T17:05:52.724124952Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:05:52.724442 kubelet[2795]: E0527 17:05:52.724327 2795 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" 
image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:05:52.724621 kubelet[2795]: E0527 17:05:52.724486 2795 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:05:52.726525 kubelet[2795]: E0527 17:05:52.726441 2795 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7px9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78c6556c4f-rjlzx_calico-system(432a2b34-eaf5-4f72-a2b6-f15f78b36b83): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:05:52.728571 kubelet[2795]: E0527 17:05:52.728494 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:05:52.809028 kubelet[2795]: I0527 17:05:52.807627 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-sx6tt" podStartSLOduration=24.393724074 podStartE2EDuration="36.80761023s" podCreationTimestamp="2025-05-27 17:05:16 +0000 UTC" firstStartedPulling="2025-05-27 17:05:40.038413507 +0000 UTC m=+48.812146956" lastFinishedPulling="2025-05-27 17:05:52.452299623 +0000 UTC m=+61.226033112" observedRunningTime="2025-05-27 17:05:52.806569641 +0000 UTC m=+61.580303130" watchObservedRunningTime="2025-05-27 17:05:52.80761023 +0000 UTC m=+61.581343719" May 27 17:05:53.488219 kubelet[2795]: I0527 17:05:53.488099 2795 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 27 17:05:53.494068 kubelet[2795]: I0527 17:05:53.494004 2795 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 27 17:06:04.376055 containerd[1514]: time="2025-05-27T17:06:04.375042802Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:06:04.634067 containerd[1514]: time="2025-05-27T17:06:04.633510543Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:06:04.635137 containerd[1514]: time="2025-05-27T17:06:04.634831935Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:06:04.635137 containerd[1514]: time="2025-05-27T17:06:04.634930218Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:06:04.635679 kubelet[2795]: E0527 17:06:04.635156 2795 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:06:04.635679 kubelet[2795]: E0527 17:06:04.635275 2795 
kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:06:04.637553 kubelet[2795]: E0527 17:06:04.636729 2795 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2mqjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-lflfc_calico-system(45d42809-0456-4761-94f0-815274f2dcfd): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:06:04.638273 kubelet[2795]: E0527 17:06:04.638219 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:06:05.376124 kubelet[2795]: E0527 17:06:05.375910 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:06:07.500533 containerd[1514]: time="2025-05-27T17:06:07.500489974Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" id:\"f88e1b1f0e8955e9957f836f03848042ffd8041256d413c0998f92a69afc4338\" pid:5433 exited_at:{seconds:1748365567 nanos:499945161}" May 27 17:06:07.848072 containerd[1514]: time="2025-05-27T17:06:07.847978029Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b\" id:\"ce939af839c88cbb103d1e2cacab54cf551ed0064364c3465ed14aeb2eebe0b4\" pid:5455 exited_at:{seconds:1748365567 nanos:847696622}" May 27 17:06:14.014268 kubelet[2795]: I0527 17:06:14.013500 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:06:14.178729 containerd[1514]: time="2025-05-27T17:06:14.178692771Z" level=info msg="StopContainer for \"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\" with timeout 30 (s)" May 27 17:06:14.179597 containerd[1514]: time="2025-05-27T17:06:14.179564309Z" level=info msg="Stop container \"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\" with signal terminated" May 27 17:06:14.242800 systemd[1]: cri-containerd-1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122.scope: Deactivated successfully. 
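
Every pull failure logged above (goldmane, whisker, whisker-backend) ends in the same 403 from the ghcr.io anonymous-token endpoint, which points at the registry rejecting the request rather than anything on the node. A minimal sketch, not part of the log, that replays that token request by hand; the URL is copied verbatim from the "fetch failed" entries above, and getting a 403 here reproduces the failure outside the kubelet/containerd pull path:

    package main

    import (
        "fmt"
        "io"
        "net/http"
    )

    func main() {
        // URL taken verbatim from the containerd "fetch failed" entries above.
        url := "https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io"
        resp, err := http.Get(url)
        if err != nil {
            fmt.Println("request error:", err)
            return
        }
        defer resp.Body.Close()
        body, _ := io.ReadAll(resp.Body)
        // containerd saw "403 Forbidden" here; any other status would suggest the
        // registry-side rejection was transient.
        fmt.Println(resp.Status)
        fmt.Println(string(body))
    }
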
May 27 17:06:14.243101 systemd[1]: cri-containerd-1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122.scope: Consumed 1.287s CPU time, 44.2M memory peak. May 27 17:06:14.249791 containerd[1514]: time="2025-05-27T17:06:14.249606563Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\" id:\"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\" pid:5146 exit_status:1 exited_at:{seconds:1748365574 nanos:249151793}" May 27 17:06:14.250383 containerd[1514]: time="2025-05-27T17:06:14.250336499Z" level=info msg="received exit event container_id:\"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\" id:\"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\" pid:5146 exit_status:1 exited_at:{seconds:1748365574 nanos:249151793}" May 27 17:06:14.282303 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122-rootfs.mount: Deactivated successfully. May 27 17:06:14.321516 containerd[1514]: time="2025-05-27T17:06:14.321456616Z" level=info msg="StopContainer for \"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\" returns successfully" May 27 17:06:14.324893 containerd[1514]: time="2025-05-27T17:06:14.324783567Z" level=info msg="StopPodSandbox for \"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa\"" May 27 17:06:14.324893 containerd[1514]: time="2025-05-27T17:06:14.324865728Z" level=info msg="Container to stop \"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 27 17:06:14.332085 systemd[1]: cri-containerd-0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa.scope: Deactivated successfully. May 27 17:06:14.343045 containerd[1514]: time="2025-05-27T17:06:14.342937234Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa\" id:\"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa\" pid:4788 exit_status:137 exited_at:{seconds:1748365574 nanos:342350421}" May 27 17:06:14.373719 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa-rootfs.mount: Deactivated successfully. May 27 17:06:14.377736 containerd[1514]: time="2025-05-27T17:06:14.377598493Z" level=info msg="shim disconnected" id=0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa namespace=k8s.io May 27 17:06:14.377736 containerd[1514]: time="2025-05-27T17:06:14.377640134Z" level=warning msg="cleaning up after shim disconnected" id=0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa namespace=k8s.io May 27 17:06:14.377736 containerd[1514]: time="2025-05-27T17:06:14.377688695Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 27 17:06:14.398479 containerd[1514]: time="2025-05-27T17:06:14.398405297Z" level=info msg="received exit event sandbox_id:\"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa\" exit_status:137 exited_at:{seconds:1748365574 nanos:342350421}" May 27 17:06:14.403605 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa-shm.mount: Deactivated successfully. 
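
The TaskExit and exit events above report exit_status:137 for sandbox 0ff48fb5…; by the usual 128-plus-signal convention that means the sandbox's pause process was terminated by SIGKILL (9) during teardown rather than exiting on its own. A small illustrative decoder for that field (an aid to reading the log, not something run on this host):

    package main

    import "fmt"

    func main() {
        // Value taken from the exit_status:137 fields in the TaskExit events above.
        exitStatus := 137
        if exitStatus > 128 {
            // 137 - 128 = 9 = SIGKILL: the process was force-killed.
            fmt.Printf("terminated by signal %d\n", exitStatus-128)
        } else {
            fmt.Printf("exited normally with code %d\n", exitStatus)
        }
    }
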
May 27 17:06:14.463298 systemd-networkd[1427]: cali447812e9c91: Link DOWN May 27 17:06:14.463305 systemd-networkd[1427]: cali447812e9c91: Lost carrier May 27 17:06:14.616813 containerd[1514]: 2025-05-27 17:06:14.461 [INFO][5543] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" May 27 17:06:14.616813 containerd[1514]: 2025-05-27 17:06:14.461 [INFO][5543] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" iface="eth0" netns="/var/run/netns/cni-754e6aa3-3016-de41-8ef1-35270ca70aec" May 27 17:06:14.616813 containerd[1514]: 2025-05-27 17:06:14.461 [INFO][5543] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" iface="eth0" netns="/var/run/netns/cni-754e6aa3-3016-de41-8ef1-35270ca70aec" May 27 17:06:14.616813 containerd[1514]: 2025-05-27 17:06:14.471 [INFO][5543] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" after=10.181897ms iface="eth0" netns="/var/run/netns/cni-754e6aa3-3016-de41-8ef1-35270ca70aec" May 27 17:06:14.616813 containerd[1514]: 2025-05-27 17:06:14.471 [INFO][5543] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" May 27 17:06:14.616813 containerd[1514]: 2025-05-27 17:06:14.471 [INFO][5543] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" May 27 17:06:14.616813 containerd[1514]: 2025-05-27 17:06:14.499 [INFO][5552] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" HandleID="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:06:14.616813 containerd[1514]: 2025-05-27 17:06:14.499 [INFO][5552] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:06:14.616813 containerd[1514]: 2025-05-27 17:06:14.499 [INFO][5552] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:06:14.616813 containerd[1514]: 2025-05-27 17:06:14.609 [INFO][5552] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" HandleID="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:06:14.616813 containerd[1514]: 2025-05-27 17:06:14.609 [INFO][5552] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" HandleID="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:06:14.616813 containerd[1514]: 2025-05-27 17:06:14.612 [INFO][5552] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:06:14.616813 containerd[1514]: 2025-05-27 17:06:14.615 [INFO][5543] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" May 27 17:06:14.619319 systemd[1]: run-netns-cni\x2d754e6aa3\x2d3016\x2dde41\x2d8ef1\x2d35270ca70aec.mount: Deactivated successfully. May 27 17:06:14.620230 containerd[1514]: time="2025-05-27T17:06:14.620191387Z" level=info msg="TearDown network for sandbox \"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa\" successfully" May 27 17:06:14.620453 containerd[1514]: time="2025-05-27T17:06:14.620324630Z" level=info msg="StopPodSandbox for \"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa\" returns successfully" May 27 17:06:14.715686 kubelet[2795]: I0527 17:06:14.715635 2795 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f87305a5-4340-4e54-9029-d480976de92f-calico-apiserver-certs\") pod \"f87305a5-4340-4e54-9029-d480976de92f\" (UID: \"f87305a5-4340-4e54-9029-d480976de92f\") " May 27 17:06:14.716339 kubelet[2795]: I0527 17:06:14.715960 2795 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgn2d\" (UniqueName: \"kubernetes.io/projected/f87305a5-4340-4e54-9029-d480976de92f-kube-api-access-bgn2d\") pod \"f87305a5-4340-4e54-9029-d480976de92f\" (UID: \"f87305a5-4340-4e54-9029-d480976de92f\") " May 27 17:06:14.722826 systemd[1]: var-lib-kubelet-pods-f87305a5\x2d4340\x2d4e54\x2d9029\x2dd480976de92f-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dbgn2d.mount: Deactivated successfully. May 27 17:06:14.726757 kubelet[2795]: I0527 17:06:14.726708 2795 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f87305a5-4340-4e54-9029-d480976de92f-kube-api-access-bgn2d" (OuterVolumeSpecName: "kube-api-access-bgn2d") pod "f87305a5-4340-4e54-9029-d480976de92f" (UID: "f87305a5-4340-4e54-9029-d480976de92f"). InnerVolumeSpecName "kube-api-access-bgn2d". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 27 17:06:14.727529 kubelet[2795]: I0527 17:06:14.727487 2795 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f87305a5-4340-4e54-9029-d480976de92f-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "f87305a5-4340-4e54-9029-d480976de92f" (UID: "f87305a5-4340-4e54-9029-d480976de92f"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 27 17:06:14.817335 kubelet[2795]: I0527 17:06:14.817280 2795 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f87305a5-4340-4e54-9029-d480976de92f-calico-apiserver-certs\") on node \"ci-4344-0-0-0-39ed1690e8\" DevicePath \"\"" May 27 17:06:14.817618 kubelet[2795]: I0527 17:06:14.817587 2795 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bgn2d\" (UniqueName: \"kubernetes.io/projected/f87305a5-4340-4e54-9029-d480976de92f-kube-api-access-bgn2d\") on node \"ci-4344-0-0-0-39ed1690e8\" DevicePath \"\"" May 27 17:06:14.862415 kubelet[2795]: I0527 17:06:14.862097 2795 scope.go:117] "RemoveContainer" containerID="1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122" May 27 17:06:14.869078 systemd[1]: Removed slice kubepods-besteffort-podf87305a5_4340_4e54_9029_d480976de92f.slice - libcontainer container kubepods-besteffort-podf87305a5_4340_4e54_9029_d480976de92f.slice. 
May 27 17:06:14.869196 systemd[1]: kubepods-besteffort-podf87305a5_4340_4e54_9029_d480976de92f.slice: Consumed 1.309s CPU time, 44.4M memory peak. May 27 17:06:14.870942 containerd[1514]: time="2025-05-27T17:06:14.870421404Z" level=info msg="RemoveContainer for \"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\"" May 27 17:06:14.877969 containerd[1514]: time="2025-05-27T17:06:14.877910964Z" level=info msg="RemoveContainer for \"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\" returns successfully" May 27 17:06:14.879892 kubelet[2795]: I0527 17:06:14.879841 2795 scope.go:117] "RemoveContainer" containerID="1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122" May 27 17:06:14.880897 containerd[1514]: time="2025-05-27T17:06:14.880655862Z" level=error msg="ContainerStatus for \"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\": not found" May 27 17:06:14.881314 kubelet[2795]: E0527 17:06:14.881246 2795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\": not found" containerID="1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122" May 27 17:06:14.881314 kubelet[2795]: I0527 17:06:14.881289 2795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122"} err="failed to get container status \"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\": rpc error: code = NotFound desc = an error occurred when try to find container \"1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122\": not found" May 27 17:06:15.283553 systemd[1]: var-lib-kubelet-pods-f87305a5\x2d4340\x2d4e54\x2d9029\x2dd480976de92f-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. 
May 27 17:06:15.373151 kubelet[2795]: E0527 17:06:15.373085 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:06:15.375977 kubelet[2795]: I0527 17:06:15.375916 2795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f87305a5-4340-4e54-9029-d480976de92f" path="/var/lib/kubelet/pods/f87305a5-4340-4e54-9029-d480976de92f/volumes" May 27 17:06:17.943692 containerd[1514]: time="2025-05-27T17:06:17.943565219Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" id:\"2d50d97fb300af975d1d0dabc42e198ed156f18b612e4c9c99d711c8b0800ab4\" pid:5588 exited_at:{seconds:1748365577 nanos:942967167}" May 27 17:06:19.375089 containerd[1514]: time="2025-05-27T17:06:19.374678178Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:06:19.605671 containerd[1514]: time="2025-05-27T17:06:19.605349066Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:06:19.607229 containerd[1514]: time="2025-05-27T17:06:19.607059140Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:06:19.607229 containerd[1514]: time="2025-05-27T17:06:19.607085021Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:06:19.609489 kubelet[2795]: E0527 17:06:19.609429 2795 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:06:19.610168 kubelet[2795]: E0527 17:06:19.609507 2795 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" 
image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:06:19.610168 kubelet[2795]: E0527 17:06:19.609632 2795 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:46cc42382a97413a9376060056363be4,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w7px9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78c6556c4f-rjlzx_calico-system(432a2b34-eaf5-4f72-a2b6-f15f78b36b83): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:06:19.612652 containerd[1514]: time="2025-05-27T17:06:19.612604771Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:06:20.175321 containerd[1514]: time="2025-05-27T17:06:20.175068601Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:06:20.176977 containerd[1514]: time="2025-05-27T17:06:20.176667473Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:06:20.176977 containerd[1514]: time="2025-05-27T17:06:20.176701593Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:06:20.177737 kubelet[2795]: E0527 17:06:20.177082 2795 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack 
image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:06:20.177737 kubelet[2795]: E0527 17:06:20.177126 2795 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:06:20.177737 kubelet[2795]: E0527 17:06:20.177247 2795 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7px9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78c6556c4f-rjlzx_calico-system(432a2b34-eaf5-4f72-a2b6-f15f78b36b83): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:06:20.178711 kubelet[2795]: E0527 17:06:20.178634 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with 
ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:06:30.373932 containerd[1514]: time="2025-05-27T17:06:30.373864557Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:06:30.623647 containerd[1514]: time="2025-05-27T17:06:30.623591529Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:06:30.625197 containerd[1514]: time="2025-05-27T17:06:30.625045874Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:06:30.625197 containerd[1514]: time="2025-05-27T17:06:30.625139316Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:06:30.625723 kubelet[2795]: E0527 17:06:30.625492 2795 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:06:30.625723 kubelet[2795]: E0527 17:06:30.625557 2795 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:06:30.627888 kubelet[2795]: E0527 17:06:30.627794 2795 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2mqjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-lflfc_calico-system(45d42809-0456-4761-94f0-815274f2dcfd): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:06:30.629387 kubelet[2795]: E0527 17:06:30.629007 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:06:34.374149 kubelet[2795]: E0527 17:06:34.373789 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:06:37.886939 containerd[1514]: time="2025-05-27T17:06:37.886850964Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b\" id:\"17107c2df88d33d8e64967ceb99060abc197932afb918e4513e93a8323341000\" pid:5619 exited_at:{seconds:1748365597 nanos:886192913}" May 27 17:06:42.372683 kubelet[2795]: E0527 17:06:42.372614 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:06:46.378278 kubelet[2795]: E0527 17:06:46.378171 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: 
failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:06:47.771383 containerd[1514]: time="2025-05-27T17:06:47.771021391Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" id:\"3106b29868ecdb743424a5366e0a409f789193b3014ca69279e5075ea83f5d86\" pid:5645 exited_at:{seconds:1748365607 nanos:770174699}" May 27 17:06:51.710535 containerd[1514]: time="2025-05-27T17:06:51.710483192Z" level=info msg="StopPodSandbox for \"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa\"" May 27 17:06:51.805675 containerd[1514]: 2025-05-27 17:06:51.754 [WARNING][5664] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:06:51.805675 containerd[1514]: 2025-05-27 17:06:51.754 [INFO][5664] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" May 27 17:06:51.805675 containerd[1514]: 2025-05-27 17:06:51.754 [INFO][5664] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" iface="eth0" netns="" May 27 17:06:51.805675 containerd[1514]: 2025-05-27 17:06:51.754 [INFO][5664] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" May 27 17:06:51.805675 containerd[1514]: 2025-05-27 17:06:51.754 [INFO][5664] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" May 27 17:06:51.805675 containerd[1514]: 2025-05-27 17:06:51.789 [INFO][5671] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" HandleID="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:06:51.805675 containerd[1514]: 2025-05-27 17:06:51.789 [INFO][5671] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:06:51.805675 containerd[1514]: 2025-05-27 17:06:51.789 [INFO][5671] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:06:51.805675 containerd[1514]: 2025-05-27 17:06:51.799 [WARNING][5671] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" HandleID="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:06:51.805675 containerd[1514]: 2025-05-27 17:06:51.799 [INFO][5671] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" HandleID="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:06:51.805675 containerd[1514]: 2025-05-27 17:06:51.802 [INFO][5671] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:06:51.805675 containerd[1514]: 2025-05-27 17:06:51.803 [INFO][5664] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" May 27 17:06:51.806716 containerd[1514]: time="2025-05-27T17:06:51.805733032Z" level=info msg="TearDown network for sandbox \"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa\" successfully" May 27 17:06:51.806716 containerd[1514]: time="2025-05-27T17:06:51.805760353Z" level=info msg="StopPodSandbox for \"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa\" returns successfully" May 27 17:06:51.806716 containerd[1514]: time="2025-05-27T17:06:51.806308800Z" level=info msg="RemovePodSandbox for \"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa\"" May 27 17:06:51.806716 containerd[1514]: time="2025-05-27T17:06:51.806347921Z" level=info msg="Forcibly stopping sandbox \"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa\"" May 27 17:06:51.907118 containerd[1514]: 2025-05-27 17:06:51.862 [WARNING][5685] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" WorkloadEndpoint="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:06:51.907118 containerd[1514]: 2025-05-27 17:06:51.863 [INFO][5685] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" May 27 17:06:51.907118 containerd[1514]: 2025-05-27 17:06:51.863 [INFO][5685] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" iface="eth0" netns="" May 27 17:06:51.907118 containerd[1514]: 2025-05-27 17:06:51.863 [INFO][5685] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" May 27 17:06:51.907118 containerd[1514]: 2025-05-27 17:06:51.863 [INFO][5685] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" May 27 17:06:51.907118 containerd[1514]: 2025-05-27 17:06:51.884 [INFO][5692] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" HandleID="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:06:51.907118 containerd[1514]: 2025-05-27 17:06:51.884 [INFO][5692] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:06:51.907118 containerd[1514]: 2025-05-27 17:06:51.884 [INFO][5692] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:06:51.907118 containerd[1514]: 2025-05-27 17:06:51.900 [WARNING][5692] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" HandleID="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:06:51.907118 containerd[1514]: 2025-05-27 17:06:51.900 [INFO][5692] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" HandleID="k8s-pod-network.0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" Workload="ci--4344--0--0--0--39ed1690e8-k8s-calico--apiserver--6c66494f6d--72shk-eth0" May 27 17:06:51.907118 containerd[1514]: 2025-05-27 17:06:51.903 [INFO][5692] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:06:51.907118 containerd[1514]: 2025-05-27 17:06:51.905 [INFO][5685] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa" May 27 17:06:51.908415 containerd[1514]: time="2025-05-27T17:06:51.907616721Z" level=info msg="TearDown network for sandbox \"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa\" successfully" May 27 17:06:51.909476 containerd[1514]: time="2025-05-27T17:06:51.909444306Z" level=info msg="Ensure that sandbox 0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa in task-service has been cleanup successfully" May 27 17:06:51.913826 containerd[1514]: time="2025-05-27T17:06:51.913784444Z" level=info msg="RemovePodSandbox \"0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa\" returns successfully" May 27 17:06:53.373272 kubelet[2795]: E0527 17:06:53.373187 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:07:01.374911 containerd[1514]: time="2025-05-27T17:07:01.374292963Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:07:01.644997 containerd[1514]: time="2025-05-27T17:07:01.644547721Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:07:01.646544 containerd[1514]: time="2025-05-27T17:07:01.646475264Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:07:01.646837 containerd[1514]: time="2025-05-27T17:07:01.646487504Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:07:01.646919 kubelet[2795]: E0527 17:07:01.646838 2795 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:07:01.647773 kubelet[2795]: E0527 17:07:01.646917 2795 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET 
request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:07:01.647773 kubelet[2795]: E0527 17:07:01.647089 2795 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:46cc42382a97413a9376060056363be4,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w7px9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78c6556c4f-rjlzx_calico-system(432a2b34-eaf5-4f72-a2b6-f15f78b36b83): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:07:01.650135 containerd[1514]: time="2025-05-27T17:07:01.650104028Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:07:01.877927 containerd[1514]: time="2025-05-27T17:07:01.877857636Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:07:01.880096 containerd[1514]: time="2025-05-27T17:07:01.880031663Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:07:01.880267 containerd[1514]: time="2025-05-27T17:07:01.880157104Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:07:01.880641 kubelet[2795]: E0527 17:07:01.880482 
2795 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:07:01.880641 kubelet[2795]: E0527 17:07:01.880555 2795 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:07:01.880901 kubelet[2795]: E0527 17:07:01.880770 2795 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7px9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78c6556c4f-rjlzx_calico-system(432a2b34-eaf5-4f72-a2b6-f15f78b36b83): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:07:01.882477 kubelet[2795]: E0527 
17:07:01.882424 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:07:07.375588 kubelet[2795]: E0527 17:07:07.375506 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:07:07.500218 containerd[1514]: time="2025-05-27T17:07:07.500159184Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" id:\"3f0c180beb4450351be81efde749f36e14bed927d5838bdb89c653589065cefd\" pid:5722 exited_at:{seconds:1748365627 nanos:499626818}" May 27 17:07:07.844341 containerd[1514]: time="2025-05-27T17:07:07.844295442Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b\" id:\"e6f9649913a754fb752f8ae736501286d64842cfb77c0e7a8aecd60cbf68972d\" pid:5743 exited_at:{seconds:1748365627 nanos:843589275}" May 27 17:07:13.375726 kubelet[2795]: E0527 17:07:13.375647 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:07:17.766513 containerd[1514]: time="2025-05-27T17:07:17.766424327Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" id:\"14003442aadfea52a5af224f19e4189d39f0736de1e8d3ff0eae1603c5ffb655\" pid:5789 exited_at:{seconds:1748365637 nanos:765826841}" May 27 17:07:21.382381 containerd[1514]: time="2025-05-27T17:07:21.382324258Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:07:21.614120 containerd[1514]: time="2025-05-27T17:07:21.613947779Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:07:21.618913 containerd[1514]: time="2025-05-27T17:07:21.618824587Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:07:21.618913 containerd[1514]: time="2025-05-27T17:07:21.618958868Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:07:21.619439 kubelet[2795]: E0527 17:07:21.619258 2795 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:07:21.619439 kubelet[2795]: E0527 17:07:21.619402 2795 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:07:21.630417 kubelet[2795]: E0527 17:07:21.630317 2795 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2mqjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-lflfc_calico-system(45d42809-0456-4761-94f0-815274f2dcfd): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:07:21.632003 kubelet[2795]: E0527 17:07:21.631882 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:07:27.375717 kubelet[2795]: E0527 17:07:27.375654 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:07:32.373139 kubelet[2795]: E0527 17:07:32.373059 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:07:37.845387 containerd[1514]: time="2025-05-27T17:07:37.845320891Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b\" id:\"198199a0257075aba8cacb6750cabb6c37d0c2177f651f2c7893777f35c1a998\" pid:5817 exited_at:{seconds:1748365657 nanos:844571245}" May 27 17:07:40.373233 kubelet[2795]: E0527 17:07:40.372971 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: 
failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:07:44.375612 kubelet[2795]: E0527 17:07:44.375429 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:07:47.770503 containerd[1514]: time="2025-05-27T17:07:47.770215991Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" id:\"5f2945e106167d2dd7402121269496046874a78ce973eb2f7b0a346e7204b0be\" pid:5845 exited_at:{seconds:1748365667 nanos:769294024}" May 27 17:07:55.372404 kubelet[2795]: E0527 17:07:55.372134 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:07:55.374796 kubelet[2795]: E0527 17:07:55.374177 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:08:06.373056 kubelet[2795]: E0527 17:08:06.372827 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:08:07.498147 containerd[1514]: time="2025-05-27T17:08:07.498012624Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" id:\"b785b96191335c7f76d1cbad6437eb8d18cb5fd24d3149eeb4815447b8fa36e9\" pid:5871 exited_at:{seconds:1748365687 nanos:497173898}" May 27 17:08:07.841022 containerd[1514]: time="2025-05-27T17:08:07.840976803Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b\" id:\"15fa173ed02cfc82d6675561315bf77cbec9750566a7a8f8162b81bc294fe0f0\" pid:5895 exited_at:{seconds:1748365687 nanos:840597401}" May 27 17:08:09.376015 kubelet[2795]: E0527 17:08:09.375927 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:08:17.765146 containerd[1514]: time="2025-05-27T17:08:17.765096580Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" id:\"198a577db101019515766286ba34322a0fe4d614e77b85e83d12f054120d3edb\" pid:5924 exited_at:{seconds:1748365697 nanos:764606377}" May 27 17:08:21.373352 kubelet[2795]: E0527 17:08:21.373192 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" 
podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:08:24.372754 containerd[1514]: time="2025-05-27T17:08:24.372706517Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:08:24.642241 containerd[1514]: time="2025-05-27T17:08:24.641629267Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:08:24.643541 containerd[1514]: time="2025-05-27T17:08:24.643401677Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:08:24.643541 containerd[1514]: time="2025-05-27T17:08:24.643419877Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:08:24.643766 kubelet[2795]: E0527 17:08:24.643706 2795 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:08:24.644277 kubelet[2795]: E0527 17:08:24.643767 2795 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:08:24.644277 kubelet[2795]: E0527 17:08:24.643945 2795 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:46cc42382a97413a9376060056363be4,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w7px9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78c6556c4f-rjlzx_calico-system(432a2b34-eaf5-4f72-a2b6-f15f78b36b83): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:08:24.647824 containerd[1514]: time="2025-05-27T17:08:24.647550741Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:08:24.885315 containerd[1514]: time="2025-05-27T17:08:24.885226751Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:08:24.887249 containerd[1514]: time="2025-05-27T17:08:24.887110922Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:08:24.887645 containerd[1514]: time="2025-05-27T17:08:24.887266203Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:08:24.887728 kubelet[2795]: E0527 17:08:24.887614 2795 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:08:24.887728 kubelet[2795]: E0527 17:08:24.887661 2795 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:08:24.887998 kubelet[2795]: E0527 17:08:24.887780 2795 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7px9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-78c6556c4f-rjlzx_calico-system(432a2b34-eaf5-4f72-a2b6-f15f78b36b83): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:08:24.889245 kubelet[2795]: E0527 17:08:24.889196 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:08:34.374382 kubelet[2795]: E0527 17:08:34.372600 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:08:37.880404 containerd[1514]: time="2025-05-27T17:08:37.880323777Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b\" id:\"fef51ad8b59c64de23110e882c2242594da89d7ebcb88982996bb42c17eb20a5\" pid:5948 exited_at:{seconds:1748365717 nanos:879973055}" May 27 17:08:39.377057 kubelet[2795]: E0527 17:08:39.376668 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:08:47.761312 containerd[1514]: time="2025-05-27T17:08:47.761199270Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" id:\"c2cd274be42221da56e9d6602912fb8fc9c5b6999ebc9debfa04a491a01dea81\" pid:5979 exited_at:{seconds:1748365727 nanos:760927668}" May 27 17:08:48.373190 containerd[1514]: time="2025-05-27T17:08:48.373130038Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:08:48.774351 containerd[1514]: time="2025-05-27T17:08:48.774198219Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:08:48.776045 containerd[1514]: time="2025-05-27T17:08:48.775987068Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:08:48.776221 containerd[1514]: time="2025-05-27T17:08:48.776092628Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:08:48.776966 kubelet[2795]: E0527 17:08:48.776508 2795 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:08:48.776966 kubelet[2795]: E0527 17:08:48.776594 2795 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:08:48.776966 kubelet[2795]: E0527 17:08:48.776806 2795 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2mqjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-lflfc_calico-system(45d42809-0456-4761-94f0-815274f2dcfd): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:08:48.779246 kubelet[2795]: E0527 17:08:48.778544 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:08:54.376008 kubelet[2795]: E0527 17:08:54.375847 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:09:02.371757 kubelet[2795]: E0527 17:09:02.371711 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:09:07.494372 containerd[1514]: time="2025-05-27T17:09:07.494254647Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" id:\"55acaad0e84b040153260be2142a80e870ff1e393277734bfcadd60f31c322ee\" pid:6019 exited_at:{seconds:1748365747 nanos:493806565}" May 27 17:09:07.835516 containerd[1514]: time="2025-05-27T17:09:07.835476506Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b\" id:\"8b95c03bcf62dc1794cff3ab7652fde697388d9b5dd312bef6d04c3796a00f1a\" pid:6040 exited_at:{seconds:1748365747 nanos:834769383}" May 27 17:09:08.371987 kubelet[2795]: E0527 17:09:08.371895 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to 
\"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:09:15.374209 kubelet[2795]: E0527 17:09:15.373998 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:09:17.764826 containerd[1514]: time="2025-05-27T17:09:17.764774225Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" id:\"cfa36de89c6d11dcbf43a126e40a515b2dc53fe4d1b8ea29d7d135756ec6a82a\" pid:6064 exited_at:{seconds:1748365757 nanos:764420023}" May 27 17:09:19.372638 kubelet[2795]: E0527 17:09:19.372560 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:09:30.372177 kubelet[2795]: E0527 17:09:30.372077 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:09:32.372613 kubelet[2795]: E0527 17:09:32.372458 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:09:37.841124 containerd[1514]: time="2025-05-27T17:09:37.841076144Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b\" id:\"3fb7fada016d4271d7be5724d919475891e5ba19b57d43ad0ba95a281845d9ef\" pid:6087 exited_at:{seconds:1748365777 nanos:840706022}" May 27 17:09:44.372803 kubelet[2795]: E0527 17:09:44.372729 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:09:44.473745 systemd[1]: Started sshd@7-91.99.121.210:22-139.178.89.65:46770.service - OpenSSH per-connection server daemon (139.178.89.65:46770). May 27 17:09:45.478887 sshd[6102]: Accepted publickey for core from 139.178.89.65 port 46770 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:09:45.482046 sshd-session[6102]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:09:45.488453 systemd-logind[1494]: New session 8 of user core. May 27 17:09:45.496946 systemd[1]: Started session-8.scope - Session 8 of User core. 
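Note: the ImagePullBackOff and ErrImagePull entries above all trace back to the same step: containerd's anonymous token request to ghcr.io returns 403 Forbidden before any image layer is fetched. Below is a minimal sketch for reproducing that single request outside of kubelet/containerd, assuming outbound HTTPS from the node; the scope and service query parameters are copied from the log entries above.

import urllib.error
import urllib.request

# Anonymous token endpoint containerd hits before pulling
# ghcr.io/flatcar/calico/whisker:v3.30.0 (copied from the 403 entries above).
TOKEN_URL = ("https://ghcr.io/token"
             "?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull"
             "&service=ghcr.io")

try:
    with urllib.request.urlopen(TOKEN_URL, timeout=10) as resp:
        print("status:", resp.status)            # 200 => anonymous pull token issued
except urllib.error.HTTPError as err:
    print("status:", err.code, err.reason)       # 403 Forbidden matches the log entries

If the 403 reproduces here, the denial is on the registry (or egress proxy) side rather than in the node's container runtime configuration.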
May 27 17:09:45.761583 containerd[1514]: time="2025-05-27T17:09:45.761020332Z" level=warning msg="container event discarded" container=34f35fa12a83acdc4fefa0ab71eef770667022779e525850bd9dc5b641ad4837 type=CONTAINER_CREATED_EVENT May 27 17:09:45.773788 containerd[1514]: time="2025-05-27T17:09:45.773698064Z" level=warning msg="container event discarded" container=34f35fa12a83acdc4fefa0ab71eef770667022779e525850bd9dc5b641ad4837 type=CONTAINER_STARTED_EVENT May 27 17:09:45.773788 containerd[1514]: time="2025-05-27T17:09:45.773751344Z" level=warning msg="container event discarded" container=293b0ad65b9430bfb79ada22779e8ef4e6be35bdca74ff8b4edb7fd881be57ed type=CONTAINER_CREATED_EVENT May 27 17:09:45.773788 containerd[1514]: time="2025-05-27T17:09:45.773761064Z" level=warning msg="container event discarded" container=293b0ad65b9430bfb79ada22779e8ef4e6be35bdca74ff8b4edb7fd881be57ed type=CONTAINER_STARTED_EVENT May 27 17:09:45.797711 containerd[1514]: time="2025-05-27T17:09:45.797577322Z" level=warning msg="container event discarded" container=e67ad384d7e4e7c199d132db5b8791262ed332bd2f38f52c3ffbf54e174b1273 type=CONTAINER_CREATED_EVENT May 27 17:09:45.797711 containerd[1514]: time="2025-05-27T17:09:45.797652002Z" level=warning msg="container event discarded" container=e67ad384d7e4e7c199d132db5b8791262ed332bd2f38f52c3ffbf54e174b1273 type=CONTAINER_STARTED_EVENT May 27 17:09:45.825861 containerd[1514]: time="2025-05-27T17:09:45.817968885Z" level=warning msg="container event discarded" container=bce0ecd89f2bab7566655623627711076e71a587ce96bc69cf683a2223ac8b50 type=CONTAINER_CREATED_EVENT May 27 17:09:45.844338 containerd[1514]: time="2025-05-27T17:09:45.844253913Z" level=warning msg="container event discarded" container=efc129e75fe9d5e76829c40e6dcd37da636f1986f60d809a5377a3627b29b1e0 type=CONTAINER_CREATED_EVENT May 27 17:09:45.844338 containerd[1514]: time="2025-05-27T17:09:45.844307633Z" level=warning msg="container event discarded" container=a71cf8041440ad9ab46439b42dad96fb11a7ad05464a390a2f4fc168c5209a43 type=CONTAINER_CREATED_EVENT May 27 17:09:45.943275 containerd[1514]: time="2025-05-27T17:09:45.943170599Z" level=warning msg="container event discarded" container=bce0ecd89f2bab7566655623627711076e71a587ce96bc69cf683a2223ac8b50 type=CONTAINER_STARTED_EVENT May 27 17:09:45.953573 containerd[1514]: time="2025-05-27T17:09:45.953500201Z" level=warning msg="container event discarded" container=efc129e75fe9d5e76829c40e6dcd37da636f1986f60d809a5377a3627b29b1e0 type=CONTAINER_STARTED_EVENT May 27 17:09:45.975828 containerd[1514]: time="2025-05-27T17:09:45.975743012Z" level=warning msg="container event discarded" container=a71cf8041440ad9ab46439b42dad96fb11a7ad05464a390a2f4fc168c5209a43 type=CONTAINER_STARTED_EVENT May 27 17:09:46.301400 sshd[6104]: Connection closed by 139.178.89.65 port 46770 May 27 17:09:46.302132 sshd-session[6102]: pam_unix(sshd:session): session closed for user core May 27 17:09:46.307903 systemd-logind[1494]: Session 8 logged out. Waiting for processes to exit. May 27 17:09:46.308747 systemd[1]: sshd@7-91.99.121.210:22-139.178.89.65:46770.service: Deactivated successfully. May 27 17:09:46.313711 systemd[1]: session-8.scope: Deactivated successfully. May 27 17:09:46.316905 systemd-logind[1494]: Removed session 8. 
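Note: the recurring containerd "container event discarded" warnings above can be tallied per container ID to see which sandboxes are generating undelivered events. A rough triage sketch, assuming the journal excerpt is available as plain text; the sample embeds shortened copies of entries from the log above.

import re
from collections import Counter

SAMPLE = """\
level=warning msg="container event discarded" container=34f35fa12a83acdc4fefa0ab71eef770667022779e525850bd9dc5b641ad4837 type=CONTAINER_CREATED_EVENT
level=warning msg="container event discarded" container=34f35fa12a83acdc4fefa0ab71eef770667022779e525850bd9dc5b641ad4837 type=CONTAINER_STARTED_EVENT
level=warning msg="container event discarded" container=293b0ad65b9430bfb79ada22779e8ef4e6be35bdca74ff8b4edb7fd881be57ed type=CONTAINER_CREATED_EVENT
"""

# Count discarded-event warnings per container ID.
counts = Counter(
    m.group(1)
    for line in SAMPLE.splitlines()
    if (m := re.search(r'msg="container event discarded" container=([0-9a-f]+)', line))
)

for container_id, n in counts.most_common():
    print(f"{container_id[:12]}  discarded events: {n}")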
May 27 17:09:46.374246 kubelet[2795]: E0527 17:09:46.374184 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:09:47.780733 containerd[1514]: time="2025-05-27T17:09:47.780688789Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" id:\"86934121d1c5acd923935fd0ad16f6cc6858564dbf0baf3f49b45ba6471fc3f8\" pid:6135 exited_at:{seconds:1748365787 nanos:780096346}" May 27 17:09:51.478602 systemd[1]: Started sshd@8-91.99.121.210:22-139.178.89.65:46786.service - OpenSSH per-connection server daemon (139.178.89.65:46786). May 27 17:09:52.470391 sshd[6147]: Accepted publickey for core from 139.178.89.65 port 46786 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:09:52.472754 sshd-session[6147]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:09:52.478160 systemd-logind[1494]: New session 9 of user core. May 27 17:09:52.489604 systemd[1]: Started session-9.scope - Session 9 of User core. May 27 17:09:53.248510 sshd[6150]: Connection closed by 139.178.89.65 port 46786 May 27 17:09:53.249905 sshd-session[6147]: pam_unix(sshd:session): session closed for user core May 27 17:09:53.257351 systemd[1]: sshd@8-91.99.121.210:22-139.178.89.65:46786.service: Deactivated successfully. May 27 17:09:53.262976 systemd[1]: session-9.scope: Deactivated successfully. May 27 17:09:53.265519 systemd-logind[1494]: Session 9 logged out. Waiting for processes to exit. May 27 17:09:53.268088 systemd-logind[1494]: Removed session 9. May 27 17:09:53.423586 systemd[1]: Started sshd@9-91.99.121.210:22-139.178.89.65:48844.service - OpenSSH per-connection server daemon (139.178.89.65:48844). May 27 17:09:54.429750 sshd[6163]: Accepted publickey for core from 139.178.89.65 port 48844 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:09:54.432646 sshd-session[6163]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:09:54.439858 systemd-logind[1494]: New session 10 of user core. May 27 17:09:54.445723 systemd[1]: Started session-10.scope - Session 10 of User core. 
May 27 17:09:55.241452 sshd[6165]: Connection closed by 139.178.89.65 port 48844 May 27 17:09:55.242588 sshd-session[6163]: pam_unix(sshd:session): session closed for user core May 27 17:09:55.247560 systemd-logind[1494]: Session 10 logged out. Waiting for processes to exit. May 27 17:09:55.248081 systemd[1]: sshd@9-91.99.121.210:22-139.178.89.65:48844.service: Deactivated successfully. May 27 17:09:55.252633 systemd[1]: session-10.scope: Deactivated successfully. May 27 17:09:55.257059 systemd-logind[1494]: Removed session 10. May 27 17:09:55.418257 systemd[1]: Started sshd@10-91.99.121.210:22-139.178.89.65:48856.service - OpenSSH per-connection server daemon (139.178.89.65:48856). May 27 17:09:56.432326 sshd[6175]: Accepted publickey for core from 139.178.89.65 port 48856 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:09:56.434552 sshd-session[6175]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:09:56.440493 systemd-logind[1494]: New session 11 of user core. May 27 17:09:56.450730 systemd[1]: Started session-11.scope - Session 11 of User core. May 27 17:09:57.200174 sshd[6177]: Connection closed by 139.178.89.65 port 48856 May 27 17:09:57.200929 sshd-session[6175]: pam_unix(sshd:session): session closed for user core May 27 17:09:57.207312 systemd[1]: sshd@10-91.99.121.210:22-139.178.89.65:48856.service: Deactivated successfully. May 27 17:09:57.210255 systemd[1]: session-11.scope: Deactivated successfully. May 27 17:09:57.213520 systemd-logind[1494]: Session 11 logged out. Waiting for processes to exit. May 27 17:09:57.216417 systemd-logind[1494]: Removed session 11. May 27 17:09:57.374392 kubelet[2795]: E0527 17:09:57.373658 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:09:58.713748 containerd[1514]: time="2025-05-27T17:09:58.713274540Z" level=warning msg="container event discarded" container=4c0a359b0c64ae004c520b6bbf18759b3d9168e16af4c0949721f3e1d414bb7c type=CONTAINER_CREATED_EVENT May 27 17:09:58.713748 containerd[1514]: time="2025-05-27T17:09:58.713359421Z" level=warning msg="container event discarded" container=4c0a359b0c64ae004c520b6bbf18759b3d9168e16af4c0949721f3e1d414bb7c type=CONTAINER_STARTED_EVENT May 27 17:09:58.754861 containerd[1514]: time="2025-05-27T17:09:58.754697105Z" level=warning msg="container event discarded" 
container=ad52d5190a1567bca77ebe056220bb50f25c37977966f4c992d5d8d1eee2c15c type=CONTAINER_CREATED_EVENT May 27 17:09:58.827161 containerd[1514]: time="2025-05-27T17:09:58.827039193Z" level=warning msg="container event discarded" container=ad52d5190a1567bca77ebe056220bb50f25c37977966f4c992d5d8d1eee2c15c type=CONTAINER_STARTED_EVENT May 27 17:09:59.031765 containerd[1514]: time="2025-05-27T17:09:59.031484407Z" level=warning msg="container event discarded" container=f2027ba5be70fd65e9e733157711c80dc4d3d6745e2d8a6449a78baf0bab0353 type=CONTAINER_CREATED_EVENT May 27 17:09:59.031765 containerd[1514]: time="2025-05-27T17:09:59.031565447Z" level=warning msg="container event discarded" container=f2027ba5be70fd65e9e733157711c80dc4d3d6745e2d8a6449a78baf0bab0353 type=CONTAINER_STARTED_EVENT May 27 17:09:59.373213 kubelet[2795]: E0527 17:09:59.373082 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:10:02.147579 containerd[1514]: time="2025-05-27T17:10:02.147447640Z" level=warning msg="container event discarded" container=061fdbc7144543133df746c0a3eb05c92f58f8ced145123cfb4c141890f73bc6 type=CONTAINER_CREATED_EVENT May 27 17:10:02.220967 containerd[1514]: time="2025-05-27T17:10:02.220875570Z" level=warning msg="container event discarded" container=061fdbc7144543133df746c0a3eb05c92f58f8ced145123cfb4c141890f73bc6 type=CONTAINER_STARTED_EVENT May 27 17:10:02.379742 systemd[1]: Started sshd@11-91.99.121.210:22-139.178.89.65:48866.service - OpenSSH per-connection server daemon (139.178.89.65:48866). May 27 17:10:03.408313 sshd[6195]: Accepted publickey for core from 139.178.89.65 port 48866 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:10:03.410558 sshd-session[6195]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:10:03.417497 systemd-logind[1494]: New session 12 of user core. May 27 17:10:03.425645 systemd[1]: Started session-12.scope - Session 12 of User core. May 27 17:10:04.173891 sshd[6197]: Connection closed by 139.178.89.65 port 48866 May 27 17:10:04.174290 sshd-session[6195]: pam_unix(sshd:session): session closed for user core May 27 17:10:04.180166 systemd[1]: sshd@11-91.99.121.210:22-139.178.89.65:48866.service: Deactivated successfully. May 27 17:10:04.184017 systemd[1]: session-12.scope: Deactivated successfully. May 27 17:10:04.188000 systemd-logind[1494]: Session 12 logged out. Waiting for processes to exit. May 27 17:10:04.190291 systemd-logind[1494]: Removed session 12. 
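Note: the systemd-logind entries above come in matching pairs ("New session N of user core." / "Removed session N."), so per-session SSH durations can be recovered directly from the journal text. A small sketch, assuming a same-day excerpt as input; the two sample lines are copied from the session 8 entries above.

import re
from datetime import datetime

SAMPLE = """\
May 27 17:09:45.488453 systemd-logind[1494]: New session 8 of user core.
May 27 17:09:46.316905 systemd-logind[1494]: Removed session 8.
"""

STAMP = "%b %d %H:%M:%S.%f"   # e.g. "May 27 17:09:45.488453"
opened = {}

for line in SAMPLE.splitlines():
    ts = datetime.strptime(line[:22], STAMP)
    if m := re.search(r"New session (\d+) of user", line):
        opened[m.group(1)] = ts
    elif m := re.search(r"Removed session (\d+)\.", line):
        start = opened.pop(m.group(1), None)
        if start is not None:
            print(f"session {m.group(1)}: {(ts - start).total_seconds():.1f}s")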
May 27 17:10:07.514245 containerd[1514]: time="2025-05-27T17:10:07.514202981Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" id:\"7c2e01db7de2d7f1bc71939e64927bd575acbd4d7056146213c6443a5f51ae21\" pid:6221 exited_at:{seconds:1748365807 nanos:513699379}" May 27 17:10:07.839500 containerd[1514]: time="2025-05-27T17:10:07.839459654Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b\" id:\"46cec09355f886e709378f4c823318b05148aa67de0c49d86dbacce0a25da156\" pid:6244 exited_at:{seconds:1748365807 nanos:839157453}" May 27 17:10:09.347291 systemd[1]: Started sshd@12-91.99.121.210:22-139.178.89.65:56758.service - OpenSSH per-connection server daemon (139.178.89.65:56758). May 27 17:10:10.365877 sshd[6257]: Accepted publickey for core from 139.178.89.65 port 56758 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:10:10.367822 sshd-session[6257]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:10:10.376270 kubelet[2795]: E0527 17:10:10.376068 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:10:10.377923 systemd-logind[1494]: New session 13 of user core. May 27 17:10:10.383235 systemd[1]: Started session-13.scope - Session 13 of User core. May 27 17:10:11.139442 sshd[6259]: Connection closed by 139.178.89.65 port 56758 May 27 17:10:11.140523 sshd-session[6257]: pam_unix(sshd:session): session closed for user core May 27 17:10:11.148844 systemd[1]: sshd@12-91.99.121.210:22-139.178.89.65:56758.service: Deactivated successfully. May 27 17:10:11.152800 systemd[1]: session-13.scope: Deactivated successfully. May 27 17:10:11.155616 systemd-logind[1494]: Session 13 logged out. Waiting for processes to exit. May 27 17:10:11.158658 systemd-logind[1494]: Removed session 13. 
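Note: the TaskExit events above carry both an RFC 3339 wall-clock timestamp and an exited_at epoch value; a quick check that they agree, using seconds:1748365807 from the 17:10:07 entry above.

from datetime import datetime, timezone

# exited_at from the TaskExit entry logged at 2025-05-27T17:10:07Z above
print(datetime.fromtimestamp(1748365807, tz=timezone.utc))  # 2025-05-27 17:10:07+00:00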
May 27 17:10:14.373342 kubelet[2795]: E0527 17:10:14.373208 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:10:16.305680 systemd[1]: Started sshd@13-91.99.121.210:22-139.178.89.65:38608.service - OpenSSH per-connection server daemon (139.178.89.65:38608). May 27 17:10:16.844877 containerd[1514]: time="2025-05-27T17:10:16.844809667Z" level=warning msg="container event discarded" container=a0f7b4b06ed0fe2df1368e08cd4bd6cec4a7f17fcba6743bfd3aab3dbdcb4893 type=CONTAINER_CREATED_EVENT May 27 17:10:16.844877 containerd[1514]: time="2025-05-27T17:10:16.844865507Z" level=warning msg="container event discarded" container=a0f7b4b06ed0fe2df1368e08cd4bd6cec4a7f17fcba6743bfd3aab3dbdcb4893 type=CONTAINER_STARTED_EVENT May 27 17:10:16.970905 containerd[1514]: time="2025-05-27T17:10:16.970804152Z" level=warning msg="container event discarded" container=66bdf6f18befaf96b85076e47dc7d6c036fd235152559d3d7c17b2387310e55c type=CONTAINER_CREATED_EVENT May 27 17:10:16.970905 containerd[1514]: time="2025-05-27T17:10:16.970890392Z" level=warning msg="container event discarded" container=66bdf6f18befaf96b85076e47dc7d6c036fd235152559d3d7c17b2387310e55c type=CONTAINER_STARTED_EVENT May 27 17:10:17.304669 sshd[6278]: Accepted publickey for core from 139.178.89.65 port 38608 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:10:17.306122 sshd-session[6278]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:10:17.314466 systemd-logind[1494]: New session 14 of user core. May 27 17:10:17.320838 systemd[1]: Started session-14.scope - Session 14 of User core. May 27 17:10:17.775682 containerd[1514]: time="2025-05-27T17:10:17.775143606Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" id:\"bc9a30bf0cc6205c95ba98c2dd0c5992e80f88b3d85bb60f2f20d5252014d369\" pid:6294 exited_at:{seconds:1748365817 nanos:774418843}" May 27 17:10:18.097953 sshd[6280]: Connection closed by 139.178.89.65 port 38608 May 27 17:10:18.097323 sshd-session[6278]: pam_unix(sshd:session): session closed for user core May 27 17:10:18.103965 systemd-logind[1494]: Session 14 logged out. Waiting for processes to exit. May 27 17:10:18.104195 systemd[1]: sshd@13-91.99.121.210:22-139.178.89.65:38608.service: Deactivated successfully. May 27 17:10:18.108939 systemd[1]: session-14.scope: Deactivated successfully. May 27 17:10:18.117504 systemd-logind[1494]: Removed session 14. May 27 17:10:18.264599 systemd[1]: Started sshd@14-91.99.121.210:22-139.178.89.65:38610.service - OpenSSH per-connection server daemon (139.178.89.65:38610). 
May 27 17:10:19.261086 sshd[6313]: Accepted publickey for core from 139.178.89.65 port 38610 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:10:19.263446 sshd-session[6313]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:10:19.275471 systemd-logind[1494]: New session 15 of user core. May 27 17:10:19.281639 systemd[1]: Started session-15.scope - Session 15 of User core. May 27 17:10:19.842941 containerd[1514]: time="2025-05-27T17:10:19.842738821Z" level=warning msg="container event discarded" container=8c2e8ea1be64dc33d9d6cb44caff358f5bf7e876ce69b9e2d8e98a2d1db8c956 type=CONTAINER_CREATED_EVENT May 27 17:10:19.927585 containerd[1514]: time="2025-05-27T17:10:19.927503066Z" level=warning msg="container event discarded" container=8c2e8ea1be64dc33d9d6cb44caff358f5bf7e876ce69b9e2d8e98a2d1db8c956 type=CONTAINER_STARTED_EVENT May 27 17:10:20.192727 sshd[6315]: Connection closed by 139.178.89.65 port 38610 May 27 17:10:20.193585 sshd-session[6313]: pam_unix(sshd:session): session closed for user core May 27 17:10:20.198909 systemd[1]: sshd@14-91.99.121.210:22-139.178.89.65:38610.service: Deactivated successfully. May 27 17:10:20.205152 systemd[1]: session-15.scope: Deactivated successfully. May 27 17:10:20.208732 systemd-logind[1494]: Session 15 logged out. Waiting for processes to exit. May 27 17:10:20.212258 systemd-logind[1494]: Removed session 15. May 27 17:10:20.364957 systemd[1]: Started sshd@15-91.99.121.210:22-139.178.89.65:38616.service - OpenSSH per-connection server daemon (139.178.89.65:38616). May 27 17:10:21.360046 sshd[6325]: Accepted publickey for core from 139.178.89.65 port 38616 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:10:21.363006 sshd-session[6325]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:10:21.370219 containerd[1514]: time="2025-05-27T17:10:21.370123586Z" level=warning msg="container event discarded" container=58b35d3283e171ec5c89b790e9018338ab87f8ed940fae6b10a6a7fd14df9f67 type=CONTAINER_CREATED_EVENT May 27 17:10:21.374351 kubelet[2795]: E0527 17:10:21.374300 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:10:21.376745 systemd-logind[1494]: New session 16 of user core. May 27 17:10:21.381577 systemd[1]: Started session-16.scope - Session 16 of User core. 
May 27 17:10:21.458946 containerd[1514]: time="2025-05-27T17:10:21.458824445Z" level=warning msg="container event discarded" container=58b35d3283e171ec5c89b790e9018338ab87f8ed940fae6b10a6a7fd14df9f67 type=CONTAINER_STARTED_EVENT May 27 17:10:21.650165 containerd[1514]: time="2025-05-27T17:10:21.649255813Z" level=warning msg="container event discarded" container=58b35d3283e171ec5c89b790e9018338ab87f8ed940fae6b10a6a7fd14df9f67 type=CONTAINER_STOPPED_EVENT May 27 17:10:23.215864 sshd[6327]: Connection closed by 139.178.89.65 port 38616 May 27 17:10:23.216909 sshd-session[6325]: pam_unix(sshd:session): session closed for user core May 27 17:10:23.225359 systemd-logind[1494]: Session 16 logged out. Waiting for processes to exit. May 27 17:10:23.227147 systemd[1]: sshd@15-91.99.121.210:22-139.178.89.65:38616.service: Deactivated successfully. May 27 17:10:23.231160 systemd[1]: session-16.scope: Deactivated successfully. May 27 17:10:23.233602 systemd-logind[1494]: Removed session 16. May 27 17:10:23.397683 systemd[1]: Started sshd@16-91.99.121.210:22-139.178.89.65:43714.service - OpenSSH per-connection server daemon (139.178.89.65:43714). May 27 17:10:24.432214 sshd[6344]: Accepted publickey for core from 139.178.89.65 port 43714 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:10:24.434273 sshd-session[6344]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:10:24.440238 systemd-logind[1494]: New session 17 of user core. May 27 17:10:24.444649 systemd[1]: Started session-17.scope - Session 17 of User core. May 27 17:10:25.332475 sshd[6346]: Connection closed by 139.178.89.65 port 43714 May 27 17:10:25.333457 sshd-session[6344]: pam_unix(sshd:session): session closed for user core May 27 17:10:25.339324 systemd-logind[1494]: Session 17 logged out. Waiting for processes to exit. May 27 17:10:25.339798 systemd[1]: sshd@16-91.99.121.210:22-139.178.89.65:43714.service: Deactivated successfully. May 27 17:10:25.342608 systemd[1]: session-17.scope: Deactivated successfully. May 27 17:10:25.346651 systemd-logind[1494]: Removed session 17. May 27 17:10:25.502839 systemd[1]: Started sshd@17-91.99.121.210:22-139.178.89.65:43730.service - OpenSSH per-connection server daemon (139.178.89.65:43730). May 27 17:10:26.235606 containerd[1514]: time="2025-05-27T17:10:26.235336719Z" level=warning msg="container event discarded" container=9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6 type=CONTAINER_CREATED_EVENT May 27 17:10:26.310920 containerd[1514]: time="2025-05-27T17:10:26.310841485Z" level=warning msg="container event discarded" container=9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6 type=CONTAINER_STARTED_EVENT May 27 17:10:26.517179 sshd[6356]: Accepted publickey for core from 139.178.89.65 port 43730 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:10:26.518547 sshd-session[6356]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:10:26.523960 systemd-logind[1494]: New session 18 of user core. May 27 17:10:26.529820 systemd[1]: Started session-18.scope - Session 18 of User core. 
May 27 17:10:27.035541 containerd[1514]: time="2025-05-27T17:10:27.035453675Z" level=warning msg="container event discarded" container=9a22b6a718c8a7f7a1454f8808e3425f3fc7b5b7ffd49197550724bd6b5fcab6 type=CONTAINER_STOPPED_EVENT May 27 17:10:27.284542 sshd[6372]: Connection closed by 139.178.89.65 port 43730 May 27 17:10:27.283798 sshd-session[6356]: pam_unix(sshd:session): session closed for user core May 27 17:10:27.289782 systemd[1]: sshd@17-91.99.121.210:22-139.178.89.65:43730.service: Deactivated successfully. May 27 17:10:27.293357 systemd[1]: session-18.scope: Deactivated successfully. May 27 17:10:27.295558 systemd-logind[1494]: Session 18 logged out. Waiting for processes to exit. May 27 17:10:27.297498 systemd-logind[1494]: Removed session 18. May 27 17:10:28.372458 kubelet[2795]: E0527 17:10:28.372268 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:10:32.472178 systemd[1]: Started sshd@18-91.99.121.210:22-139.178.89.65:43746.service - OpenSSH per-connection server daemon (139.178.89.65:43746). May 27 17:10:33.377240 kubelet[2795]: E0527 17:10:33.377163 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:10:33.508528 sshd[6387]: Accepted publickey for core from 139.178.89.65 port 43746 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:10:33.510729 sshd-session[6387]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:10:33.515661 systemd-logind[1494]: New session 19 of user core. May 27 17:10:33.530049 systemd[1]: Started session-19.scope - Session 19 of User core. 
May 27 17:10:34.138428 update_engine[1495]: I20250527 17:10:34.138095 1495 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs May 27 17:10:34.138428 update_engine[1495]: I20250527 17:10:34.138154 1495 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs May 27 17:10:34.139310 update_engine[1495]: I20250527 17:10:34.139012 1495 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs May 27 17:10:34.141194 update_engine[1495]: I20250527 17:10:34.141078 1495 omaha_request_params.cc:62] Current group set to alpha May 27 17:10:34.143243 update_engine[1495]: I20250527 17:10:34.142727 1495 update_attempter.cc:499] Already updated boot flags. Skipping. May 27 17:10:34.143243 update_engine[1495]: I20250527 17:10:34.143096 1495 update_attempter.cc:643] Scheduling an action processor start. May 27 17:10:34.143243 update_engine[1495]: I20250527 17:10:34.143137 1495 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 27 17:10:34.147555 update_engine[1495]: I20250527 17:10:34.147229 1495 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs May 27 17:10:34.147555 update_engine[1495]: I20250527 17:10:34.147351 1495 omaha_request_action.cc:271] Posting an Omaha request to disabled May 27 17:10:34.148661 update_engine[1495]: I20250527 17:10:34.147357 1495 omaha_request_action.cc:272] Request: May 27 17:10:34.148661 update_engine[1495]: May 27 17:10:34.148661 update_engine[1495]: May 27 17:10:34.148661 update_engine[1495]: May 27 17:10:34.148661 update_engine[1495]: May 27 17:10:34.148661 update_engine[1495]: May 27 17:10:34.148661 update_engine[1495]: May 27 17:10:34.148661 update_engine[1495]: May 27 17:10:34.148661 update_engine[1495]: May 27 17:10:34.148661 update_engine[1495]: I20250527 17:10:34.147887 1495 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 17:10:34.153069 update_engine[1495]: I20250527 17:10:34.152879 1495 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 17:10:34.153824 update_engine[1495]: I20250527 17:10:34.153788 1495 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 17:10:34.154816 locksmithd[1535]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 May 27 17:10:34.158006 update_engine[1495]: E20250527 17:10:34.157824 1495 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 17:10:34.158006 update_engine[1495]: I20250527 17:10:34.157915 1495 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 May 27 17:10:34.278997 sshd[6389]: Connection closed by 139.178.89.65 port 43746 May 27 17:10:34.279850 sshd-session[6387]: pam_unix(sshd:session): session closed for user core May 27 17:10:34.285303 systemd-logind[1494]: Session 19 logged out. Waiting for processes to exit. May 27 17:10:34.287823 systemd[1]: sshd@18-91.99.121.210:22-139.178.89.65:43746.service: Deactivated successfully. May 27 17:10:34.291801 systemd[1]: session-19.scope: Deactivated successfully. May 27 17:10:34.295150 systemd-logind[1494]: Removed session 19. 
May 27 17:10:34.585550 containerd[1514]: time="2025-05-27T17:10:34.585463676Z" level=warning msg="container event discarded" container=cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b type=CONTAINER_CREATED_EVENT May 27 17:10:34.702494 containerd[1514]: time="2025-05-27T17:10:34.702349115Z" level=warning msg="container event discarded" container=cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b type=CONTAINER_STARTED_EVENT May 27 17:10:36.496555 containerd[1514]: time="2025-05-27T17:10:36.496458601Z" level=warning msg="container event discarded" container=62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c type=CONTAINER_CREATED_EVENT May 27 17:10:36.496555 containerd[1514]: time="2025-05-27T17:10:36.496534921Z" level=warning msg="container event discarded" container=62e302d2d3207e263b7985e17f8817d97fbcf34a160d80136fd720ec17a0f64c type=CONTAINER_STARTED_EVENT May 27 17:10:37.938386 containerd[1514]: time="2025-05-27T17:10:37.938309757Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cda009b981a430a91a018edf4003835fa8c70b1a09be89fada3a9749c6375a3b\" id:\"2ec42e8f13f73399c8dd3749b13da0a1962b7d38c7365f69457870f163c4a43d\" pid:6412 exited_at:{seconds:1748365837 nanos:937969516}" May 27 17:10:38.756256 containerd[1514]: time="2025-05-27T17:10:38.756171533Z" level=warning msg="container event discarded" container=55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a type=CONTAINER_CREATED_EVENT May 27 17:10:38.756256 containerd[1514]: time="2025-05-27T17:10:38.756242733Z" level=warning msg="container event discarded" container=55a58b075ded2a6779bc4668428cc485acd0d5919c6b2d6883e78922cedbc34a type=CONTAINER_STARTED_EVENT May 27 17:10:39.459999 systemd[1]: Started sshd@19-91.99.121.210:22-139.178.89.65:45522.service - OpenSSH per-connection server daemon (139.178.89.65:45522). 
May 27 17:10:39.917812 containerd[1514]: time="2025-05-27T17:10:39.917649028Z" level=warning msg="container event discarded" container=fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc type=CONTAINER_CREATED_EVENT May 27 17:10:39.917812 containerd[1514]: time="2025-05-27T17:10:39.917728189Z" level=warning msg="container event discarded" container=fc4ccdd47fa619778917ea9a8b752d9ef1274238f57c957a55c52a812b91d9cc type=CONTAINER_STARTED_EVENT May 27 17:10:39.960702 containerd[1514]: time="2025-05-27T17:10:39.960503308Z" level=warning msg="container event discarded" container=32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c type=CONTAINER_CREATED_EVENT May 27 17:10:39.961045 containerd[1514]: time="2025-05-27T17:10:39.960978750Z" level=warning msg="container event discarded" container=32eafae9a5eb421ae1754a8899f98d9436cb3e0b5e4d3ebe9765ef47a813c90c type=CONTAINER_STARTED_EVENT May 27 17:10:40.024346 containerd[1514]: time="2025-05-27T17:10:40.024256986Z" level=warning msg="container event discarded" container=66060a88ec6c20c6e074fcf17f9bec4a27db5d1c65b04ab5d71acc527fab0442 type=CONTAINER_CREATED_EVENT May 27 17:10:40.046907 containerd[1514]: time="2025-05-27T17:10:40.046821990Z" level=warning msg="container event discarded" container=becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e type=CONTAINER_CREATED_EVENT May 27 17:10:40.046907 containerd[1514]: time="2025-05-27T17:10:40.046889431Z" level=warning msg="container event discarded" container=becdf5b47f8aa7803eca62ad47ee8de2fe7f4783e461095127d002c7997eec7e type=CONTAINER_STARTED_EVENT May 27 17:10:40.101475 containerd[1514]: time="2025-05-27T17:10:40.101355234Z" level=warning msg="container event discarded" container=66060a88ec6c20c6e074fcf17f9bec4a27db5d1c65b04ab5d71acc527fab0442 type=CONTAINER_STARTED_EVENT May 27 17:10:40.473246 sshd[6423]: Accepted publickey for core from 139.178.89.65 port 45522 ssh2: RSA SHA256:1BpXOi1866ZaRYcIxpX/v1s76zFM3yHLr+cXft9FYB4 May 27 17:10:40.475961 sshd-session[6423]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:10:40.482220 systemd-logind[1494]: New session 20 of user core. May 27 17:10:40.490679 systemd[1]: Started session-20.scope - Session 20 of User core. 
May 27 17:10:40.860997 containerd[1514]: time="2025-05-27T17:10:40.860881305Z" level=warning msg="container event discarded" container=37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d type=CONTAINER_CREATED_EVENT May 27 17:10:40.860997 containerd[1514]: time="2025-05-27T17:10:40.860954225Z" level=warning msg="container event discarded" container=37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d type=CONTAINER_STARTED_EVENT May 27 17:10:40.876482 containerd[1514]: time="2025-05-27T17:10:40.876140042Z" level=warning msg="container event discarded" container=0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448 type=CONTAINER_CREATED_EVENT May 27 17:10:40.876482 containerd[1514]: time="2025-05-27T17:10:40.876215722Z" level=warning msg="container event discarded" container=0e9a79663c0117d80508e9511f41f23034240bbbb107409671018f6c50379448 type=CONTAINER_STARTED_EVENT May 27 17:10:40.905577 containerd[1514]: time="2025-05-27T17:10:40.905468151Z" level=warning msg="container event discarded" container=29a443f4ebca27f974b30a9d470e56fadbbdc220b17c9e4b81e25bc89b0b975c type=CONTAINER_CREATED_EVENT May 27 17:10:40.965921 containerd[1514]: time="2025-05-27T17:10:40.965798056Z" level=warning msg="container event discarded" container=29a443f4ebca27f974b30a9d470e56fadbbdc220b17c9e4b81e25bc89b0b975c type=CONTAINER_STARTED_EVENT May 27 17:10:41.247911 sshd[6425]: Connection closed by 139.178.89.65 port 45522 May 27 17:10:41.247608 sshd-session[6423]: pam_unix(sshd:session): session closed for user core May 27 17:10:41.252793 systemd-logind[1494]: Session 20 logged out. Waiting for processes to exit. May 27 17:10:41.253521 systemd[1]: sshd@19-91.99.121.210:22-139.178.89.65:45522.service: Deactivated successfully. May 27 17:10:41.256476 systemd[1]: session-20.scope: Deactivated successfully. May 27 17:10:41.258737 systemd-logind[1494]: Removed session 20. 
May 27 17:10:41.913994 containerd[1514]: time="2025-05-27T17:10:41.913815586Z" level=warning msg="container event discarded" container=0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa type=CONTAINER_CREATED_EVENT May 27 17:10:41.913994 containerd[1514]: time="2025-05-27T17:10:41.913945546Z" level=warning msg="container event discarded" container=0ff48fb5f8efd7d38ee9f2c568b5feb2fc3eb4c687cda1c44dd95fa98e239dfa type=CONTAINER_STARTED_EVENT May 27 17:10:42.726987 containerd[1514]: time="2025-05-27T17:10:42.726906410Z" level=warning msg="container event discarded" container=111be39bb411d19978efecf2178d33ea1690144ed1176e97d6d76dd95117aacb type=CONTAINER_CREATED_EVENT May 27 17:10:42.900194 containerd[1514]: time="2025-05-27T17:10:42.900057174Z" level=warning msg="container event discarded" container=111be39bb411d19978efecf2178d33ea1690144ed1176e97d6d76dd95117aacb type=CONTAINER_STARTED_EVENT May 27 17:10:43.024114 containerd[1514]: time="2025-05-27T17:10:43.023322672Z" level=warning msg="container event discarded" container=c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89 type=CONTAINER_CREATED_EVENT May 27 17:10:43.024114 containerd[1514]: time="2025-05-27T17:10:43.023643834Z" level=warning msg="container event discarded" container=c3aba880c0db095a01c58b9a29b57644a257c08c5f3a33018a3272dc0f8e8a89 type=CONTAINER_STARTED_EVENT May 27 17:10:43.373390 kubelet[2795]: E0527 17:10:43.372408 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-lflfc" podUID="45d42809-0456-4761-94f0-815274f2dcfd" May 27 17:10:44.138326 update_engine[1495]: I20250527 17:10:44.138216 1495 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 17:10:44.139097 update_engine[1495]: I20250527 17:10:44.138489 1495 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 17:10:44.139097 update_engine[1495]: I20250527 17:10:44.138793 1495 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 27 17:10:44.139214 update_engine[1495]: E20250527 17:10:44.139137 1495 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 17:10:44.139214 update_engine[1495]: I20250527 17:10:44.139206 1495 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 May 27 17:10:44.373593 kubelet[2795]: E0527 17:10:44.373429 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-78c6556c4f-rjlzx" podUID="432a2b34-eaf5-4f72-a2b6-f15f78b36b83" May 27 17:10:47.415327 containerd[1514]: time="2025-05-27T17:10:47.414755716Z" level=warning msg="container event discarded" container=6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a type=CONTAINER_CREATED_EVENT May 27 17:10:47.494419 containerd[1514]: time="2025-05-27T17:10:47.494301531Z" level=warning msg="container event discarded" container=6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a type=CONTAINER_STARTED_EVENT May 27 17:10:47.777352 containerd[1514]: time="2025-05-27T17:10:47.776490814Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6364e7ad2186f42e8886a4647c12c9e5d495f3ce9e8c2f93758c44b979d6a83a\" id:\"8e81daad87d27d213ccaf0db6d62aeec42de26ec26aef49aca1e53fe0aa253a1\" pid:6448 exited_at:{seconds:1748365847 nanos:775331210}" May 27 17:10:49.051000 containerd[1514]: time="2025-05-27T17:10:49.050893045Z" level=warning msg="container event discarded" container=006734d9f66075082cb4e7a2a1a95f105d3a8ce39643488ff52ceaa8b38b2072 type=CONTAINER_CREATED_EVENT May 27 17:10:49.172521 containerd[1514]: time="2025-05-27T17:10:49.172435933Z" level=warning msg="container event discarded" container=006734d9f66075082cb4e7a2a1a95f105d3a8ce39643488ff52ceaa8b38b2072 type=CONTAINER_STARTED_EVENT May 27 17:10:49.422684 containerd[1514]: time="2025-05-27T17:10:49.422453456Z" level=warning msg="container event discarded" container=695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba type=CONTAINER_CREATED_EVENT May 27 17:10:49.531329 containerd[1514]: time="2025-05-27T17:10:49.531027017Z" level=warning msg="container event discarded" container=695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba type=CONTAINER_STARTED_EVENT May 27 17:10:49.928561 containerd[1514]: time="2025-05-27T17:10:49.928474364Z" level=warning msg="container event discarded" container=1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122 type=CONTAINER_CREATED_EVENT
May 27 17:10:50.035974 containerd[1514]: time="2025-05-27T17:10:50.035866481Z" level=warning msg="container event discarded" container=1cd5f2e3edc3c8abb3921e5b38f34614d1c25926fd439de9721ba0f6c71bf122 type=CONTAINER_STARTED_EVENT May 27 17:10:50.053291 containerd[1514]: time="2025-05-27T17:10:50.053206585Z" level=warning msg="container event discarded" container=695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba type=CONTAINER_STOPPED_EVENT May 27 17:10:50.206555 containerd[1514]: time="2025-05-27T17:10:50.206371870Z" level=warning msg="container event discarded" container=5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11 type=CONTAINER_CREATED_EVENT May 27 17:10:50.206555 containerd[1514]: time="2025-05-27T17:10:50.206440950Z" level=warning msg="container event discarded" container=5a95cabc4380cab6012b6e9011f2287375d3dc6d6f81da8c783e3ca3b47d1f11 type=CONTAINER_STARTED_EVENT May 27 17:10:50.286140 containerd[1514]: time="2025-05-27T17:10:50.286032363Z" level=warning msg="container event discarded" container=7abfe26d7cbf26332ce145533b269156b54846461a143fa7687d6dd0b3126632 type=CONTAINER_CREATED_EVENT May 27 17:10:50.306674 containerd[1514]: time="2025-05-27T17:10:50.306580759Z" level=warning msg="container event discarded" container=37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d type=CONTAINER_STOPPED_EVENT May 27 17:10:50.493659 containerd[1514]: time="2025-05-27T17:10:50.493435608Z" level=warning msg="container event discarded" container=7abfe26d7cbf26332ce145533b269156b54846461a143fa7687d6dd0b3126632 type=CONTAINER_STARTED_EVENT May 27 17:10:50.853667 containerd[1514]: time="2025-05-27T17:10:50.853481776Z" level=warning msg="container event discarded" container=695e6e3f860d9e38c6f5bb78028dab3de557eec118706c83a40734d38cc9e5ba type=CONTAINER_DELETED_EVENT May 27 17:10:51.711770 containerd[1514]: time="2025-05-27T17:10:51.711625859Z" level=warning msg="container event discarded" container=37dd4ee719334f4f20a076c1d1945fa38c464de1116e68137db42bbd11b8398d type=CONTAINER_DELETED_EVENT May 27 17:10:52.505743 containerd[1514]: time="2025-05-27T17:10:52.505636302Z" level=warning msg="container event discarded" container=cfe34f59449361685a0a77e9b68f936745f36ae5c7172717a4ecf528d636880f type=CONTAINER_CREATED_EVENT May 27 17:10:52.634581 containerd[1514]: time="2025-05-27T17:10:52.634511537Z" level=warning msg="container event discarded" container=cfe34f59449361685a0a77e9b68f936745f36ae5c7172717a4ecf528d636880f type=CONTAINER_STARTED_EVENT May 27 17:10:54.147161 update_engine[1495]: I20250527 17:10:54.147002 1495 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 17:10:54.148011 update_engine[1495]: I20250527 17:10:54.147526 1495 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 17:10:54.148011 update_engine[1495]: I20250527 17:10:54.147959 1495 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 17:10:54.148678 update_engine[1495]: E20250527 17:10:54.148622 1495 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 17:10:54.148795 update_engine[1495]: I20250527 17:10:54.148718 1495 libcurl_http_fetcher.cc:283] No HTTP response, retry 3