May 27 17:19:08.042262 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd083] May 27 17:19:08.042304 kernel: Linux version 6.12.30-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Tue May 27 15:31:23 -00 2025 May 27 17:19:08.042328 kernel: KASLR disabled due to lack of seed May 27 17:19:08.042344 kernel: efi: EFI v2.7 by EDK II May 27 17:19:08.042360 kernel: efi: SMBIOS=0x7bed0000 SMBIOS 3.0=0x7beb0000 ACPI=0x786e0000 ACPI 2.0=0x786e0014 MEMATTR=0x7a733a98 MEMRESERVE=0x78551598 May 27 17:19:08.042375 kernel: secureboot: Secure boot disabled May 27 17:19:08.042392 kernel: ACPI: Early table checksum verification disabled May 27 17:19:08.042406 kernel: ACPI: RSDP 0x00000000786E0014 000024 (v02 AMAZON) May 27 17:19:08.042422 kernel: ACPI: XSDT 0x00000000786D00E8 000064 (v01 AMAZON AMZNFACP 00000001 01000013) May 27 17:19:08.042437 kernel: ACPI: FACP 0x00000000786B0000 000114 (v06 AMAZON AMZNFACP 00000001 AMZN 00000001) May 27 17:19:08.042456 kernel: ACPI: DSDT 0x0000000078640000 00159D (v02 AMAZON AMZNDSDT 00000001 INTL 20160527) May 27 17:19:08.042472 kernel: ACPI: FACS 0x0000000078630000 000040 May 27 17:19:08.042487 kernel: ACPI: APIC 0x00000000786C0000 000108 (v04 AMAZON AMZNAPIC 00000001 AMZN 00000001) May 27 17:19:08.042502 kernel: ACPI: SPCR 0x00000000786A0000 000050 (v02 AMAZON AMZNSPCR 00000001 AMZN 00000001) May 27 17:19:08.042520 kernel: ACPI: GTDT 0x0000000078690000 000060 (v02 AMAZON AMZNGTDT 00000001 AMZN 00000001) May 27 17:19:08.042537 kernel: ACPI: MCFG 0x0000000078680000 00003C (v02 AMAZON AMZNMCFG 00000001 AMZN 00000001) May 27 17:19:08.042556 kernel: ACPI: SLIT 0x0000000078670000 00002D (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001) May 27 17:19:08.042572 kernel: ACPI: IORT 0x0000000078660000 000078 (v01 AMAZON AMZNIORT 00000001 AMZN 00000001) May 27 17:19:08.042588 kernel: ACPI: PPTT 0x0000000078650000 0000EC (v01 AMAZON AMZNPPTT 00000001 AMZN 00000001) May 27 17:19:08.042603 kernel: ACPI: SPCR: console: uart,mmio,0x90a0000,115200 May 27 17:19:08.042619 kernel: earlycon: uart0 at MMIO 0x00000000090a0000 (options '115200') May 27 17:19:08.042634 kernel: printk: legacy bootconsole [uart0] enabled May 27 17:19:08.042650 kernel: ACPI: Use ACPI SPCR as default console: Yes May 27 17:19:08.042666 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000004b5ffffff] May 27 17:19:08.042681 kernel: NODE_DATA(0) allocated [mem 0x4b584cdc0-0x4b5853fff] May 27 17:19:08.042697 kernel: Zone ranges: May 27 17:19:08.042718 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] May 27 17:19:08.042736 kernel: DMA32 empty May 27 17:19:08.042752 kernel: Normal [mem 0x0000000100000000-0x00000004b5ffffff] May 27 17:19:08.042767 kernel: Device empty May 27 17:19:08.042782 kernel: Movable zone start for each node May 27 17:19:08.042798 kernel: Early memory node ranges May 27 17:19:08.042813 kernel: node 0: [mem 0x0000000040000000-0x000000007862ffff] May 27 17:19:08.042829 kernel: node 0: [mem 0x0000000078630000-0x000000007863ffff] May 27 17:19:08.042844 kernel: node 0: [mem 0x0000000078640000-0x00000000786effff] May 27 17:19:08.042860 kernel: node 0: [mem 0x00000000786f0000-0x000000007872ffff] May 27 17:19:08.042875 kernel: node 0: [mem 0x0000000078730000-0x000000007bbfffff] May 27 17:19:08.042919 kernel: node 0: [mem 0x000000007bc00000-0x000000007bfdffff] May 27 17:19:08.042941 kernel: node 0: [mem 0x000000007bfe0000-0x000000007fffffff] May 27 17:19:08.042958 
kernel: node 0: [mem 0x0000000400000000-0x00000004b5ffffff] May 27 17:19:08.042980 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000004b5ffffff] May 27 17:19:08.042997 kernel: On node 0, zone Normal: 8192 pages in unavailable ranges May 27 17:19:08.043013 kernel: psci: probing for conduit method from ACPI. May 27 17:19:08.043033 kernel: psci: PSCIv1.0 detected in firmware. May 27 17:19:08.043050 kernel: psci: Using standard PSCI v0.2 function IDs May 27 17:19:08.043066 kernel: psci: Trusted OS migration not required May 27 17:19:08.043082 kernel: psci: SMC Calling Convention v1.1 May 27 17:19:08.043098 kernel: percpu: Embedded 33 pages/cpu s98136 r8192 d28840 u135168 May 27 17:19:08.043115 kernel: pcpu-alloc: s98136 r8192 d28840 u135168 alloc=33*4096 May 27 17:19:08.043131 kernel: pcpu-alloc: [0] 0 [0] 1 May 27 17:19:08.043147 kernel: Detected PIPT I-cache on CPU0 May 27 17:19:08.043164 kernel: CPU features: detected: GIC system register CPU interface May 27 17:19:08.043180 kernel: CPU features: detected: Spectre-v2 May 27 17:19:08.043196 kernel: CPU features: detected: Spectre-v3a May 27 17:19:08.043213 kernel: CPU features: detected: Spectre-BHB May 27 17:19:08.043233 kernel: CPU features: detected: ARM erratum 1742098 May 27 17:19:08.043249 kernel: CPU features: detected: ARM errata 1165522, 1319367, or 1530923 May 27 17:19:08.043265 kernel: alternatives: applying boot alternatives May 27 17:19:08.043284 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=4e706b869299e1c88703222069cdfa08c45ebce568f762053eea5b3f5f0939c3 May 27 17:19:08.043301 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 27 17:19:08.043318 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) May 27 17:19:08.043334 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 27 17:19:08.043350 kernel: Fallback order for Node 0: 0 May 27 17:19:08.043367 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1007616 May 27 17:19:08.043383 kernel: Policy zone: Normal May 27 17:19:08.043403 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 27 17:19:08.043420 kernel: software IO TLB: area num 2. May 27 17:19:08.043436 kernel: software IO TLB: mapped [mem 0x000000007c000000-0x0000000080000000] (64MB) May 27 17:19:08.043452 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 May 27 17:19:08.043468 kernel: rcu: Preemptible hierarchical RCU implementation. May 27 17:19:08.043485 kernel: rcu: RCU event tracing is enabled. May 27 17:19:08.043502 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. May 27 17:19:08.043519 kernel: Trampoline variant of Tasks RCU enabled. May 27 17:19:08.043536 kernel: Tracing variant of Tasks RCU enabled. May 27 17:19:08.043552 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 27 17:19:08.043568 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 May 27 17:19:08.043585 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
May 27 17:19:08.043606 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 27 17:19:08.043622 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 May 27 17:19:08.043638 kernel: GICv3: 96 SPIs implemented May 27 17:19:08.043654 kernel: GICv3: 0 Extended SPIs implemented May 27 17:19:08.043670 kernel: Root IRQ handler: gic_handle_irq May 27 17:19:08.043686 kernel: GICv3: GICv3 features: 16 PPIs May 27 17:19:08.043702 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 May 27 17:19:08.043718 kernel: GICv3: CPU0: found redistributor 0 region 0:0x0000000010200000 May 27 17:19:08.043735 kernel: ITS [mem 0x10080000-0x1009ffff] May 27 17:19:08.043751 kernel: ITS@0x0000000010080000: allocated 8192 Devices @4000c0000 (indirect, esz 8, psz 64K, shr 1) May 27 17:19:08.043768 kernel: ITS@0x0000000010080000: allocated 8192 Interrupt Collections @4000d0000 (flat, esz 8, psz 64K, shr 1) May 27 17:19:08.043788 kernel: GICv3: using LPI property table @0x00000004000e0000 May 27 17:19:08.043804 kernel: ITS: Using hypervisor restricted LPI range [128] May 27 17:19:08.043820 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000004000f0000 May 27 17:19:08.043837 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. May 27 17:19:08.043853 kernel: arch_timer: cp15 timer(s) running at 83.33MHz (virt). May 27 17:19:08.043870 kernel: clocksource: arch_sys_counter: mask: 0x1ffffffffffffff max_cycles: 0x13381ebeec, max_idle_ns: 440795203145 ns May 27 17:19:08.043915 kernel: sched_clock: 57 bits at 83MHz, resolution 12ns, wraps every 4398046511100ns May 27 17:19:08.043934 kernel: Console: colour dummy device 80x25 May 27 17:19:08.043952 kernel: printk: legacy console [tty1] enabled May 27 17:19:08.043969 kernel: ACPI: Core revision 20240827 May 27 17:19:08.043986 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 166.66 BogoMIPS (lpj=83333) May 27 17:19:08.044008 kernel: pid_max: default: 32768 minimum: 301 May 27 17:19:08.044025 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima May 27 17:19:08.044042 kernel: landlock: Up and running. May 27 17:19:08.044058 kernel: SELinux: Initializing. May 27 17:19:08.044074 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 27 17:19:08.044091 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 27 17:19:08.044108 kernel: rcu: Hierarchical SRCU implementation. May 27 17:19:08.044125 kernel: rcu: Max phase no-delay instances is 400. May 27 17:19:08.044142 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level May 27 17:19:08.044162 kernel: Remapping and enabling EFI services. May 27 17:19:08.044179 kernel: smp: Bringing up secondary CPUs ... May 27 17:19:08.044195 kernel: Detected PIPT I-cache on CPU1 May 27 17:19:08.044212 kernel: GICv3: CPU1: found redistributor 1 region 0:0x0000000010220000 May 27 17:19:08.044229 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000400100000 May 27 17:19:08.044246 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd083] May 27 17:19:08.044266 kernel: smp: Brought up 1 node, 2 CPUs May 27 17:19:08.044283 kernel: SMP: Total of 2 processors activated. 
May 27 17:19:08.044299 kernel: CPU: All CPU(s) started at EL1 May 27 17:19:08.044320 kernel: CPU features: detected: 32-bit EL0 Support May 27 17:19:08.044348 kernel: CPU features: detected: 32-bit EL1 Support May 27 17:19:08.044365 kernel: CPU features: detected: CRC32 instructions May 27 17:19:08.044386 kernel: alternatives: applying system-wide alternatives May 27 17:19:08.044405 kernel: Memory: 3813536K/4030464K available (11072K kernel code, 2276K rwdata, 8936K rodata, 39424K init, 1034K bss, 212156K reserved, 0K cma-reserved) May 27 17:19:08.044422 kernel: devtmpfs: initialized May 27 17:19:08.044439 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 27 17:19:08.044457 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) May 27 17:19:08.044479 kernel: 17024 pages in range for non-PLT usage May 27 17:19:08.044496 kernel: 508544 pages in range for PLT usage May 27 17:19:08.044513 kernel: pinctrl core: initialized pinctrl subsystem May 27 17:19:08.044531 kernel: SMBIOS 3.0.0 present. May 27 17:19:08.044548 kernel: DMI: Amazon EC2 a1.large/, BIOS 1.0 11/1/2018 May 27 17:19:08.044565 kernel: DMI: Memory slots populated: 0/0 May 27 17:19:08.044583 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 27 17:19:08.044600 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations May 27 17:19:08.044618 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations May 27 17:19:08.044640 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations May 27 17:19:08.044657 kernel: audit: initializing netlink subsys (disabled) May 27 17:19:08.044674 kernel: audit: type=2000 audit(0.227:1): state=initialized audit_enabled=0 res=1 May 27 17:19:08.044692 kernel: thermal_sys: Registered thermal governor 'step_wise' May 27 17:19:08.044709 kernel: cpuidle: using governor menu May 27 17:19:08.044726 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
May 27 17:19:08.044744 kernel: ASID allocator initialised with 65536 entries May 27 17:19:08.044761 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 27 17:19:08.044782 kernel: Serial: AMBA PL011 UART driver May 27 17:19:08.044800 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages May 27 17:19:08.044817 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page May 27 17:19:08.044835 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages May 27 17:19:08.044852 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page May 27 17:19:08.044869 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 27 17:19:08.046944 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page May 27 17:19:08.046974 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages May 27 17:19:08.046993 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page May 27 17:19:08.047019 kernel: ACPI: Added _OSI(Module Device) May 27 17:19:08.047037 kernel: ACPI: Added _OSI(Processor Device) May 27 17:19:08.047055 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 27 17:19:08.047073 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 27 17:19:08.047090 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 27 17:19:08.047108 kernel: ACPI: Interpreter enabled May 27 17:19:08.047125 kernel: ACPI: Using GIC for interrupt routing May 27 17:19:08.047142 kernel: ACPI: MCFG table detected, 1 entries May 27 17:19:08.047160 kernel: ACPI: CPU0 has been hot-added May 27 17:19:08.047177 kernel: ACPI: CPU1 has been hot-added May 27 17:19:08.047199 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-0f]) May 27 17:19:08.047503 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 27 17:19:08.050127 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] May 27 17:19:08.050373 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] May 27 17:19:08.050567 kernel: acpi PNP0A08:00: ECAM area [mem 0x20000000-0x20ffffff] reserved by PNP0C02:00 May 27 17:19:08.050759 kernel: acpi PNP0A08:00: ECAM at [mem 0x20000000-0x20ffffff] for [bus 00-0f] May 27 17:19:08.050784 kernel: ACPI: Remapped I/O 0x000000001fff0000 to [io 0x0000-0xffff window] May 27 17:19:08.050814 kernel: acpiphp: Slot [1] registered May 27 17:19:08.050833 kernel: acpiphp: Slot [2] registered May 27 17:19:08.050851 kernel: acpiphp: Slot [3] registered May 27 17:19:08.050868 kernel: acpiphp: Slot [4] registered May 27 17:19:08.050906 kernel: acpiphp: Slot [5] registered May 27 17:19:08.050926 kernel: acpiphp: Slot [6] registered May 27 17:19:08.050945 kernel: acpiphp: Slot [7] registered May 27 17:19:08.050962 kernel: acpiphp: Slot [8] registered May 27 17:19:08.050980 kernel: acpiphp: Slot [9] registered May 27 17:19:08.051002 kernel: acpiphp: Slot [10] registered May 27 17:19:08.051020 kernel: acpiphp: Slot [11] registered May 27 17:19:08.051037 kernel: acpiphp: Slot [12] registered May 27 17:19:08.051055 kernel: acpiphp: Slot [13] registered May 27 17:19:08.051072 kernel: acpiphp: Slot [14] registered May 27 17:19:08.051089 kernel: acpiphp: Slot [15] registered May 27 17:19:08.051106 kernel: acpiphp: Slot [16] registered May 27 17:19:08.051123 kernel: acpiphp: Slot [17] registered May 27 17:19:08.051141 kernel: acpiphp: Slot [18] registered May 27 17:19:08.051158 kernel: acpiphp: Slot [19] registered May 27 17:19:08.051179 kernel: acpiphp: Slot [20] registered May 27 
17:19:08.051197 kernel: acpiphp: Slot [21] registered May 27 17:19:08.051214 kernel: acpiphp: Slot [22] registered May 27 17:19:08.051231 kernel: acpiphp: Slot [23] registered May 27 17:19:08.051248 kernel: acpiphp: Slot [24] registered May 27 17:19:08.051265 kernel: acpiphp: Slot [25] registered May 27 17:19:08.051283 kernel: acpiphp: Slot [26] registered May 27 17:19:08.051300 kernel: acpiphp: Slot [27] registered May 27 17:19:08.051318 kernel: acpiphp: Slot [28] registered May 27 17:19:08.051339 kernel: acpiphp: Slot [29] registered May 27 17:19:08.051356 kernel: acpiphp: Slot [30] registered May 27 17:19:08.051373 kernel: acpiphp: Slot [31] registered May 27 17:19:08.051391 kernel: PCI host bridge to bus 0000:00 May 27 17:19:08.051602 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xffffffff window] May 27 17:19:08.051780 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] May 27 17:19:08.053172 kernel: pci_bus 0000:00: root bus resource [mem 0x400000000000-0x407fffffffff window] May 27 17:19:08.053371 kernel: pci_bus 0000:00: root bus resource [bus 00-0f] May 27 17:19:08.053609 kernel: pci 0000:00:00.0: [1d0f:0200] type 00 class 0x060000 conventional PCI endpoint May 27 17:19:08.053839 kernel: pci 0000:00:01.0: [1d0f:8250] type 00 class 0x070003 conventional PCI endpoint May 27 17:19:08.054079 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80118000-0x80118fff] May 27 17:19:08.054330 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802 PCIe Root Complex Integrated Endpoint May 27 17:19:08.054530 kernel: pci 0000:00:04.0: BAR 0 [mem 0x80114000-0x80117fff] May 27 17:19:08.054725 kernel: pci 0000:00:04.0: PME# supported from D0 D1 D2 D3hot D3cold May 27 17:19:08.058016 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000 PCIe Root Complex Integrated Endpoint May 27 17:19:08.058289 kernel: pci 0000:00:05.0: BAR 0 [mem 0x80110000-0x80113fff] May 27 17:19:08.058493 kernel: pci 0000:00:05.0: BAR 2 [mem 0x80000000-0x800fffff pref] May 27 17:19:08.058692 kernel: pci 0000:00:05.0: BAR 4 [mem 0x80100000-0x8010ffff] May 27 17:19:08.058906 kernel: pci 0000:00:05.0: PME# supported from D0 D1 D2 D3hot D3cold May 27 17:19:08.059113 kernel: pci 0000:00:05.0: BAR 2 [mem 0x80000000-0x800fffff pref]: assigned May 27 17:19:08.059310 kernel: pci 0000:00:05.0: BAR 4 [mem 0x80100000-0x8010ffff]: assigned May 27 17:19:08.059527 kernel: pci 0000:00:04.0: BAR 0 [mem 0x80110000-0x80113fff]: assigned May 27 17:19:08.059726 kernel: pci 0000:00:05.0: BAR 0 [mem 0x80114000-0x80117fff]: assigned May 27 17:19:08.062015 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80118000-0x80118fff]: assigned May 27 17:19:08.062266 kernel: pci_bus 0000:00: resource 4 [mem 0x80000000-0xffffffff window] May 27 17:19:08.062449 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] May 27 17:19:08.062628 kernel: pci_bus 0000:00: resource 6 [mem 0x400000000000-0x407fffffffff window] May 27 17:19:08.062653 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 May 27 17:19:08.062681 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 May 27 17:19:08.062699 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 May 27 17:19:08.062717 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 May 27 17:19:08.062735 kernel: iommu: Default domain type: Translated May 27 17:19:08.062752 kernel: iommu: DMA domain TLB invalidation policy: strict mode May 27 17:19:08.062770 kernel: efivars: Registered efivars operations May 27 17:19:08.062787 kernel: vgaarb: loaded May 27 17:19:08.062804 
kernel: clocksource: Switched to clocksource arch_sys_counter May 27 17:19:08.062822 kernel: VFS: Disk quotas dquot_6.6.0 May 27 17:19:08.062843 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 27 17:19:08.062862 kernel: pnp: PnP ACPI init May 27 17:19:08.063105 kernel: system 00:00: [mem 0x20000000-0x2fffffff] could not be reserved May 27 17:19:08.063132 kernel: pnp: PnP ACPI: found 1 devices May 27 17:19:08.063151 kernel: NET: Registered PF_INET protocol family May 27 17:19:08.063170 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) May 27 17:19:08.063189 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) May 27 17:19:08.063207 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 27 17:19:08.063230 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) May 27 17:19:08.063250 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) May 27 17:19:08.063268 kernel: TCP: Hash tables configured (established 32768 bind 32768) May 27 17:19:08.063285 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) May 27 17:19:08.063303 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) May 27 17:19:08.063321 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 27 17:19:08.063338 kernel: PCI: CLS 0 bytes, default 64 May 27 17:19:08.063356 kernel: kvm [1]: HYP mode not available May 27 17:19:08.063373 kernel: Initialise system trusted keyrings May 27 17:19:08.063395 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 May 27 17:19:08.063412 kernel: Key type asymmetric registered May 27 17:19:08.063429 kernel: Asymmetric key parser 'x509' registered May 27 17:19:08.063447 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) May 27 17:19:08.063464 kernel: io scheduler mq-deadline registered May 27 17:19:08.063482 kernel: io scheduler kyber registered May 27 17:19:08.063499 kernel: io scheduler bfq registered May 27 17:19:08.063710 kernel: pl061_gpio ARMH0061:00: PL061 GPIO chip registered May 27 17:19:08.063737 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 May 27 17:19:08.063759 kernel: ACPI: button: Power Button [PWRB] May 27 17:19:08.063777 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input1 May 27 17:19:08.063795 kernel: ACPI: button: Sleep Button [SLPB] May 27 17:19:08.063813 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 27 17:19:08.063831 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 May 27 17:19:08.066123 kernel: serial 0000:00:01.0: enabling device (0010 -> 0012) May 27 17:19:08.066186 kernel: printk: legacy console [ttyS0] disabled May 27 17:19:08.066208 kernel: 0000:00:01.0: ttyS0 at MMIO 0x80118000 (irq = 14, base_baud = 115200) is a 16550A May 27 17:19:08.066237 kernel: printk: legacy console [ttyS0] enabled May 27 17:19:08.066256 kernel: printk: legacy bootconsole [uart0] disabled May 27 17:19:08.066276 kernel: thunder_xcv, ver 1.0 May 27 17:19:08.066296 kernel: thunder_bgx, ver 1.0 May 27 17:19:08.066315 kernel: nicpf, ver 1.0 May 27 17:19:08.066333 kernel: nicvf, ver 1.0 May 27 17:19:08.066592 kernel: rtc-efi rtc-efi.0: registered as rtc0 May 27 17:19:08.066806 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-05-27T17:19:07 UTC (1748366347) May 27 17:19:08.066835 kernel: hid: raw HID events driver (C) Jiri Kosina May 27 17:19:08.066863 
kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 3 (0,80000003) counters available May 27 17:19:08.066916 kernel: NET: Registered PF_INET6 protocol family May 27 17:19:08.066940 kernel: watchdog: NMI not fully supported May 27 17:19:08.066958 kernel: watchdog: Hard watchdog permanently disabled May 27 17:19:08.066976 kernel: Segment Routing with IPv6 May 27 17:19:08.066994 kernel: In-situ OAM (IOAM) with IPv6 May 27 17:19:08.067012 kernel: NET: Registered PF_PACKET protocol family May 27 17:19:08.067030 kernel: Key type dns_resolver registered May 27 17:19:08.067049 kernel: registered taskstats version 1 May 27 17:19:08.067073 kernel: Loading compiled-in X.509 certificates May 27 17:19:08.067091 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.30-flatcar: 8e5e45c34fa91568ef1fa3bdfd5a71a43b4c4580' May 27 17:19:08.067110 kernel: Demotion targets for Node 0: null May 27 17:19:08.067128 kernel: Key type .fscrypt registered May 27 17:19:08.067145 kernel: Key type fscrypt-provisioning registered May 27 17:19:08.067162 kernel: ima: No TPM chip found, activating TPM-bypass! May 27 17:19:08.067180 kernel: ima: Allocated hash algorithm: sha1 May 27 17:19:08.067198 kernel: ima: No architecture policies found May 27 17:19:08.067216 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) May 27 17:19:08.067239 kernel: clk: Disabling unused clocks May 27 17:19:08.067256 kernel: PM: genpd: Disabling unused power domains May 27 17:19:08.067274 kernel: Warning: unable to open an initial console. May 27 17:19:08.067292 kernel: Freeing unused kernel memory: 39424K May 27 17:19:08.067309 kernel: Run /init as init process May 27 17:19:08.067327 kernel: with arguments: May 27 17:19:08.067344 kernel: /init May 27 17:19:08.067362 kernel: with environment: May 27 17:19:08.067379 kernel: HOME=/ May 27 17:19:08.067400 kernel: TERM=linux May 27 17:19:08.067419 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 27 17:19:08.067439 systemd[1]: Successfully made /usr/ read-only. May 27 17:19:08.067463 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 27 17:19:08.067485 systemd[1]: Detected virtualization amazon. May 27 17:19:08.067504 systemd[1]: Detected architecture arm64. May 27 17:19:08.067522 systemd[1]: Running in initrd. May 27 17:19:08.067546 systemd[1]: No hostname configured, using default hostname. May 27 17:19:08.067566 systemd[1]: Hostname set to . May 27 17:19:08.067584 systemd[1]: Initializing machine ID from VM UUID. May 27 17:19:08.067603 systemd[1]: Queued start job for default target initrd.target. May 27 17:19:08.067622 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 17:19:08.067641 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 17:19:08.067662 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 27 17:19:08.067682 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 27 17:19:08.067706 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... 
May 27 17:19:08.067728 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 27 17:19:08.067750 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 27 17:19:08.067769 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 27 17:19:08.067788 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 17:19:08.067808 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 27 17:19:08.067827 systemd[1]: Reached target paths.target - Path Units. May 27 17:19:08.067850 systemd[1]: Reached target slices.target - Slice Units. May 27 17:19:08.067870 systemd[1]: Reached target swap.target - Swaps. May 27 17:19:08.069955 systemd[1]: Reached target timers.target - Timer Units. May 27 17:19:08.069987 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 27 17:19:08.070008 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 27 17:19:08.070028 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 27 17:19:08.070048 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 27 17:19:08.070068 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 27 17:19:08.070087 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 27 17:19:08.070117 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 27 17:19:08.070136 systemd[1]: Reached target sockets.target - Socket Units. May 27 17:19:08.070174 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 27 17:19:08.070197 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 27 17:19:08.070217 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 27 17:19:08.070237 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). May 27 17:19:08.070258 systemd[1]: Starting systemd-fsck-usr.service... May 27 17:19:08.070277 systemd[1]: Starting systemd-journald.service - Journal Service... May 27 17:19:08.070303 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 27 17:19:08.070323 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 17:19:08.070342 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 27 17:19:08.070413 systemd-journald[257]: Collecting audit messages is disabled. May 27 17:19:08.070461 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 27 17:19:08.070482 systemd[1]: Finished systemd-fsck-usr.service. May 27 17:19:08.070503 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 27 17:19:08.070522 systemd-journald[257]: Journal started May 27 17:19:08.070563 systemd-journald[257]: Runtime Journal (/run/log/journal/ec287a0179ec021a8cbbc0d9bd35d2d0) is 8M, max 75.3M, 67.3M free. May 27 17:19:08.079291 systemd[1]: Started systemd-journald.service - Journal Service. May 27 17:19:08.080390 systemd-modules-load[259]: Inserted module 'overlay' May 27 17:19:08.098830 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
May 27 17:19:08.112679 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 27 17:19:08.115033 systemd-modules-load[259]: Inserted module 'br_netfilter' May 27 17:19:08.119364 kernel: Bridge firewalling registered May 27 17:19:08.122693 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 27 17:19:08.133425 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:19:08.139182 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 17:19:08.149154 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 27 17:19:08.151953 systemd-tmpfiles[270]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. May 27 17:19:08.177143 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 27 17:19:08.181903 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 27 17:19:08.185384 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 17:19:08.209808 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 27 17:19:08.219244 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 27 17:19:08.228665 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 27 17:19:08.235586 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 27 17:19:08.242997 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 17:19:08.284321 dracut-cmdline[295]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=4e706b869299e1c88703222069cdfa08c45ebce568f762053eea5b3f5f0939c3 May 27 17:19:08.327551 systemd-resolved[298]: Positive Trust Anchors: May 27 17:19:08.327960 systemd-resolved[298]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 27 17:19:08.328024 systemd-resolved[298]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 27 17:19:08.437924 kernel: SCSI subsystem initialized May 27 17:19:08.444915 kernel: Loading iSCSI transport class v2.0-870. May 27 17:19:08.458035 kernel: iscsi: registered transport (tcp) May 27 17:19:08.478944 kernel: iscsi: registered transport (qla4xxx) May 27 17:19:08.479025 kernel: QLogic iSCSI HBA Driver May 27 17:19:08.511049 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... 
May 27 17:19:08.544977 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 27 17:19:08.555593 systemd[1]: Reached target network-pre.target - Preparation for Network. May 27 17:19:08.590900 kernel: random: crng init done May 27 17:19:08.589187 systemd-resolved[298]: Defaulting to hostname 'linux'. May 27 17:19:08.590969 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 27 17:19:08.594928 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 27 17:19:08.636273 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 27 17:19:08.642109 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 27 17:19:08.739944 kernel: raid6: neonx8 gen() 6495 MB/s May 27 17:19:08.756918 kernel: raid6: neonx4 gen() 6530 MB/s May 27 17:19:08.773916 kernel: raid6: neonx2 gen() 5446 MB/s May 27 17:19:08.790913 kernel: raid6: neonx1 gen() 3943 MB/s May 27 17:19:08.807913 kernel: raid6: int64x8 gen() 3628 MB/s May 27 17:19:08.824912 kernel: raid6: int64x4 gen() 3720 MB/s May 27 17:19:08.841913 kernel: raid6: int64x2 gen() 3597 MB/s May 27 17:19:08.859747 kernel: raid6: int64x1 gen() 2767 MB/s May 27 17:19:08.859792 kernel: raid6: using algorithm neonx4 gen() 6530 MB/s May 27 17:19:08.877732 kernel: raid6: .... xor() 4918 MB/s, rmw enabled May 27 17:19:08.877767 kernel: raid6: using neon recovery algorithm May 27 17:19:08.884915 kernel: xor: measuring software checksum speed May 27 17:19:08.885913 kernel: 8regs : 11935 MB/sec May 27 17:19:08.888191 kernel: 32regs : 11981 MB/sec May 27 17:19:08.888222 kernel: arm64_neon : 8679 MB/sec May 27 17:19:08.888246 kernel: xor: using function: 32regs (11981 MB/sec) May 27 17:19:08.981189 kernel: Btrfs loaded, zoned=no, fsverity=no May 27 17:19:08.991972 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 27 17:19:08.999476 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 17:19:09.048423 systemd-udevd[508]: Using default interface naming scheme 'v255'. May 27 17:19:09.060175 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 17:19:09.066179 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 27 17:19:09.100989 dracut-pre-trigger[510]: rd.md=0: removing MD RAID activation May 27 17:19:09.144447 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 27 17:19:09.150349 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 27 17:19:09.299920 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 27 17:19:09.323362 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 27 17:19:09.462293 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 May 27 17:19:09.462376 kernel: ena 0000:00:05.0: enabling device (0010 -> 0012) May 27 17:19:09.469143 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 May 27 17:19:09.469215 kernel: ena 0000:00:05.0: ENA device version: 0.10 May 27 17:19:09.469529 kernel: nvme nvme0: pci function 0000:00:04.0 May 27 17:19:09.469759 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1 May 27 17:19:09.476927 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem 80114000, mac addr 06:db:94:f8:69:d5 May 27 17:19:09.481919 kernel: nvme nvme0: 2/0/0 default/read/poll queues May 27 17:19:09.490920 kernel: GPT:Primary header thinks Alt. 
header is not at the end of the disk. May 27 17:19:09.490981 kernel: GPT:9289727 != 16777215 May 27 17:19:09.491006 kernel: GPT:Alternate GPT header not at the end of the disk. May 27 17:19:09.493604 kernel: GPT:9289727 != 16777215 May 27 17:19:09.493658 kernel: GPT: Use GNU Parted to correct GPT errors. May 27 17:19:09.495454 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 17:19:09.507269 (udev-worker)[578]: Network interface NamePolicy= disabled on kernel command line. May 27 17:19:09.516499 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 17:19:09.516754 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:19:09.522721 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 27 17:19:09.532174 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 17:19:09.539673 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 27 17:19:09.571893 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:19:09.597936 kernel: nvme nvme0: using unchecked data buffer May 27 17:19:09.698079 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Amazon Elastic Block Store EFI-SYSTEM. May 27 17:19:09.800689 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. May 27 17:19:09.823521 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Amazon Elastic Block Store USR-A. May 27 17:19:09.823957 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Amazon Elastic Block Store USR-A. May 27 17:19:09.825953 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 27 17:19:09.848951 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Amazon Elastic Block Store ROOT. May 27 17:19:09.857939 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 27 17:19:09.865058 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 17:19:09.867474 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 27 17:19:09.874490 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 27 17:19:09.884476 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 27 17:19:09.910268 disk-uuid[686]: Primary Header is updated. May 27 17:19:09.910268 disk-uuid[686]: Secondary Entries is updated. May 27 17:19:09.910268 disk-uuid[686]: Secondary Header is updated. May 27 17:19:09.923269 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 17:19:09.930915 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 17:19:09.932154 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 27 17:19:10.934311 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 17:19:10.935976 disk-uuid[688]: The operation has completed successfully. May 27 17:19:11.124972 systemd[1]: disk-uuid.service: Deactivated successfully. May 27 17:19:11.126943 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 27 17:19:11.202477 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 27 17:19:11.233466 sh[952]: Success May 27 17:19:11.254991 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
May 27 17:19:11.255076 kernel: device-mapper: uevent: version 1.0.3 May 27 17:19:11.256842 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 27 17:19:11.270925 kernel: device-mapper: verity: sha256 using shash "sha256-ce" May 27 17:19:11.367436 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 27 17:19:11.370810 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 27 17:19:11.396659 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. May 27 17:19:11.417938 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 27 17:19:11.420976 kernel: BTRFS: device fsid 3c8c76ef-f1da-40fe-979d-11bdf765e403 devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (976) May 27 17:19:11.422907 kernel: BTRFS info (device dm-0): first mount of filesystem 3c8c76ef-f1da-40fe-979d-11bdf765e403 May 27 17:19:11.422947 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm May 27 17:19:11.425450 kernel: BTRFS info (device dm-0): using free-space-tree May 27 17:19:11.689201 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 27 17:19:11.693017 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 27 17:19:11.697483 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 27 17:19:11.702290 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 27 17:19:11.708128 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 27 17:19:11.762956 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 (259:5) scanned by mount (1011) May 27 17:19:11.767741 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 0631e8fb-ef71-4ba1-b2b8-88386996a754 May 27 17:19:11.767810 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm May 27 17:19:11.767836 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 27 17:19:11.788019 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 0631e8fb-ef71-4ba1-b2b8-88386996a754 May 27 17:19:11.790613 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 27 17:19:11.797179 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 27 17:19:11.879867 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 27 17:19:11.887169 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 27 17:19:11.953227 systemd-networkd[1145]: lo: Link UP May 27 17:19:11.953239 systemd-networkd[1145]: lo: Gained carrier May 27 17:19:11.958092 systemd-networkd[1145]: Enumeration completed May 27 17:19:11.959202 systemd-networkd[1145]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:19:11.959209 systemd-networkd[1145]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 17:19:11.959967 systemd[1]: Started systemd-networkd.service - Network Configuration. May 27 17:19:11.963651 systemd[1]: Reached target network.target - Network. 
May 27 17:19:11.974681 systemd-networkd[1145]: eth0: Link UP May 27 17:19:11.974695 systemd-networkd[1145]: eth0: Gained carrier May 27 17:19:11.974718 systemd-networkd[1145]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:19:11.994991 systemd-networkd[1145]: eth0: DHCPv4 address 172.31.16.30/20, gateway 172.31.16.1 acquired from 172.31.16.1 May 27 17:19:12.394282 ignition[1072]: Ignition 2.21.0 May 27 17:19:12.394304 ignition[1072]: Stage: fetch-offline May 27 17:19:12.394681 ignition[1072]: no configs at "/usr/lib/ignition/base.d" May 27 17:19:12.394703 ignition[1072]: no config dir at "/usr/lib/ignition/base.platform.d/aws" May 27 17:19:12.395673 ignition[1072]: Ignition finished successfully May 27 17:19:12.405558 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 27 17:19:12.411396 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... May 27 17:19:12.449084 ignition[1158]: Ignition 2.21.0 May 27 17:19:12.449115 ignition[1158]: Stage: fetch May 27 17:19:12.450616 ignition[1158]: no configs at "/usr/lib/ignition/base.d" May 27 17:19:12.450643 ignition[1158]: no config dir at "/usr/lib/ignition/base.platform.d/aws" May 27 17:19:12.451201 ignition[1158]: PUT http://169.254.169.254/latest/api/token: attempt #1 May 27 17:19:12.464005 ignition[1158]: PUT result: OK May 27 17:19:12.468327 ignition[1158]: parsed url from cmdline: "" May 27 17:19:12.468371 ignition[1158]: no config URL provided May 27 17:19:12.468395 ignition[1158]: reading system config file "/usr/lib/ignition/user.ign" May 27 17:19:12.468444 ignition[1158]: no config at "/usr/lib/ignition/user.ign" May 27 17:19:12.468486 ignition[1158]: PUT http://169.254.169.254/latest/api/token: attempt #1 May 27 17:19:12.479201 ignition[1158]: PUT result: OK May 27 17:19:12.481227 ignition[1158]: GET http://169.254.169.254/2019-10-01/user-data: attempt #1 May 27 17:19:12.484088 ignition[1158]: GET result: OK May 27 17:19:12.485547 ignition[1158]: parsing config with SHA512: 1f3ba3b22ed371fe1d5b5e91c563cacc55315f428281646d0e5363898a8dfa2aeaabdf7fcbb37437b35d86a784cb17866d89c0195fed669be6fd4f1850e8579c May 27 17:19:12.493003 unknown[1158]: fetched base config from "system" May 27 17:19:12.493436 unknown[1158]: fetched base config from "system" May 27 17:19:12.494059 ignition[1158]: fetch: fetch complete May 27 17:19:12.493450 unknown[1158]: fetched user config from "aws" May 27 17:19:12.494071 ignition[1158]: fetch: fetch passed May 27 17:19:12.494176 ignition[1158]: Ignition finished successfully May 27 17:19:12.505270 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 27 17:19:12.511109 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 27 17:19:12.548939 ignition[1165]: Ignition 2.21.0 May 27 17:19:12.549459 ignition[1165]: Stage: kargs May 27 17:19:12.550000 ignition[1165]: no configs at "/usr/lib/ignition/base.d" May 27 17:19:12.550023 ignition[1165]: no config dir at "/usr/lib/ignition/base.platform.d/aws" May 27 17:19:12.550185 ignition[1165]: PUT http://169.254.169.254/latest/api/token: attempt #1 May 27 17:19:12.553103 ignition[1165]: PUT result: OK May 27 17:19:12.563793 ignition[1165]: kargs: kargs passed May 27 17:19:12.565153 ignition[1165]: Ignition finished successfully May 27 17:19:12.569979 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 27 17:19:12.575236 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
May 27 17:19:12.620175 ignition[1172]: Ignition 2.21.0 May 27 17:19:12.620654 ignition[1172]: Stage: disks May 27 17:19:12.621218 ignition[1172]: no configs at "/usr/lib/ignition/base.d" May 27 17:19:12.621241 ignition[1172]: no config dir at "/usr/lib/ignition/base.platform.d/aws" May 27 17:19:12.621418 ignition[1172]: PUT http://169.254.169.254/latest/api/token: attempt #1 May 27 17:19:12.625413 ignition[1172]: PUT result: OK May 27 17:19:12.633663 ignition[1172]: disks: disks passed May 27 17:19:12.633776 ignition[1172]: Ignition finished successfully May 27 17:19:12.638672 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 27 17:19:12.642832 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 27 17:19:12.647083 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 27 17:19:12.649325 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 17:19:12.653109 systemd[1]: Reached target sysinit.target - System Initialization. May 27 17:19:12.656555 systemd[1]: Reached target basic.target - Basic System. May 27 17:19:12.663096 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 27 17:19:12.731520 systemd-fsck[1180]: ROOT: clean, 15/553520 files, 52789/553472 blocks May 27 17:19:12.739932 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 27 17:19:12.747463 systemd[1]: Mounting sysroot.mount - /sysroot... May 27 17:19:12.902918 kernel: EXT4-fs (nvme0n1p9): mounted filesystem a5483afc-8426-4c3e-85ef-8146f9077e7d r/w with ordered data mode. Quota mode: none. May 27 17:19:12.904110 systemd[1]: Mounted sysroot.mount - /sysroot. May 27 17:19:12.906641 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 27 17:19:12.911445 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 17:19:12.927005 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 27 17:19:12.931279 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. May 27 17:19:12.933759 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 27 17:19:12.933812 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 27 17:19:12.955921 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 (259:5) scanned by mount (1199) May 27 17:19:12.959800 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 0631e8fb-ef71-4ba1-b2b8-88386996a754 May 27 17:19:12.959839 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm May 27 17:19:12.959865 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 27 17:19:12.969786 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 27 17:19:12.971121 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 27 17:19:12.974671 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
May 27 17:19:13.466047 systemd-networkd[1145]: eth0: Gained IPv6LL May 27 17:19:13.552655 initrd-setup-root[1223]: cut: /sysroot/etc/passwd: No such file or directory May 27 17:19:13.562384 initrd-setup-root[1230]: cut: /sysroot/etc/group: No such file or directory May 27 17:19:13.570298 initrd-setup-root[1237]: cut: /sysroot/etc/shadow: No such file or directory May 27 17:19:13.577974 initrd-setup-root[1244]: cut: /sysroot/etc/gshadow: No such file or directory May 27 17:19:14.018227 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 27 17:19:14.024234 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 27 17:19:14.028534 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 27 17:19:14.054114 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 27 17:19:14.056576 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 0631e8fb-ef71-4ba1-b2b8-88386996a754 May 27 17:19:14.085978 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 27 17:19:14.102918 ignition[1312]: INFO : Ignition 2.21.0 May 27 17:19:14.102918 ignition[1312]: INFO : Stage: mount May 27 17:19:14.106450 ignition[1312]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 17:19:14.106450 ignition[1312]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" May 27 17:19:14.106450 ignition[1312]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 May 27 17:19:14.113334 ignition[1312]: INFO : PUT result: OK May 27 17:19:14.117064 ignition[1312]: INFO : mount: mount passed May 27 17:19:14.118743 ignition[1312]: INFO : Ignition finished successfully May 27 17:19:14.123940 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 27 17:19:14.129584 systemd[1]: Starting ignition-files.service - Ignition (files)... May 27 17:19:14.160502 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 17:19:14.205920 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 (259:5) scanned by mount (1324) May 27 17:19:14.210108 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 0631e8fb-ef71-4ba1-b2b8-88386996a754 May 27 17:19:14.210163 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm May 27 17:19:14.211347 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 27 17:19:14.220154 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 27 17:19:14.259768 ignition[1341]: INFO : Ignition 2.21.0 May 27 17:19:14.259768 ignition[1341]: INFO : Stage: files May 27 17:19:14.263028 ignition[1341]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 17:19:14.263028 ignition[1341]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" May 27 17:19:14.263028 ignition[1341]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 May 27 17:19:14.270263 ignition[1341]: INFO : PUT result: OK May 27 17:19:14.275223 ignition[1341]: DEBUG : files: compiled without relabeling support, skipping May 27 17:19:14.278921 ignition[1341]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 27 17:19:14.278921 ignition[1341]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 27 17:19:14.288238 ignition[1341]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 27 17:19:14.291262 ignition[1341]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 27 17:19:14.294119 unknown[1341]: wrote ssh authorized keys file for user: core May 27 17:19:14.296517 ignition[1341]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 27 17:19:14.302907 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" May 27 17:19:14.302907 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 May 27 17:19:14.389666 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 27 17:19:14.910557 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" May 27 17:19:14.910557 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 27 17:19:14.917374 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 27 17:19:14.917374 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 27 17:19:14.917374 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 27 17:19:14.917374 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 17:19:14.917374 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 17:19:14.917374 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 17:19:14.917374 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 17:19:14.940119 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 27 17:19:14.940119 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 27 17:19:14.940119 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" May 27 17:19:14.953522 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" May 27 17:19:14.953522 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" May 27 17:19:14.953522 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1 May 27 17:19:15.688266 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 27 17:19:16.058247 ignition[1341]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" May 27 17:19:16.058247 ignition[1341]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 27 17:19:16.064743 ignition[1341]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 17:19:16.072702 ignition[1341]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 17:19:16.072702 ignition[1341]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 27 17:19:16.072702 ignition[1341]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 27 17:19:16.072702 ignition[1341]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 27 17:19:16.072702 ignition[1341]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 27 17:19:16.072702 ignition[1341]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 27 17:19:16.072702 ignition[1341]: INFO : files: files passed May 27 17:19:16.072702 ignition[1341]: INFO : Ignition finished successfully May 27 17:19:16.075866 systemd[1]: Finished ignition-files.service - Ignition (files). May 27 17:19:16.097434 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 27 17:19:16.102579 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 27 17:19:16.126317 systemd[1]: ignition-quench.service: Deactivated successfully. May 27 17:19:16.126561 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 27 17:19:16.139725 initrd-setup-root-after-ignition[1370]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 27 17:19:16.143060 initrd-setup-root-after-ignition[1370]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 27 17:19:16.146746 initrd-setup-root-after-ignition[1374]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 27 17:19:16.149752 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 27 17:19:16.157463 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 27 17:19:16.160718 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... 
May 27 17:19:16.241663 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 27 17:19:16.242870 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 27 17:19:16.247287 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. May 27 17:19:16.252205 systemd[1]: Reached target initrd.target - Initrd Default Target. May 27 17:19:16.254312 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 27 17:19:16.262494 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 27 17:19:16.300190 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 27 17:19:16.307108 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 27 17:19:16.361285 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 27 17:19:16.365948 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 17:19:16.368076 systemd[1]: Stopped target timers.target - Timer Units. May 27 17:19:16.368400 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 27 17:19:16.368702 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 27 17:19:16.369928 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 27 17:19:16.370610 systemd[1]: Stopped target basic.target - Basic System. May 27 17:19:16.370943 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 27 17:19:16.371550 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 27 17:19:16.371860 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 27 17:19:16.372466 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. May 27 17:19:16.372790 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 27 17:19:16.373105 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 27 17:19:16.373726 systemd[1]: Stopped target sysinit.target - System Initialization. May 27 17:19:16.374346 systemd[1]: Stopped target local-fs.target - Local File Systems. May 27 17:19:16.374960 systemd[1]: Stopped target swap.target - Swaps. May 27 17:19:16.375145 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 27 17:19:16.375353 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 27 17:19:16.376396 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 27 17:19:16.376815 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 17:19:16.377318 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 27 17:19:16.397388 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 17:19:16.401944 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 27 17:19:16.402526 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 27 17:19:16.410006 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 27 17:19:16.410538 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 27 17:19:16.413950 systemd[1]: ignition-files.service: Deactivated successfully. May 27 17:19:16.414425 systemd[1]: Stopped ignition-files.service - Ignition (files). 
May 27 17:19:16.421770 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 27 17:19:16.437465 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 27 17:19:16.449095 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 27 17:19:16.449461 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 27 17:19:16.471460 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 27 17:19:16.471687 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 27 17:19:16.505988 ignition[1394]: INFO : Ignition 2.21.0 May 27 17:19:16.508036 ignition[1394]: INFO : Stage: umount May 27 17:19:16.509983 ignition[1394]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 17:19:16.509983 ignition[1394]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" May 27 17:19:16.509872 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 27 17:19:16.519177 ignition[1394]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 May 27 17:19:16.513351 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 27 17:19:16.525078 ignition[1394]: INFO : PUT result: OK May 27 17:19:16.536016 ignition[1394]: INFO : umount: umount passed May 27 17:19:16.537835 ignition[1394]: INFO : Ignition finished successfully May 27 17:19:16.542371 systemd[1]: ignition-mount.service: Deactivated successfully. May 27 17:19:16.542712 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 27 17:19:16.549584 systemd[1]: ignition-disks.service: Deactivated successfully. May 27 17:19:16.549677 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 27 17:19:16.552014 systemd[1]: ignition-kargs.service: Deactivated successfully. May 27 17:19:16.552105 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 27 17:19:16.554097 systemd[1]: ignition-fetch.service: Deactivated successfully. May 27 17:19:16.554193 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 27 17:19:16.556419 systemd[1]: Stopped target network.target - Network. May 27 17:19:16.558615 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 27 17:19:16.558711 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 27 17:19:16.562811 systemd[1]: Stopped target paths.target - Path Units. May 27 17:19:16.565013 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 27 17:19:16.572867 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 17:19:16.580006 systemd[1]: Stopped target slices.target - Slice Units. May 27 17:19:16.584378 systemd[1]: Stopped target sockets.target - Socket Units. May 27 17:19:16.588183 systemd[1]: iscsid.socket: Deactivated successfully. May 27 17:19:16.588272 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 27 17:19:16.590371 systemd[1]: iscsiuio.socket: Deactivated successfully. May 27 17:19:16.590444 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 27 17:19:16.592844 systemd[1]: ignition-setup.service: Deactivated successfully. May 27 17:19:16.592977 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 27 17:19:16.594988 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 27 17:19:16.619280 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. 
May 27 17:19:16.621546 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 27 17:19:16.624065 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 27 17:19:16.629677 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 27 17:19:16.630904 systemd[1]: sysroot-boot.service: Deactivated successfully. May 27 17:19:16.631103 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 27 17:19:16.636655 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 27 17:19:16.636808 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 27 17:19:16.652179 systemd[1]: systemd-networkd.service: Deactivated successfully. May 27 17:19:16.652574 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 27 17:19:16.661145 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 27 17:19:16.661707 systemd[1]: systemd-resolved.service: Deactivated successfully. May 27 17:19:16.661940 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 27 17:19:16.673255 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 27 17:19:16.676140 systemd[1]: Stopped target network-pre.target - Preparation for Network. May 27 17:19:16.678517 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 27 17:19:16.680324 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 27 17:19:16.683938 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 27 17:19:16.685991 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 27 17:19:16.686101 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 27 17:19:16.688609 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 27 17:19:16.688720 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 27 17:19:16.692744 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 27 17:19:16.692839 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 27 17:19:16.698304 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 27 17:19:16.698395 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 17:19:16.708427 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 17:19:16.726717 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 27 17:19:16.726845 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 27 17:19:16.740369 systemd[1]: systemd-udevd.service: Deactivated successfully. May 27 17:19:16.743095 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 17:19:16.747517 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 27 17:19:16.747617 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 27 17:19:16.757072 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 27 17:19:16.757146 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 27 17:19:16.759421 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 27 17:19:16.759509 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 27 17:19:16.772064 systemd[1]: dracut-cmdline.service: Deactivated successfully. 
May 27 17:19:16.772175 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 27 17:19:16.777489 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 27 17:19:16.777598 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 27 17:19:16.787169 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 27 17:19:16.790913 systemd[1]: systemd-network-generator.service: Deactivated successfully. May 27 17:19:16.791435 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. May 27 17:19:16.805089 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 27 17:19:16.805193 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 17:19:16.812295 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. May 27 17:19:16.812381 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 17:19:16.819806 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 27 17:19:16.819918 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 27 17:19:16.822770 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 17:19:16.822847 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:19:16.834200 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. May 27 17:19:16.834358 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. May 27 17:19:16.834443 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. May 27 17:19:16.834534 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 27 17:19:16.835377 systemd[1]: network-cleanup.service: Deactivated successfully. May 27 17:19:16.844200 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 27 17:19:16.855512 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 27 17:19:16.855688 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 27 17:19:16.864375 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 27 17:19:16.872825 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 27 17:19:16.904646 systemd[1]: Switching root. May 27 17:19:17.041181 systemd-journald[257]: Journal stopped May 27 17:19:19.052168 systemd-journald[257]: Received SIGTERM from PID 1 (systemd). 
May 27 17:19:19.052295 kernel: SELinux: policy capability network_peer_controls=1 May 27 17:19:19.052336 kernel: SELinux: policy capability open_perms=1 May 27 17:19:19.052366 kernel: SELinux: policy capability extended_socket_class=1 May 27 17:19:19.052402 kernel: SELinux: policy capability always_check_network=0 May 27 17:19:19.052431 kernel: SELinux: policy capability cgroup_seclabel=1 May 27 17:19:19.052466 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 27 17:19:19.052495 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 27 17:19:19.052523 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 27 17:19:19.052558 kernel: SELinux: policy capability userspace_initial_context=0 May 27 17:19:19.052585 kernel: audit: type=1403 audit(1748366357.376:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 27 17:19:19.052623 systemd[1]: Successfully loaded SELinux policy in 61.674ms. May 27 17:19:19.052670 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 23.658ms. May 27 17:19:19.052703 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 27 17:19:19.052734 systemd[1]: Detected virtualization amazon. May 27 17:19:19.052761 systemd[1]: Detected architecture arm64. May 27 17:19:19.052788 systemd[1]: Detected first boot. May 27 17:19:19.052823 systemd[1]: Initializing machine ID from VM UUID. May 27 17:19:19.052853 zram_generator::config[1438]: No configuration found. May 27 17:19:19.052906 kernel: NET: Registered PF_VSOCK protocol family May 27 17:19:19.052942 systemd[1]: Populated /etc with preset unit settings. May 27 17:19:19.052976 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 27 17:19:19.053007 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 27 17:19:19.053036 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 27 17:19:19.053064 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 27 17:19:19.053093 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 27 17:19:19.053127 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 27 17:19:19.053160 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 27 17:19:19.053190 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 27 17:19:19.053217 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 27 17:19:19.053245 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 27 17:19:19.053275 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 27 17:19:19.053302 systemd[1]: Created slice user.slice - User and Session Slice. May 27 17:19:19.053334 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 17:19:19.053368 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 17:19:19.053396 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
May 27 17:19:19.053425 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 27 17:19:19.053453 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 27 17:19:19.053485 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 27 17:19:19.053514 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... May 27 17:19:19.053542 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 17:19:19.053572 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 27 17:19:19.053604 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 27 17:19:19.053633 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 27 17:19:19.053660 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 27 17:19:19.053688 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 27 17:19:19.053715 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 17:19:19.053745 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 27 17:19:19.053776 systemd[1]: Reached target slices.target - Slice Units. May 27 17:19:19.053814 systemd[1]: Reached target swap.target - Swaps. May 27 17:19:19.053842 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 27 17:19:19.053873 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 27 17:19:19.055986 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 27 17:19:19.056029 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 27 17:19:19.056058 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 27 17:19:19.056451 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 27 17:19:19.056803 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 27 17:19:19.056877 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 27 17:19:19.056962 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 27 17:19:19.056993 systemd[1]: Mounting media.mount - External Media Directory... May 27 17:19:19.057030 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 27 17:19:19.057058 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 27 17:19:19.057086 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 27 17:19:19.057114 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 27 17:19:19.057142 systemd[1]: Reached target machines.target - Containers. May 27 17:19:19.057172 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... May 27 17:19:19.057202 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 17:19:19.057229 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 27 17:19:19.057257 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 27 17:19:19.057288 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
May 27 17:19:19.057316 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 27 17:19:19.057345 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 27 17:19:19.057372 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 27 17:19:19.057403 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 27 17:19:19.057434 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 27 17:19:19.057462 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 27 17:19:19.057490 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 27 17:19:19.057522 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 27 17:19:19.057550 systemd[1]: Stopped systemd-fsck-usr.service. May 27 17:19:19.057579 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 17:19:19.057608 systemd[1]: Starting systemd-journald.service - Journal Service... May 27 17:19:19.057636 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 27 17:19:19.057665 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 27 17:19:19.057695 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 27 17:19:19.057724 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 27 17:19:19.057752 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 27 17:19:19.057786 systemd[1]: verity-setup.service: Deactivated successfully. May 27 17:19:19.057817 systemd[1]: Stopped verity-setup.service. May 27 17:19:19.057848 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 27 17:19:19.059919 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 27 17:19:19.059965 systemd[1]: Mounted media.mount - External Media Directory. May 27 17:19:19.059998 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 27 17:19:19.060026 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 27 17:19:19.060053 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 27 17:19:19.060080 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 27 17:19:19.060110 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 27 17:19:19.060146 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 27 17:19:19.060174 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 27 17:19:19.060202 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 27 17:19:19.060232 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 27 17:19:19.060262 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 27 17:19:19.060293 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 27 17:19:19.060331 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 27 17:19:19.060361 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
May 27 17:19:19.060389 kernel: loop: module loaded May 27 17:19:19.060422 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 27 17:19:19.060450 systemd[1]: modprobe@loop.service: Deactivated successfully. May 27 17:19:19.060478 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 27 17:19:19.060506 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 27 17:19:19.060534 kernel: fuse: init (API version 7.41) May 27 17:19:19.060560 kernel: ACPI: bus type drm_connector registered May 27 17:19:19.060586 systemd[1]: modprobe@drm.service: Deactivated successfully. May 27 17:19:19.060662 systemd-journald[1521]: Collecting audit messages is disabled. May 27 17:19:19.060715 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 27 17:19:19.060746 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 27 17:19:19.060774 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 27 17:19:19.060802 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 27 17:19:19.060832 systemd-journald[1521]: Journal started May 27 17:19:19.060900 systemd-journald[1521]: Runtime Journal (/run/log/journal/ec287a0179ec021a8cbbc0d9bd35d2d0) is 8M, max 75.3M, 67.3M free. May 27 17:19:18.406179 systemd[1]: Queued start job for default target multi-user.target. May 27 17:19:18.423082 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. May 27 17:19:18.423866 systemd[1]: systemd-journald.service: Deactivated successfully. May 27 17:19:19.069006 systemd[1]: Started systemd-journald.service - Journal Service. May 27 17:19:19.069230 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 27 17:19:19.072604 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 27 17:19:19.083730 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 27 17:19:19.116839 systemd[1]: Reached target network-pre.target - Preparation for Network. May 27 17:19:19.120236 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 27 17:19:19.120297 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 17:19:19.137461 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 27 17:19:19.143396 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 27 17:19:19.146282 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 17:19:19.150249 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 27 17:19:19.159118 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 27 17:19:19.163208 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 27 17:19:19.168323 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 27 17:19:19.170582 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 27 17:19:19.171962 systemd-tmpfiles[1541]: ACLs are not supported, ignoring. 
May 27 17:19:19.171992 systemd-tmpfiles[1541]: ACLs are not supported, ignoring. May 27 17:19:19.175110 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 27 17:19:19.183779 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 27 17:19:19.211027 systemd-journald[1521]: Time spent on flushing to /var/log/journal/ec287a0179ec021a8cbbc0d9bd35d2d0 is 68.301ms for 931 entries. May 27 17:19:19.211027 systemd-journald[1521]: System Journal (/var/log/journal/ec287a0179ec021a8cbbc0d9bd35d2d0) is 8M, max 195.6M, 187.6M free. May 27 17:19:19.337637 systemd-journald[1521]: Received client request to flush runtime journal. May 27 17:19:19.337906 kernel: loop0: detected capacity change from 0 to 61240 May 27 17:19:19.221225 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 17:19:19.228084 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 27 17:19:19.231271 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 27 17:19:19.233759 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 27 17:19:19.254330 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 27 17:19:19.282000 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 27 17:19:19.349002 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 27 17:19:19.383629 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 27 17:19:19.399018 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 27 17:19:19.404118 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 27 17:19:19.429167 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 27 17:19:19.434568 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 27 17:19:19.463214 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 27 17:19:19.489519 systemd-tmpfiles[1590]: ACLs are not supported, ignoring. May 27 17:19:19.492563 systemd-tmpfiles[1590]: ACLs are not supported, ignoring. May 27 17:19:19.505931 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 27 17:19:19.513954 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 17:19:19.542929 kernel: loop1: detected capacity change from 0 to 138376 May 27 17:19:19.688925 kernel: loop2: detected capacity change from 0 to 107312 May 27 17:19:19.740931 kernel: loop3: detected capacity change from 0 to 211168 May 27 17:19:19.942940 kernel: loop4: detected capacity change from 0 to 61240 May 27 17:19:19.959986 kernel: loop5: detected capacity change from 0 to 138376 May 27 17:19:19.978947 kernel: loop6: detected capacity change from 0 to 107312 May 27 17:19:19.981415 ldconfig[1572]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 27 17:19:19.988275 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 27 17:19:20.004005 kernel: loop7: detected capacity change from 0 to 211168 May 27 17:19:20.027213 (sd-merge)[1600]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'. May 27 17:19:20.029952 (sd-merge)[1600]: Merged extensions into '/usr'. 
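Annotation: the "(sd-merge)" lines above show systemd-sysext combining the containerd-flatcar, docker-flatcar, kubernetes, and oem-ami extension images into /usr. Only /etc/extensions is confirmed by this log (kubernetes.raw was linked there earlier); the sketch below also scans /run/extensions and /var/lib/extensions, which are standard sysext search locations, and simply lists what would be offered for merging.

from pathlib import Path

# Directories systemd-sysext consults for extension images. Only
# /etc/extensions is confirmed by this log; the others are the usual defaults.
SEARCH_PATHS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

def list_sysext_candidates():
    """Return .raw images and directories that sysext would consider merging."""
    found = []
    for base in map(Path, SEARCH_PATHS):
        if not base.is_dir():
            continue
        for entry in sorted(base.iterdir()):
            if entry.suffix == ".raw" or entry.is_dir():
                found.append(entry)
    return found

if __name__ == "__main__":
    for candidate in list_sysext_candidates():
        print(candidate)

On the host above this would list /etc/extensions/kubernetes.raw; `systemd-sysext status` reports the merged state authoritatively.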
May 27 17:19:20.037532 systemd[1]: Reload requested from client PID 1577 ('systemd-sysext') (unit systemd-sysext.service)... May 27 17:19:20.037720 systemd[1]: Reloading... May 27 17:19:20.209925 zram_generator::config[1629]: No configuration found. May 27 17:19:20.423679 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:19:20.604187 systemd[1]: Reloading finished in 565 ms. May 27 17:19:20.647014 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 27 17:19:20.649896 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 27 17:19:20.664063 systemd[1]: Starting ensure-sysext.service... May 27 17:19:20.671156 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 27 17:19:20.685172 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 17:19:20.710466 systemd[1]: Reload requested from client PID 1678 ('systemctl') (unit ensure-sysext.service)... May 27 17:19:20.710495 systemd[1]: Reloading... May 27 17:19:20.745312 systemd-tmpfiles[1679]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. May 27 17:19:20.745382 systemd-tmpfiles[1679]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. May 27 17:19:20.747037 systemd-tmpfiles[1679]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 27 17:19:20.747624 systemd-tmpfiles[1679]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 27 17:19:20.749672 systemd-tmpfiles[1679]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 27 17:19:20.750833 systemd-tmpfiles[1679]: ACLs are not supported, ignoring. May 27 17:19:20.751173 systemd-tmpfiles[1679]: ACLs are not supported, ignoring. May 27 17:19:20.759712 systemd-tmpfiles[1679]: Detected autofs mount point /boot during canonicalization of boot. May 27 17:19:20.759940 systemd-tmpfiles[1679]: Skipping /boot May 27 17:19:20.789944 systemd-tmpfiles[1679]: Detected autofs mount point /boot during canonicalization of boot. May 27 17:19:20.790160 systemd-tmpfiles[1679]: Skipping /boot May 27 17:19:20.844545 systemd-udevd[1680]: Using default interface naming scheme 'v255'. May 27 17:19:20.899967 zram_generator::config[1713]: No configuration found. May 27 17:19:21.105249 (udev-worker)[1780]: Network interface NamePolicy= disabled on kernel command line. May 27 17:19:21.285968 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:19:21.570517 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 27 17:19:21.571176 systemd[1]: Reloading finished in 860 ms. May 27 17:19:21.602843 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 17:19:21.663127 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 17:19:21.707015 systemd[1]: Finished ensure-sysext.service. May 27 17:19:21.754430 systemd[1]: Starting audit-rules.service - Load Audit Rules... 
May 27 17:19:21.760036 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 27 17:19:21.762638 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 17:19:21.764577 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 27 17:19:21.779262 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 27 17:19:21.812245 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 27 17:19:21.817214 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 27 17:19:21.819487 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 17:19:21.819568 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 17:19:21.822735 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 27 17:19:21.830263 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 27 17:19:21.837242 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 27 17:19:21.839342 systemd[1]: Reached target time-set.target - System Time Set. May 27 17:19:21.850265 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 27 17:19:21.884635 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 27 17:19:21.885196 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 27 17:19:21.888530 systemd[1]: modprobe@drm.service: Deactivated successfully. May 27 17:19:21.889067 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 27 17:19:21.973639 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 27 17:19:21.980734 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 27 17:19:21.981160 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 27 17:19:21.986641 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 27 17:19:22.007536 systemd[1]: modprobe@loop.service: Deactivated successfully. May 27 17:19:22.008073 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 27 17:19:22.022436 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 27 17:19:22.053111 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 27 17:19:22.056805 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 27 17:19:22.062093 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 27 17:19:22.067004 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 27 17:19:22.071050 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 27 17:19:22.079479 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
May 27 17:19:22.126822 augenrules[1934]: No rules May 27 17:19:22.130578 systemd[1]: audit-rules.service: Deactivated successfully. May 27 17:19:22.131523 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 27 17:19:22.147578 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 27 17:19:22.180541 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. May 27 17:19:22.185763 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 27 17:19:22.242705 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 27 17:19:22.284004 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 27 17:19:22.287992 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:19:22.418147 systemd-networkd[1901]: lo: Link UP May 27 17:19:22.418166 systemd-networkd[1901]: lo: Gained carrier May 27 17:19:22.421088 systemd-networkd[1901]: Enumeration completed May 27 17:19:22.421291 systemd[1]: Started systemd-networkd.service - Network Configuration. May 27 17:19:22.423860 systemd-networkd[1901]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:19:22.423904 systemd-networkd[1901]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 17:19:22.426125 systemd-networkd[1901]: eth0: Link UP May 27 17:19:22.426655 systemd-networkd[1901]: eth0: Gained carrier May 27 17:19:22.426817 systemd-networkd[1901]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:19:22.427758 systemd-resolved[1902]: Positive Trust Anchors: May 27 17:19:22.427795 systemd-resolved[1902]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 27 17:19:22.427859 systemd-resolved[1902]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 27 17:19:22.430772 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 27 17:19:22.437194 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 27 17:19:22.443023 systemd-networkd[1901]: eth0: DHCPv4 address 172.31.16.30/20, gateway 172.31.16.1 acquired from 172.31.16.1 May 27 17:19:22.447356 systemd-resolved[1902]: Defaulting to hostname 'linux'. May 27 17:19:22.452483 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 27 17:19:22.454853 systemd[1]: Reached target network.target - Network. May 27 17:19:22.456662 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 27 17:19:22.459023 systemd[1]: Reached target sysinit.target - System Initialization. May 27 17:19:22.461169 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
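Annotation: the DHCPv4 lease recorded above (172.31.16.30/20 with gateway 172.31.16.1, acquired from 172.31.16.1) can be sanity-checked with the standard library. A minimal sketch using the values copied from the networkd line:

import ipaddress

# Values copied from the systemd-networkd DHCPv4 line in the log above.
iface = ipaddress.ip_interface("172.31.16.30/20")
gateway = ipaddress.ip_address("172.31.16.1")

print(iface.network)                 # 172.31.16.0/20
print(iface.network.num_addresses)   # 4096
print(gateway in iface.network)      # True: the gateway is on-link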
May 27 17:19:22.463561 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 27 17:19:22.466706 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 27 17:19:22.469265 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 27 17:19:22.472154 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 27 17:19:22.474561 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 27 17:19:22.474615 systemd[1]: Reached target paths.target - Path Units. May 27 17:19:22.476354 systemd[1]: Reached target timers.target - Timer Units. May 27 17:19:22.479158 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 27 17:19:22.484537 systemd[1]: Starting docker.socket - Docker Socket for the API... May 27 17:19:22.491657 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 27 17:19:22.495396 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 27 17:19:22.498062 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 27 17:19:22.510972 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 27 17:19:22.513735 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 27 17:19:22.518950 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 27 17:19:22.521872 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 27 17:19:22.525259 systemd[1]: Reached target sockets.target - Socket Units. May 27 17:19:22.527354 systemd[1]: Reached target basic.target - Basic System. May 27 17:19:22.529328 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 27 17:19:22.529378 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 27 17:19:22.534066 systemd[1]: Starting containerd.service - containerd container runtime... May 27 17:19:22.538650 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 27 17:19:22.546276 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 27 17:19:22.555190 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 27 17:19:22.560606 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 27 17:19:22.565791 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 27 17:19:22.568078 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 27 17:19:22.572341 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 27 17:19:22.582313 systemd[1]: Started ntpd.service - Network Time Service. May 27 17:19:22.588315 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 27 17:19:22.599255 systemd[1]: Starting setup-oem.service - Setup OEM... May 27 17:19:22.617060 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 27 17:19:22.627334 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
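Annotation: prepare-helm.service is described above as "Unpack helm to /opt/bin", and the tar listing a little further down (linux-arm64/LICENSE, linux-arm64/helm) shows the archive layout. A rough equivalent of that unpack step, assuming the same paths, is sketched below; this illustrates the operation and is not the unit's actual ExecStart.

import tarfile
from pathlib import Path

# Paths taken from the log: the tarball written by Ignition and the /opt/bin
# destination named by prepare-helm.service. The member name linux-arm64/helm
# matches the tar output seen later in the log.
ARCHIVE = "/opt/helm-v3.17.3-linux-arm64.tar.gz"
DEST = Path("/opt/bin")

def unpack_helm():
    DEST.mkdir(parents=True, exist_ok=True)
    with tarfile.open(ARCHIVE, "r:gz") as tar:
        member = tar.getmember("linux-arm64/helm")
        member.name = "helm"            # drop the leading directory component
        tar.extract(member, path=DEST)
    (DEST / "helm").chmod(0o755)

if __name__ == "__main__":
    unpack_helm()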
May 27 17:19:22.650917 jq[1966]: false May 27 17:19:22.652242 systemd[1]: Starting systemd-logind.service - User Login Management... May 27 17:19:22.663724 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 27 17:19:22.664667 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 27 17:19:22.677226 systemd[1]: Starting update-engine.service - Update Engine... May 27 17:19:22.689401 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 27 17:19:22.701806 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 27 17:19:22.704851 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 27 17:19:22.707376 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 27 17:19:22.714760 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 27 17:19:22.715230 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 27 17:19:22.799757 extend-filesystems[1967]: Found loop4 May 27 17:19:22.805120 extend-filesystems[1967]: Found loop5 May 27 17:19:22.805120 extend-filesystems[1967]: Found loop6 May 27 17:19:22.805120 extend-filesystems[1967]: Found loop7 May 27 17:19:22.805120 extend-filesystems[1967]: Found nvme0n1 May 27 17:19:22.805120 extend-filesystems[1967]: Found nvme0n1p1 May 27 17:19:22.805120 extend-filesystems[1967]: Found nvme0n1p2 May 27 17:19:22.805120 extend-filesystems[1967]: Found nvme0n1p3 May 27 17:19:22.836365 extend-filesystems[1967]: Found usr May 27 17:19:22.836365 extend-filesystems[1967]: Found nvme0n1p4 May 27 17:19:22.836365 extend-filesystems[1967]: Found nvme0n1p6 May 27 17:19:22.836365 extend-filesystems[1967]: Found nvme0n1p7 May 27 17:19:22.836365 extend-filesystems[1967]: Found nvme0n1p9 May 27 17:19:22.836365 extend-filesystems[1967]: Checking size of /dev/nvme0n1p9 May 27 17:19:22.866561 update_engine[1978]: I20250527 17:19:22.815763 1978 main.cc:92] Flatcar Update Engine starting May 27 17:19:22.869035 jq[1979]: true May 27 17:19:22.837572 systemd[1]: motdgen.service: Deactivated successfully. May 27 17:19:22.869342 extend-filesystems[1967]: Resized partition /dev/nvme0n1p9 May 27 17:19:22.845579 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
May 27 17:19:22.875835 tar[1983]: linux-arm64/LICENSE May 27 17:19:22.875835 tar[1983]: linux-arm64/helm May 27 17:19:22.884108 extend-filesystems[2012]: resize2fs 1.47.2 (1-Jan-2025) May 27 17:19:22.906718 (ntainerd)[2002]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 27 17:19:22.907960 ntpd[1969]: ntpd 4.2.8p17@1.4004-o Tue May 27 14:54:38 UTC 2025 (1): Starting May 27 17:19:22.909278 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 1489915 blocks May 27 17:19:22.909322 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: ntpd 4.2.8p17@1.4004-o Tue May 27 14:54:38 UTC 2025 (1): Starting May 27 17:19:22.909322 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp May 27 17:19:22.909322 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: ---------------------------------------------------- May 27 17:19:22.909322 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: ntp-4 is maintained by Network Time Foundation, May 27 17:19:22.909322 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: Inc. (NTF), a non-profit 501(c)(3) public-benefit May 27 17:19:22.909322 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: corporation. Support and training for ntp-4 are May 27 17:19:22.909322 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: available at https://www.nwtime.org/support May 27 17:19:22.909322 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: ---------------------------------------------------- May 27 17:19:22.908018 ntpd[1969]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp May 27 17:19:22.908036 ntpd[1969]: ---------------------------------------------------- May 27 17:19:22.908053 ntpd[1969]: ntp-4 is maintained by Network Time Foundation, May 27 17:19:22.908070 ntpd[1969]: Inc. (NTF), a non-profit 501(c)(3) public-benefit May 27 17:19:22.924075 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: proto: precision = 0.096 usec (-23) May 27 17:19:22.908087 ntpd[1969]: corporation. 
Support and training for ntp-4 are May 27 17:19:22.908103 ntpd[1969]: available at https://www.nwtime.org/support May 27 17:19:22.908120 ntpd[1969]: ---------------------------------------------------- May 27 17:19:22.920846 ntpd[1969]: proto: precision = 0.096 usec (-23) May 27 17:19:22.927809 jq[2011]: true May 27 17:19:22.930306 ntpd[1969]: basedate set to 2025-05-15 May 27 17:19:22.931566 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: basedate set to 2025-05-15 May 27 17:19:22.931566 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: gps base set to 2025-05-18 (week 2367) May 27 17:19:22.930729 ntpd[1969]: gps base set to 2025-05-18 (week 2367) May 27 17:19:22.933827 dbus-daemon[1964]: [system] SELinux support is enabled May 27 17:19:22.942174 ntpd[1969]: Listen and drop on 0 v6wildcard [::]:123 May 27 17:19:22.943044 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: Listen and drop on 0 v6wildcard [::]:123 May 27 17:19:22.943044 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: Listen and drop on 1 v4wildcard 0.0.0.0:123 May 27 17:19:22.943044 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: Listen normally on 2 lo 127.0.0.1:123 May 27 17:19:22.943044 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: Listen normally on 3 eth0 172.31.16.30:123 May 27 17:19:22.943044 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: Listen normally on 4 lo [::1]:123 May 27 17:19:22.943044 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: bind(21) AF_INET6 fe80::4db:94ff:fef8:69d5%2#123 flags 0x11 failed: Cannot assign requested address May 27 17:19:22.943044 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: unable to create socket on eth0 (5) for fe80::4db:94ff:fef8:69d5%2#123 May 27 17:19:22.943044 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: failed to init interface for address fe80::4db:94ff:fef8:69d5%2 May 27 17:19:22.943044 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: Listening on routing socket on fd #21 for interface updates May 27 17:19:22.942264 ntpd[1969]: Listen and drop on 1 v4wildcard 0.0.0.0:123 May 27 17:19:22.942516 ntpd[1969]: Listen normally on 2 lo 127.0.0.1:123 May 27 17:19:22.942574 ntpd[1969]: Listen normally on 3 eth0 172.31.16.30:123 May 27 17:19:22.942637 ntpd[1969]: Listen normally on 4 lo [::1]:123 May 27 17:19:22.942703 ntpd[1969]: bind(21) AF_INET6 fe80::4db:94ff:fef8:69d5%2#123 flags 0x11 failed: Cannot assign requested address May 27 17:19:22.942739 ntpd[1969]: unable to create socket on eth0 (5) for fe80::4db:94ff:fef8:69d5%2#123 May 27 17:19:22.942764 ntpd[1969]: failed to init interface for address fe80::4db:94ff:fef8:69d5%2 May 27 17:19:22.942811 ntpd[1969]: Listening on routing socket on fd #21 for interface updates May 27 17:19:22.947144 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 27 17:19:22.953789 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 27 17:19:22.968130 coreos-metadata[1963]: May 27 17:19:22.954 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 May 27 17:19:22.968548 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized May 27 17:19:22.961827 ntpd[1969]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized May 27 17:19:22.953970 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
May 27 17:19:22.957150 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 27 17:19:22.957192 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 27 17:19:22.975248 coreos-metadata[1963]: May 27 17:19:22.970 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-id: Attempt #1 May 27 17:19:22.976680 coreos-metadata[1963]: May 27 17:19:22.976 INFO Fetch successful May 27 17:19:22.976680 coreos-metadata[1963]: May 27 17:19:22.976 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-type: Attempt #1 May 27 17:19:22.978109 dbus-daemon[1964]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.1' (uid=244 pid=1901 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") May 27 17:19:22.981022 coreos-metadata[1963]: May 27 17:19:22.980 INFO Fetch successful May 27 17:19:22.981022 coreos-metadata[1963]: May 27 17:19:22.980 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/local-ipv4: Attempt #1 May 27 17:19:22.994998 coreos-metadata[1963]: May 27 17:19:22.991 INFO Fetch successful May 27 17:19:22.994998 coreos-metadata[1963]: May 27 17:19:22.991 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-ipv4: Attempt #1 May 27 17:19:22.993866 dbus-daemon[1964]: [system] Successfully activated service 'org.freedesktop.systemd1' May 27 17:19:22.995802 ntpd[1969]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized May 27 17:19:23.000437 coreos-metadata[1963]: May 27 17:19:22.997 INFO Fetch successful May 27 17:19:23.000437 coreos-metadata[1963]: May 27 17:19:22.997 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/ipv6: Attempt #1 May 27 17:19:23.000542 ntpd[1969]: 27 May 17:19:22 ntpd[1969]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized May 27 17:19:23.002771 coreos-metadata[1963]: May 27 17:19:23.000 INFO Fetch failed with 404: resource not found May 27 17:19:23.002771 coreos-metadata[1963]: May 27 17:19:23.001 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone: Attempt #1 May 27 17:19:23.003262 coreos-metadata[1963]: May 27 17:19:23.003 INFO Fetch successful May 27 17:19:23.003656 coreos-metadata[1963]: May 27 17:19:23.003 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone-id: Attempt #1 May 27 17:19:23.009567 coreos-metadata[1963]: May 27 17:19:23.009 INFO Fetch successful May 27 17:19:23.013344 coreos-metadata[1963]: May 27 17:19:23.010 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/hostname: Attempt #1 May 27 17:19:23.014022 coreos-metadata[1963]: May 27 17:19:23.013 INFO Fetch successful May 27 17:19:23.016814 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... May 27 17:19:23.021792 coreos-metadata[1963]: May 27 17:19:23.019 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-hostname: Attempt #1 May 27 17:19:23.023805 systemd[1]: Started update-engine.service - Update Engine. 
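Annotation: both Ignition and coreos-metadata above use IMDSv2: a PUT to http://169.254.169.254/latest/api/token followed by GETs against the 2021-01-03 meta-data tree (instance-id, instance-type, local-ipv4, hostname, and so on). The same exchange can be reproduced with the standard library; the header names are the standard IMDSv2 ones, and the token TTL here is an arbitrary choice.

import urllib.request

IMDS = "http://169.254.169.254"

def imds_token(ttl_seconds: int = 300) -> str:
    """PUT /latest/api/token, as Ignition and coreos-metadata do above."""
    req = urllib.request.Request(
        f"{IMDS}/latest/api/token",
        method="PUT",
        headers={"X-aws-ec2-metadata-token-ttl-seconds": str(ttl_seconds)},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.read().decode()

def imds_get(path: str, token: str) -> str:
    """GET a path under the 2021-01-03 meta-data tree used in the log."""
    req = urllib.request.Request(
        f"{IMDS}/2021-01-03/meta-data/{path}",
        headers={"X-aws-ec2-metadata-token": token},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.read().decode()

if __name__ == "__main__":
    token = imds_token()
    for item in ("instance-id", "instance-type", "local-ipv4", "hostname"):
        print(item, "=", imds_get(item, token))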
May 27 17:19:23.029266 coreos-metadata[1963]: May 27 17:19:23.023 INFO Fetch successful May 27 17:19:23.029266 coreos-metadata[1963]: May 27 17:19:23.028 INFO Fetching http://169.254.169.254/2021-01-03/dynamic/instance-identity/document: Attempt #1 May 27 17:19:23.031528 update_engine[1978]: I20250527 17:19:23.029829 1978 update_check_scheduler.cc:74] Next update check in 2m43s May 27 17:19:23.039628 coreos-metadata[1963]: May 27 17:19:23.036 INFO Fetch successful May 27 17:19:23.054928 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 1489915 May 27 17:19:23.075288 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 27 17:19:23.081169 systemd[1]: Finished setup-oem.service - Setup OEM. May 27 17:19:23.087720 extend-filesystems[2012]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required May 27 17:19:23.087720 extend-filesystems[2012]: old_desc_blocks = 1, new_desc_blocks = 1 May 27 17:19:23.087720 extend-filesystems[2012]: The filesystem on /dev/nvme0n1p9 is now 1489915 (4k) blocks long. May 27 17:19:23.099277 extend-filesystems[1967]: Resized filesystem in /dev/nvme0n1p9 May 27 17:19:23.122679 systemd[1]: extend-filesystems.service: Deactivated successfully. May 27 17:19:23.123221 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 27 17:19:23.132380 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 27 17:19:23.239150 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 27 17:19:23.241730 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 27 17:19:23.277343 bash[2048]: Updated "/home/core/.ssh/authorized_keys" May 27 17:19:23.286392 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 27 17:19:23.298072 systemd[1]: Starting sshkeys.service... May 27 17:19:23.327329 systemd-logind[1976]: Watching system buttons on /dev/input/event0 (Power Button) May 27 17:19:23.327382 systemd-logind[1976]: Watching system buttons on /dev/input/event1 (Sleep Button) May 27 17:19:23.328322 systemd-logind[1976]: New seat seat0. May 27 17:19:23.336109 systemd[1]: Started systemd-logind.service - User Login Management. May 27 17:19:23.437098 containerd[2002]: time="2025-05-27T17:19:23Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 27 17:19:23.438415 containerd[2002]: time="2025-05-27T17:19:23.438363839Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 May 27 17:19:23.464637 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. May 27 17:19:23.471409 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
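Annotation: the root filesystem resize recorded above (EXT4 on nvme0n1p9 growing from 553472 to 1489915 blocks of 4k, per the kernel and extend-filesystems lines) corresponds to roughly 2.1 GiB growing to about 5.7 GiB:

BLOCK = 4096                         # 4k blocks, as reported for nvme0n1p9
before, after = 553_472, 1_489_915   # block counts from the resize lines above

gib = lambda blocks: blocks * BLOCK / 2**30
print(f"before: {gib(before):.2f} GiB, after: {gib(after):.2f} GiB")
# before: 2.11 GiB, after: 5.68 GiB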
May 27 17:19:23.495794 containerd[2002]: time="2025-05-27T17:19:23.495664439Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="13.248µs" May 27 17:19:23.498921 containerd[2002]: time="2025-05-27T17:19:23.496939895Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 27 17:19:23.498921 containerd[2002]: time="2025-05-27T17:19:23.498188891Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 27 17:19:23.498921 containerd[2002]: time="2025-05-27T17:19:23.498501551Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 27 17:19:23.498921 containerd[2002]: time="2025-05-27T17:19:23.498538223Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 27 17:19:23.498921 containerd[2002]: time="2025-05-27T17:19:23.498590363Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 27 17:19:23.498921 containerd[2002]: time="2025-05-27T17:19:23.498699191Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 27 17:19:23.498921 containerd[2002]: time="2025-05-27T17:19:23.498725603Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 27 17:19:23.511874 containerd[2002]: time="2025-05-27T17:19:23.510208595Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 27 17:19:23.511874 containerd[2002]: time="2025-05-27T17:19:23.510266999Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 27 17:19:23.511874 containerd[2002]: time="2025-05-27T17:19:23.510301535Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 27 17:19:23.511874 containerd[2002]: time="2025-05-27T17:19:23.510324167Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 27 17:19:23.511874 containerd[2002]: time="2025-05-27T17:19:23.510545723Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 27 17:19:23.511874 containerd[2002]: time="2025-05-27T17:19:23.510951719Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 27 17:19:23.511874 containerd[2002]: time="2025-05-27T17:19:23.511013303Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 27 17:19:23.511874 containerd[2002]: time="2025-05-27T17:19:23.511037603Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 27 17:19:23.522690 containerd[2002]: time="2025-05-27T17:19:23.520991375Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 27 17:19:23.522690 containerd[2002]: 
time="2025-05-27T17:19:23.522251339Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 27 17:19:23.522690 containerd[2002]: time="2025-05-27T17:19:23.522439115Z" level=info msg="metadata content store policy set" policy=shared May 27 17:19:23.533080 containerd[2002]: time="2025-05-27T17:19:23.533015339Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 27 17:19:23.533197 containerd[2002]: time="2025-05-27T17:19:23.533116487Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 27 17:19:23.533197 containerd[2002]: time="2025-05-27T17:19:23.533154443Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 27 17:19:23.533197 containerd[2002]: time="2025-05-27T17:19:23.533183579Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 27 17:19:23.533351 containerd[2002]: time="2025-05-27T17:19:23.533218391Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 27 17:19:23.533351 containerd[2002]: time="2025-05-27T17:19:23.533247695Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 27 17:19:23.533351 containerd[2002]: time="2025-05-27T17:19:23.533275607Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 27 17:19:23.533351 containerd[2002]: time="2025-05-27T17:19:23.533317139Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 27 17:19:23.533510 containerd[2002]: time="2025-05-27T17:19:23.533348843Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 27 17:19:23.533510 containerd[2002]: time="2025-05-27T17:19:23.533375555Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 27 17:19:23.533510 containerd[2002]: time="2025-05-27T17:19:23.533399015Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 27 17:19:23.533510 containerd[2002]: time="2025-05-27T17:19:23.533430215Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 27 17:19:23.533660 containerd[2002]: time="2025-05-27T17:19:23.533642927Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 27 17:19:23.533704 containerd[2002]: time="2025-05-27T17:19:23.533680535Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 27 17:19:23.533748 containerd[2002]: time="2025-05-27T17:19:23.533719523Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 27 17:19:23.533801 containerd[2002]: time="2025-05-27T17:19:23.533746379Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 27 17:19:23.533801 containerd[2002]: time="2025-05-27T17:19:23.533772563Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 27 17:19:23.533901 containerd[2002]: time="2025-05-27T17:19:23.533797631Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 27 17:19:23.533901 containerd[2002]: 
time="2025-05-27T17:19:23.533826299Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 27 17:19:23.533901 containerd[2002]: time="2025-05-27T17:19:23.533851103Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 27 17:19:23.533901 containerd[2002]: time="2025-05-27T17:19:23.533909459Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 27 17:19:23.533901 containerd[2002]: time="2025-05-27T17:19:23.533942135Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 27 17:19:23.533901 containerd[2002]: time="2025-05-27T17:19:23.533971043Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 27 17:19:23.537054 containerd[2002]: time="2025-05-27T17:19:23.537009047Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 27 17:19:23.537126 containerd[2002]: time="2025-05-27T17:19:23.537059999Z" level=info msg="Start snapshots syncer" May 27 17:19:23.537192 containerd[2002]: time="2025-05-27T17:19:23.537119051Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 27 17:19:23.537567 containerd[2002]: time="2025-05-27T17:19:23.537499139Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 27 17:19:23.537746 containerd[2002]: time="2025-05-27T17:19:23.537594599Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 27 17:19:23.537799 containerd[2002]: time="2025-05-27T17:19:23.537738479Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 
May 27 17:19:23.544183 containerd[2002]: time="2025-05-27T17:19:23.544119455Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 27 17:19:23.544314 containerd[2002]: time="2025-05-27T17:19:23.544204679Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 27 17:19:23.544314 containerd[2002]: time="2025-05-27T17:19:23.544235099Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 27 17:19:23.544314 containerd[2002]: time="2025-05-27T17:19:23.544264631Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 27 17:19:23.544314 containerd[2002]: time="2025-05-27T17:19:23.544294091Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 27 17:19:23.544483 containerd[2002]: time="2025-05-27T17:19:23.544321571Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 27 17:19:23.544483 containerd[2002]: time="2025-05-27T17:19:23.544349207Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 27 17:19:23.544483 containerd[2002]: time="2025-05-27T17:19:23.544402967Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 27 17:19:23.544483 containerd[2002]: time="2025-05-27T17:19:23.544430831Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 27 17:19:23.544483 containerd[2002]: time="2025-05-27T17:19:23.544459727Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 27 17:19:23.544840 containerd[2002]: time="2025-05-27T17:19:23.544546151Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 17:19:23.544840 containerd[2002]: time="2025-05-27T17:19:23.544581623Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 17:19:23.544840 containerd[2002]: time="2025-05-27T17:19:23.544603523Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 17:19:23.544840 containerd[2002]: time="2025-05-27T17:19:23.544629443Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 17:19:23.544840 containerd[2002]: time="2025-05-27T17:19:23.544651163Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 27 17:19:23.544840 containerd[2002]: time="2025-05-27T17:19:23.544675763Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 27 17:19:23.544840 containerd[2002]: time="2025-05-27T17:19:23.544701947Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 27 17:19:23.544840 containerd[2002]: time="2025-05-27T17:19:23.544754783Z" level=info msg="runtime interface created" May 27 17:19:23.544840 containerd[2002]: time="2025-05-27T17:19:23.544769735Z" level=info msg="created NRI interface" May 27 17:19:23.544840 containerd[2002]: time="2025-05-27T17:19:23.544790231Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 
May 27 17:19:23.544840 containerd[2002]: time="2025-05-27T17:19:23.544820243Z" level=info msg="Connect containerd service" May 27 17:19:23.547994 containerd[2002]: time="2025-05-27T17:19:23.544874567Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 27 17:19:23.555605 containerd[2002]: time="2025-05-27T17:19:23.555519059Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 27 17:19:23.631154 systemd[1]: Started systemd-hostnamed.service - Hostname Service. May 27 17:19:23.639372 dbus-daemon[1964]: [system] Successfully activated service 'org.freedesktop.hostname1' May 27 17:19:23.648620 dbus-daemon[1964]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.5' (uid=0 pid=2019 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") May 27 17:19:23.658429 systemd[1]: Starting polkit.service - Authorization Manager... May 27 17:19:23.749258 locksmithd[2020]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 27 17:19:23.806108 coreos-metadata[2070]: May 27 17:19:23.804 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 May 27 17:19:23.807407 coreos-metadata[2070]: May 27 17:19:23.807 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys: Attempt #1 May 27 17:19:23.810376 coreos-metadata[2070]: May 27 17:19:23.810 INFO Fetch successful May 27 17:19:23.810376 coreos-metadata[2070]: May 27 17:19:23.810 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key: Attempt #1 May 27 17:19:23.811427 coreos-metadata[2070]: May 27 17:19:23.811 INFO Fetch successful May 27 17:19:23.817389 unknown[2070]: wrote ssh authorized keys file for user: core May 27 17:19:23.908650 ntpd[1969]: bind(24) AF_INET6 fe80::4db:94ff:fef8:69d5%2#123 flags 0x11 failed: Cannot assign requested address May 27 17:19:23.909332 ntpd[1969]: 27 May 17:19:23 ntpd[1969]: bind(24) AF_INET6 fe80::4db:94ff:fef8:69d5%2#123 flags 0x11 failed: Cannot assign requested address May 27 17:19:23.909332 ntpd[1969]: 27 May 17:19:23 ntpd[1969]: unable to create socket on eth0 (6) for fe80::4db:94ff:fef8:69d5%2#123 May 27 17:19:23.909332 ntpd[1969]: 27 May 17:19:23 ntpd[1969]: failed to init interface for address fe80::4db:94ff:fef8:69d5%2 May 27 17:19:23.908709 ntpd[1969]: unable to create socket on eth0 (6) for fe80::4db:94ff:fef8:69d5%2#123 May 27 17:19:23.908735 ntpd[1969]: failed to init interface for address fe80::4db:94ff:fef8:69d5%2 May 27 17:19:23.919941 update-ssh-keys[2135]: Updated "/home/core/.ssh/authorized_keys" May 27 17:19:23.921293 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). May 27 17:19:23.928379 systemd[1]: Finished sshkeys.service. May 27 17:19:23.997309 sshd_keygen[2000]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 27 17:19:24.112613 containerd[2002]: time="2025-05-27T17:19:24.105057190Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 27 17:19:24.112613 containerd[2002]: time="2025-05-27T17:19:24.105228562Z" level=info msg=serving... 
address=/run/containerd/containerd.sock May 27 17:19:24.112613 containerd[2002]: time="2025-05-27T17:19:24.105280162Z" level=info msg="Start subscribing containerd event" May 27 17:19:24.112613 containerd[2002]: time="2025-05-27T17:19:24.105371098Z" level=info msg="Start recovering state" May 27 17:19:24.112613 containerd[2002]: time="2025-05-27T17:19:24.105580438Z" level=info msg="Start event monitor" May 27 17:19:24.112613 containerd[2002]: time="2025-05-27T17:19:24.105614734Z" level=info msg="Start cni network conf syncer for default" May 27 17:19:24.112613 containerd[2002]: time="2025-05-27T17:19:24.105634870Z" level=info msg="Start streaming server" May 27 17:19:24.112613 containerd[2002]: time="2025-05-27T17:19:24.105677986Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 27 17:19:24.112613 containerd[2002]: time="2025-05-27T17:19:24.105696022Z" level=info msg="runtime interface starting up..." May 27 17:19:24.112613 containerd[2002]: time="2025-05-27T17:19:24.105710974Z" level=info msg="starting plugins..." May 27 17:19:24.112613 containerd[2002]: time="2025-05-27T17:19:24.105762094Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 27 17:19:24.112613 containerd[2002]: time="2025-05-27T17:19:24.108944470Z" level=info msg="containerd successfully booted in 0.670045s" May 27 17:19:24.106222 systemd[1]: Started containerd.service - containerd container runtime. May 27 17:19:24.188492 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 27 17:19:24.193849 systemd[1]: Starting issuegen.service - Generate /run/issue... May 27 17:19:24.197804 polkitd[2098]: Started polkitd version 126 May 27 17:19:24.201549 systemd[1]: Started sshd@0-172.31.16.30:22-139.178.68.195:39730.service - OpenSSH per-connection server daemon (139.178.68.195:39730). May 27 17:19:24.251167 polkitd[2098]: Loading rules from directory /etc/polkit-1/rules.d May 27 17:19:24.254707 polkitd[2098]: Loading rules from directory /run/polkit-1/rules.d May 27 17:19:24.254805 polkitd[2098]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) May 27 17:19:24.256262 polkitd[2098]: Loading rules from directory /usr/local/share/polkit-1/rules.d May 27 17:19:24.258325 polkitd[2098]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) May 27 17:19:24.258410 polkitd[2098]: Loading rules from directory /usr/share/polkit-1/rules.d May 27 17:19:24.262578 systemd[1]: issuegen.service: Deactivated successfully. May 27 17:19:24.263116 systemd[1]: Finished issuegen.service - Generate /run/issue. May 27 17:19:24.269279 polkitd[2098]: Finished loading, compiling and executing 2 rules May 27 17:19:24.272549 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 27 17:19:24.274982 systemd[1]: Started polkit.service - Authorization Manager. May 27 17:19:24.281932 dbus-daemon[1964]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' May 27 17:19:24.285499 polkitd[2098]: Acquired the name org.freedesktop.PolicyKit1 on the system bus May 27 17:19:24.355047 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 27 17:19:24.363561 systemd[1]: Started getty@tty1.service - Getty on tty1. May 27 17:19:24.369538 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 27 17:19:24.372136 systemd[1]: Reached target getty.target - Login Prompts. 
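The long "starting cri plugin" entry above is containerd dumping its effective CRI configuration as escaped JSON, and the error that follows explains the empty /etc/cni/net.d: until a CNI network config is installed (normally by the cluster's network add-on), pod networking stays uninitialized. A small sketch, assuming that JSON blob has been copied out of `journalctl -u containerd` into a local file, showing how to pull out the fields this log exercises:

```python
# Illustrative sketch: inspect the CRI config JSON that containerd logs at startup
# ("starting cri plugin" config="{...}"). Assumes the JSON value has been saved
# to cri-config.json; the key names below match the config dump in the log above.
import json

with open("cri-config.json") as fh:
    cfg = json.load(fh)

runc_opts = cfg["containerd"]["runtimes"]["runc"]["options"]
print("SystemdCgroup:", runc_opts["SystemdCgroup"])          # true in this log
print("CNI dirs (bin, conf):", cfg["cni"]["binDir"], cfg["cni"]["confDir"])
print("CDI spec dirs:", cfg["cdiSpecDirs"])                  # ["/etc/cdi", "/var/run/cdi"]
```

The confDir value (/etc/cni/net.d) is the directory the "failed to load cni during init" message refers to; once a network config lands there, the CRI plugin picks it up without a restart.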
May 27 17:19:24.386983 systemd-hostnamed[2019]: Hostname set to (transient) May 27 17:19:24.387144 systemd-resolved[1902]: System hostname changed to 'ip-172-31-16-30'. May 27 17:19:24.410046 systemd-networkd[1901]: eth0: Gained IPv6LL May 27 17:19:24.417921 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 27 17:19:24.421578 systemd[1]: Reached target network-online.target - Network is Online. May 27 17:19:24.427287 systemd[1]: Started amazon-ssm-agent.service - amazon-ssm-agent. May 27 17:19:24.434381 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:19:24.440086 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 27 17:19:24.518446 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 27 17:19:24.524156 sshd[2177]: Accepted publickey for core from 139.178.68.195 port 39730 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:19:24.531402 sshd-session[2177]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:19:24.550222 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 27 17:19:24.557692 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 27 17:19:24.612207 systemd-logind[1976]: New session 1 of user core. May 27 17:19:24.631296 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 27 17:19:24.642352 systemd[1]: Starting user@500.service - User Manager for UID 500... May 27 17:19:24.661282 (systemd)[2218]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 27 17:19:24.676557 systemd-logind[1976]: New session c1 of user core. May 27 17:19:24.699324 amazon-ssm-agent[2201]: Initializing new seelog logger May 27 17:19:24.700137 amazon-ssm-agent[2201]: New Seelog Logger Creation Complete May 27 17:19:24.700349 amazon-ssm-agent[2201]: 2025/05/27 17:19:24 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. May 27 17:19:24.700920 amazon-ssm-agent[2201]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. May 27 17:19:24.701210 amazon-ssm-agent[2201]: 2025/05/27 17:19:24 processing appconfig overrides May 27 17:19:24.701894 amazon-ssm-agent[2201]: 2025/05/27 17:19:24 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. May 27 17:19:24.702016 amazon-ssm-agent[2201]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. May 27 17:19:24.702237 amazon-ssm-agent[2201]: 2025/05/27 17:19:24 processing appconfig overrides May 27 17:19:24.702526 amazon-ssm-agent[2201]: 2025/05/27 17:19:24 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. May 27 17:19:24.702615 amazon-ssm-agent[2201]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. May 27 17:19:24.702809 amazon-ssm-agent[2201]: 2025/05/27 17:19:24 processing appconfig overrides May 27 17:19:24.703596 amazon-ssm-agent[2201]: 2025-05-27 17:19:24.7017 INFO Proxy environment variables: May 27 17:19:24.707304 amazon-ssm-agent[2201]: 2025/05/27 17:19:24 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. May 27 17:19:24.707304 amazon-ssm-agent[2201]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. 
May 27 17:19:24.707304 amazon-ssm-agent[2201]: 2025/05/27 17:19:24 processing appconfig overrides May 27 17:19:24.803778 amazon-ssm-agent[2201]: 2025-05-27 17:19:24.7018 INFO no_proxy: May 27 17:19:24.904213 amazon-ssm-agent[2201]: 2025-05-27 17:19:24.7018 INFO https_proxy: May 27 17:19:25.003763 amazon-ssm-agent[2201]: 2025-05-27 17:19:24.7018 INFO http_proxy: May 27 17:19:25.079200 systemd[2218]: Queued start job for default target default.target. May 27 17:19:25.085949 systemd[2218]: Created slice app.slice - User Application Slice. May 27 17:19:25.086011 systemd[2218]: Reached target paths.target - Paths. May 27 17:19:25.086107 systemd[2218]: Reached target timers.target - Timers. May 27 17:19:25.089179 systemd[2218]: Starting dbus.socket - D-Bus User Message Bus Socket... May 27 17:19:25.102326 amazon-ssm-agent[2201]: 2025-05-27 17:19:24.7023 INFO Checking if agent identity type OnPrem can be assumed May 27 17:19:25.114538 tar[1983]: linux-arm64/README.md May 27 17:19:25.125381 systemd[2218]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 27 17:19:25.125520 systemd[2218]: Reached target sockets.target - Sockets. May 27 17:19:25.125609 systemd[2218]: Reached target basic.target - Basic System. May 27 17:19:25.125693 systemd[2218]: Reached target default.target - Main User Target. May 27 17:19:25.125766 systemd[2218]: Startup finished in 430ms. May 27 17:19:25.126047 systemd[1]: Started user@500.service - User Manager for UID 500. May 27 17:19:25.138249 systemd[1]: Started session-1.scope - Session 1 of User core. May 27 17:19:25.160343 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 27 17:19:25.201357 amazon-ssm-agent[2201]: 2025-05-27 17:19:24.7023 INFO Checking if agent identity type EC2 can be assumed May 27 17:19:25.300983 amazon-ssm-agent[2201]: 2025-05-27 17:19:24.8053 INFO Agent will take identity from EC2 May 27 17:19:25.312133 systemd[1]: Started sshd@1-172.31.16.30:22-139.178.68.195:34790.service - OpenSSH per-connection server daemon (139.178.68.195:34790). May 27 17:19:25.401796 amazon-ssm-agent[2201]: 2025-05-27 17:19:24.8069 INFO [amazon-ssm-agent] amazon-ssm-agent - v3.3.0.0 May 27 17:19:25.501679 amazon-ssm-agent[2201]: 2025-05-27 17:19:24.8069 INFO [amazon-ssm-agent] OS: linux, Arch: arm64 May 27 17:19:25.520023 amazon-ssm-agent[2201]: 2025/05/27 17:19:25 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. May 27 17:19:25.520023 amazon-ssm-agent[2201]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. May 27 17:19:25.520178 amazon-ssm-agent[2201]: 2025/05/27 17:19:25 processing appconfig overrides May 27 17:19:25.521833 sshd[2236]: Accepted publickey for core from 139.178.68.195 port 34790 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:19:25.524976 sshd-session[2236]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:19:25.537514 systemd-logind[1976]: New session 2 of user core. May 27 17:19:25.546169 systemd[1]: Started session-2.scope - Session 2 of User core. May 27 17:19:25.553960 amazon-ssm-agent[2201]: 2025-05-27 17:19:24.8069 INFO [amazon-ssm-agent] Starting Core Agent May 27 17:19:25.553960 amazon-ssm-agent[2201]: 2025-05-27 17:19:24.8069 INFO [amazon-ssm-agent] Registrar detected. 
Attempting registration May 27 17:19:25.553960 amazon-ssm-agent[2201]: 2025-05-27 17:19:24.8069 INFO [Registrar] Starting registrar module May 27 17:19:25.553960 amazon-ssm-agent[2201]: 2025-05-27 17:19:24.8083 INFO [EC2Identity] Checking disk for registration info May 27 17:19:25.553960 amazon-ssm-agent[2201]: 2025-05-27 17:19:24.8084 INFO [EC2Identity] No registration info found for ec2 instance, attempting registration May 27 17:19:25.553960 amazon-ssm-agent[2201]: 2025-05-27 17:19:24.8084 INFO [EC2Identity] Generating registration keypair May 27 17:19:25.555658 amazon-ssm-agent[2201]: 2025-05-27 17:19:25.4757 INFO [EC2Identity] Checking write access before registering May 27 17:19:25.555658 amazon-ssm-agent[2201]: 2025-05-27 17:19:25.4764 INFO [EC2Identity] Registering EC2 instance with Systems Manager May 27 17:19:25.555658 amazon-ssm-agent[2201]: 2025-05-27 17:19:25.5197 INFO [EC2Identity] EC2 registration was successful. May 27 17:19:25.555658 amazon-ssm-agent[2201]: 2025-05-27 17:19:25.5197 INFO [amazon-ssm-agent] Registration attempted. Resuming core agent startup. May 27 17:19:25.555658 amazon-ssm-agent[2201]: 2025-05-27 17:19:25.5199 INFO [CredentialRefresher] credentialRefresher has started May 27 17:19:25.555658 amazon-ssm-agent[2201]: 2025-05-27 17:19:25.5199 INFO [CredentialRefresher] Starting credentials refresher loop May 27 17:19:25.555658 amazon-ssm-agent[2201]: 2025-05-27 17:19:25.5534 INFO EC2RoleProvider Successfully connected with instance profile role credentials May 27 17:19:25.555658 amazon-ssm-agent[2201]: 2025-05-27 17:19:25.5537 INFO [CredentialRefresher] Credentials ready May 27 17:19:25.603369 amazon-ssm-agent[2201]: 2025-05-27 17:19:25.5541 INFO [CredentialRefresher] Next credential rotation will be in 29.9999892794 minutes May 27 17:19:25.674809 sshd[2238]: Connection closed by 139.178.68.195 port 34790 May 27 17:19:25.677159 sshd-session[2236]: pam_unix(sshd:session): session closed for user core May 27 17:19:25.684634 systemd[1]: sshd@1-172.31.16.30:22-139.178.68.195:34790.service: Deactivated successfully. May 27 17:19:25.688745 systemd[1]: session-2.scope: Deactivated successfully. May 27 17:19:25.691295 systemd-logind[1976]: Session 2 logged out. Waiting for processes to exit. May 27 17:19:25.718076 systemd-logind[1976]: Removed session 2. May 27 17:19:25.718346 systemd[1]: Started sshd@2-172.31.16.30:22-139.178.68.195:34798.service - OpenSSH per-connection server daemon (139.178.68.195:34798). May 27 17:19:25.914998 sshd[2244]: Accepted publickey for core from 139.178.68.195 port 34798 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:19:25.916510 sshd-session[2244]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:19:25.924570 systemd-logind[1976]: New session 3 of user core. May 27 17:19:25.938273 systemd[1]: Started session-3.scope - Session 3 of User core. May 27 17:19:26.066404 sshd[2246]: Connection closed by 139.178.68.195 port 34798 May 27 17:19:26.067399 sshd-session[2244]: pam_unix(sshd:session): session closed for user core May 27 17:19:26.073847 systemd[1]: sshd@2-172.31.16.30:22-139.178.68.195:34798.service: Deactivated successfully. May 27 17:19:26.077539 systemd[1]: session-3.scope: Deactivated successfully. May 27 17:19:26.080227 systemd-logind[1976]: Session 3 logged out. Waiting for processes to exit. May 27 17:19:26.083543 systemd-logind[1976]: Removed session 3. 
May 27 17:19:26.580974 amazon-ssm-agent[2201]: 2025-05-27 17:19:26.5807 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker is not running, starting worker process May 27 17:19:26.681992 amazon-ssm-agent[2201]: 2025-05-27 17:19:26.5873 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker (pid:2253) started May 27 17:19:26.783164 amazon-ssm-agent[2201]: 2025-05-27 17:19:26.5873 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] Monitor long running worker health every 60 seconds May 27 17:19:26.908670 ntpd[1969]: Listen normally on 7 eth0 [fe80::4db:94ff:fef8:69d5%2]:123 May 27 17:19:26.909286 ntpd[1969]: 27 May 17:19:26 ntpd[1969]: Listen normally on 7 eth0 [fe80::4db:94ff:fef8:69d5%2]:123 May 27 17:19:27.368357 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:19:27.371825 systemd[1]: Reached target multi-user.target - Multi-User System. May 27 17:19:27.379176 systemd[1]: Startup finished in 3.721s (kernel) + 9.699s (initrd) + 10.060s (userspace) = 23.480s. May 27 17:19:27.383034 (kubelet)[2269]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:19:29.119723 kubelet[2269]: E0527 17:19:29.119632 2269 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:19:29.124300 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:19:29.124619 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:19:29.125478 systemd[1]: kubelet.service: Consumed 1.334s CPU time, 259M memory peak. May 27 17:19:29.566164 systemd-resolved[1902]: Clock change detected. Flushing caches. May 27 17:19:35.760263 systemd[1]: Started sshd@3-172.31.16.30:22-139.178.68.195:55704.service - OpenSSH per-connection server daemon (139.178.68.195:55704). May 27 17:19:35.952345 sshd[2281]: Accepted publickey for core from 139.178.68.195 port 55704 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:19:35.954803 sshd-session[2281]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:19:35.964327 systemd-logind[1976]: New session 4 of user core. May 27 17:19:35.972462 systemd[1]: Started session-4.scope - Session 4 of User core. May 27 17:19:36.097578 sshd[2283]: Connection closed by 139.178.68.195 port 55704 May 27 17:19:36.098321 sshd-session[2281]: pam_unix(sshd:session): session closed for user core May 27 17:19:36.105426 systemd[1]: sshd@3-172.31.16.30:22-139.178.68.195:55704.service: Deactivated successfully. May 27 17:19:36.108612 systemd[1]: session-4.scope: Deactivated successfully. May 27 17:19:36.110667 systemd-logind[1976]: Session 4 logged out. Waiting for processes to exit. May 27 17:19:36.113802 systemd-logind[1976]: Removed session 4. May 27 17:19:36.135307 systemd[1]: Started sshd@4-172.31.16.30:22-139.178.68.195:55708.service - OpenSSH per-connection server daemon (139.178.68.195:55708). 
May 27 17:19:36.339412 sshd[2289]: Accepted publickey for core from 139.178.68.195 port 55708 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:19:36.341856 sshd-session[2289]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:19:36.349707 systemd-logind[1976]: New session 5 of user core. May 27 17:19:36.365465 systemd[1]: Started session-5.scope - Session 5 of User core. May 27 17:19:36.486056 sshd[2291]: Connection closed by 139.178.68.195 port 55708 May 27 17:19:36.486831 sshd-session[2289]: pam_unix(sshd:session): session closed for user core May 27 17:19:36.494157 systemd[1]: sshd@4-172.31.16.30:22-139.178.68.195:55708.service: Deactivated successfully. May 27 17:19:36.499359 systemd[1]: session-5.scope: Deactivated successfully. May 27 17:19:36.501452 systemd-logind[1976]: Session 5 logged out. Waiting for processes to exit. May 27 17:19:36.504251 systemd-logind[1976]: Removed session 5. May 27 17:19:36.520295 systemd[1]: Started sshd@5-172.31.16.30:22-139.178.68.195:55712.service - OpenSSH per-connection server daemon (139.178.68.195:55712). May 27 17:19:36.721057 sshd[2297]: Accepted publickey for core from 139.178.68.195 port 55712 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:19:36.723637 sshd-session[2297]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:19:36.733267 systemd-logind[1976]: New session 6 of user core. May 27 17:19:36.739521 systemd[1]: Started session-6.scope - Session 6 of User core. May 27 17:19:36.862957 sshd[2299]: Connection closed by 139.178.68.195 port 55712 May 27 17:19:36.863842 sshd-session[2297]: pam_unix(sshd:session): session closed for user core May 27 17:19:36.871270 systemd[1]: sshd@5-172.31.16.30:22-139.178.68.195:55712.service: Deactivated successfully. May 27 17:19:36.875909 systemd[1]: session-6.scope: Deactivated successfully. May 27 17:19:36.877772 systemd-logind[1976]: Session 6 logged out. Waiting for processes to exit. May 27 17:19:36.880472 systemd-logind[1976]: Removed session 6. May 27 17:19:36.902376 systemd[1]: Started sshd@6-172.31.16.30:22-139.178.68.195:55714.service - OpenSSH per-connection server daemon (139.178.68.195:55714). May 27 17:19:37.102717 sshd[2305]: Accepted publickey for core from 139.178.68.195 port 55714 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:19:37.104754 sshd-session[2305]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:19:37.112874 systemd-logind[1976]: New session 7 of user core. May 27 17:19:37.125475 systemd[1]: Started session-7.scope - Session 7 of User core. May 27 17:19:37.242386 sudo[2308]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 27 17:19:37.243556 sudo[2308]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 17:19:37.259222 sudo[2308]: pam_unix(sudo:session): session closed for user root May 27 17:19:37.282742 sshd[2307]: Connection closed by 139.178.68.195 port 55714 May 27 17:19:37.283798 sshd-session[2305]: pam_unix(sshd:session): session closed for user core May 27 17:19:37.290212 systemd[1]: session-7.scope: Deactivated successfully. May 27 17:19:37.290418 systemd-logind[1976]: Session 7 logged out. Waiting for processes to exit. May 27 17:19:37.291454 systemd[1]: sshd@6-172.31.16.30:22-139.178.68.195:55714.service: Deactivated successfully. May 27 17:19:37.298379 systemd-logind[1976]: Removed session 7. 
May 27 17:19:37.323680 systemd[1]: Started sshd@7-172.31.16.30:22-139.178.68.195:55728.service - OpenSSH per-connection server daemon (139.178.68.195:55728). May 27 17:19:37.532154 sshd[2314]: Accepted publickey for core from 139.178.68.195 port 55728 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:19:37.535277 sshd-session[2314]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:19:37.543175 systemd-logind[1976]: New session 8 of user core. May 27 17:19:37.552472 systemd[1]: Started session-8.scope - Session 8 of User core. May 27 17:19:37.657104 sudo[2318]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 27 17:19:37.657774 sudo[2318]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 17:19:37.665007 sudo[2318]: pam_unix(sudo:session): session closed for user root May 27 17:19:37.674645 sudo[2317]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 27 17:19:37.675217 sudo[2317]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 17:19:37.691694 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 27 17:19:37.754520 augenrules[2340]: No rules May 27 17:19:37.756984 systemd[1]: audit-rules.service: Deactivated successfully. May 27 17:19:37.758347 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 27 17:19:37.761085 sudo[2317]: pam_unix(sudo:session): session closed for user root May 27 17:19:37.785346 sshd[2316]: Connection closed by 139.178.68.195 port 55728 May 27 17:19:37.786075 sshd-session[2314]: pam_unix(sshd:session): session closed for user core May 27 17:19:37.793855 systemd-logind[1976]: Session 8 logged out. Waiting for processes to exit. May 27 17:19:37.795030 systemd[1]: sshd@7-172.31.16.30:22-139.178.68.195:55728.service: Deactivated successfully. May 27 17:19:37.798029 systemd[1]: session-8.scope: Deactivated successfully. May 27 17:19:37.802136 systemd-logind[1976]: Removed session 8. May 27 17:19:37.823654 systemd[1]: Started sshd@8-172.31.16.30:22-139.178.68.195:55736.service - OpenSSH per-connection server daemon (139.178.68.195:55736). May 27 17:19:38.018198 sshd[2349]: Accepted publickey for core from 139.178.68.195 port 55736 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:19:38.020682 sshd-session[2349]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:19:38.029076 systemd-logind[1976]: New session 9 of user core. May 27 17:19:38.041468 systemd[1]: Started session-9.scope - Session 9 of User core. May 27 17:19:38.142672 sudo[2352]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 27 17:19:38.143785 sudo[2352]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 17:19:38.642621 systemd[1]: Starting docker.service - Docker Application Container Engine... May 27 17:19:38.658735 (dockerd)[2370]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 27 17:19:39.015441 dockerd[2370]: time="2025-05-27T17:19:39.013011062Z" level=info msg="Starting up" May 27 17:19:39.017589 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. 
May 27 17:19:39.019538 dockerd[2370]: time="2025-05-27T17:19:39.019480574Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 27 17:19:39.020810 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:19:39.124289 dockerd[2370]: time="2025-05-27T17:19:39.123814791Z" level=info msg="Loading containers: start." May 27 17:19:39.141271 kernel: Initializing XFRM netlink socket May 27 17:19:39.429470 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:19:39.444786 (kubelet)[2480]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:19:39.528561 (udev-worker)[2396]: Network interface NamePolicy= disabled on kernel command line. May 27 17:19:39.548835 kubelet[2480]: E0527 17:19:39.548185 2480 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:19:39.564148 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:19:39.564516 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:19:39.567375 systemd[1]: kubelet.service: Consumed 331ms CPU time, 107.6M memory peak. May 27 17:19:39.617065 systemd-networkd[1901]: docker0: Link UP May 27 17:19:39.622062 dockerd[2370]: time="2025-05-27T17:19:39.621993761Z" level=info msg="Loading containers: done." May 27 17:19:39.649129 dockerd[2370]: time="2025-05-27T17:19:39.649072098Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 27 17:19:39.649367 dockerd[2370]: time="2025-05-27T17:19:39.649252362Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 27 17:19:39.649477 dockerd[2370]: time="2025-05-27T17:19:39.649442022Z" level=info msg="Initializing buildkit" May 27 17:19:39.685852 dockerd[2370]: time="2025-05-27T17:19:39.685178610Z" level=info msg="Completed buildkit initialization" May 27 17:19:39.701814 dockerd[2370]: time="2025-05-27T17:19:39.701580690Z" level=info msg="Daemon has completed initialization" May 27 17:19:39.702134 dockerd[2370]: time="2025-05-27T17:19:39.702060858Z" level=info msg="API listen on /run/docker.sock" May 27 17:19:39.702329 systemd[1]: Started docker.service - Docker Application Container Engine. May 27 17:19:40.071393 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3764140965-merged.mount: Deactivated successfully. May 27 17:19:40.636407 containerd[2002]: time="2025-05-27T17:19:40.636250470Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.1\"" May 27 17:19:41.271568 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3427304886.mount: Deactivated successfully. 
May 27 17:19:43.221714 containerd[2002]: time="2025-05-27T17:19:43.221633791Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:43.223981 containerd[2002]: time="2025-05-27T17:19:43.223918507Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.1: active requests=0, bytes read=27349350" May 27 17:19:43.225781 containerd[2002]: time="2025-05-27T17:19:43.225720451Z" level=info msg="ImageCreate event name:\"sha256:9a2b7cf4f8540534c6ec5b758462c6d7885c6e734652172078bba899c0e3089a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:43.230178 containerd[2002]: time="2025-05-27T17:19:43.230118595Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:d8ae2fb01c39aa1c7add84f3d54425cf081c24c11e3946830292a8cfa4293548\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:43.232251 containerd[2002]: time="2025-05-27T17:19:43.232104115Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.1\" with image id \"sha256:9a2b7cf4f8540534c6ec5b758462c6d7885c6e734652172078bba899c0e3089a\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.1\", repo digest \"registry.k8s.io/kube-apiserver@sha256:d8ae2fb01c39aa1c7add84f3d54425cf081c24c11e3946830292a8cfa4293548\", size \"27346150\" in 2.595783985s" May 27 17:19:43.232251 containerd[2002]: time="2025-05-27T17:19:43.232157827Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.1\" returns image reference \"sha256:9a2b7cf4f8540534c6ec5b758462c6d7885c6e734652172078bba899c0e3089a\"" May 27 17:19:43.234624 containerd[2002]: time="2025-05-27T17:19:43.234575383Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.1\"" May 27 17:19:45.767879 containerd[2002]: time="2025-05-27T17:19:45.767792208Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:45.769564 containerd[2002]: time="2025-05-27T17:19:45.769460436Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.1: active requests=0, bytes read=23531735" May 27 17:19:45.771002 containerd[2002]: time="2025-05-27T17:19:45.770926020Z" level=info msg="ImageCreate event name:\"sha256:674996a72aa5900cbbbcd410437021fa4c62a7f829a56f58eb23ac430f2ae383\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:45.775465 containerd[2002]: time="2025-05-27T17:19:45.775378908Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:7c9bea694e3a3c01ed6a5ee02d55a6124cc08e0b2eec6caa33f2c396b8cbc3f8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:45.777497 containerd[2002]: time="2025-05-27T17:19:45.777298608Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.1\" with image id \"sha256:674996a72aa5900cbbbcd410437021fa4c62a7f829a56f58eb23ac430f2ae383\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.1\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:7c9bea694e3a3c01ed6a5ee02d55a6124cc08e0b2eec6caa33f2c396b8cbc3f8\", size \"25086427\" in 2.542668229s" May 27 17:19:45.777497 containerd[2002]: time="2025-05-27T17:19:45.777353448Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.1\" returns image reference \"sha256:674996a72aa5900cbbbcd410437021fa4c62a7f829a56f58eb23ac430f2ae383\"" May 27 17:19:45.779199 
containerd[2002]: time="2025-05-27T17:19:45.778921020Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.1\"" May 27 17:19:47.527377 containerd[2002]: time="2025-05-27T17:19:47.527293237Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:47.529638 containerd[2002]: time="2025-05-27T17:19:47.529573969Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.1: active requests=0, bytes read=18293731" May 27 17:19:47.532482 containerd[2002]: time="2025-05-27T17:19:47.532422001Z" level=info msg="ImageCreate event name:\"sha256:014094c90caacf743dc5fb4281363492da1df31cd8218aeceab3be3326277d2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:47.542645 containerd[2002]: time="2025-05-27T17:19:47.542421085Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.1\" with image id \"sha256:014094c90caacf743dc5fb4281363492da1df31cd8218aeceab3be3326277d2e\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.1\", repo digest \"registry.k8s.io/kube-scheduler@sha256:395b7de7cdbdcc3c3a3db270844a3f71d757e2447a1e4db76b4cce46fba7fd55\", size \"19848441\" in 1.763447337s" May 27 17:19:47.542645 containerd[2002]: time="2025-05-27T17:19:47.542485489Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.1\" returns image reference \"sha256:014094c90caacf743dc5fb4281363492da1df31cd8218aeceab3be3326277d2e\"" May 27 17:19:47.542855 containerd[2002]: time="2025-05-27T17:19:47.542781397Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:395b7de7cdbdcc3c3a3db270844a3f71d757e2447a1e4db76b4cce46fba7fd55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:47.543491 containerd[2002]: time="2025-05-27T17:19:47.543455569Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.1\"" May 27 17:19:49.201721 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount188816613.mount: Deactivated successfully. May 27 17:19:49.815127 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 27 17:19:49.820550 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
May 27 17:19:49.847650 containerd[2002]: time="2025-05-27T17:19:49.847573336Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:49.850165 containerd[2002]: time="2025-05-27T17:19:49.850086556Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.1: active requests=0, bytes read=28196004" May 27 17:19:49.851066 containerd[2002]: time="2025-05-27T17:19:49.851001232Z" level=info msg="ImageCreate event name:\"sha256:3e58848989f556e36aa29d7852ab1712163960651e074d11cae9d31fb27192db\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:49.856422 containerd[2002]: time="2025-05-27T17:19:49.856363432Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7ddf379897139ae8ade8b33cb9373b70c632a4d5491da6e234f5d830e0a50807\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:49.857601 containerd[2002]: time="2025-05-27T17:19:49.857544136Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.1\" with image id \"sha256:3e58848989f556e36aa29d7852ab1712163960651e074d11cae9d31fb27192db\", repo tag \"registry.k8s.io/kube-proxy:v1.33.1\", repo digest \"registry.k8s.io/kube-proxy@sha256:7ddf379897139ae8ade8b33cb9373b70c632a4d5491da6e234f5d830e0a50807\", size \"28195023\" in 2.313995099s" May 27 17:19:49.857708 containerd[2002]: time="2025-05-27T17:19:49.857598448Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.1\" returns image reference \"sha256:3e58848989f556e36aa29d7852ab1712163960651e074d11cae9d31fb27192db\"" May 27 17:19:49.860027 containerd[2002]: time="2025-05-27T17:19:49.859708048Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" May 27 17:19:50.152041 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:19:50.166979 (kubelet)[2667]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:19:50.235329 kubelet[2667]: E0527 17:19:50.235269 2667 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:19:50.241842 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:19:50.242161 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:19:50.243116 systemd[1]: kubelet.service: Consumed 290ms CPU time, 104.4M memory peak. May 27 17:19:50.480517 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3525655760.mount: Deactivated successfully. 
May 27 17:19:51.989617 containerd[2002]: time="2025-05-27T17:19:51.989547667Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:51.991969 containerd[2002]: time="2025-05-27T17:19:51.991903243Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152117" May 27 17:19:51.992762 containerd[2002]: time="2025-05-27T17:19:51.992502799Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:51.997285 containerd[2002]: time="2025-05-27T17:19:51.997217923Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:51.999474 containerd[2002]: time="2025-05-27T17:19:51.999417415Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 2.139652367s" May 27 17:19:51.999898 containerd[2002]: time="2025-05-27T17:19:51.999472435Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" May 27 17:19:52.000843 containerd[2002]: time="2025-05-27T17:19:52.000039591Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 27 17:19:52.466437 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3047647970.mount: Deactivated successfully. 
May 27 17:19:52.479862 containerd[2002]: time="2025-05-27T17:19:52.479788781Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 17:19:52.481647 containerd[2002]: time="2025-05-27T17:19:52.481581365Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703" May 27 17:19:52.484122 containerd[2002]: time="2025-05-27T17:19:52.484040069Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 17:19:52.488752 containerd[2002]: time="2025-05-27T17:19:52.488654681Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 17:19:52.490460 containerd[2002]: time="2025-05-27T17:19:52.490262225Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 490.162106ms" May 27 17:19:52.490460 containerd[2002]: time="2025-05-27T17:19:52.490315817Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" May 27 17:19:52.491108 containerd[2002]: time="2025-05-27T17:19:52.491043485Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" May 27 17:19:54.055300 systemd[1]: systemd-hostnamed.service: Deactivated successfully. 
May 27 17:19:56.546664 containerd[2002]: time="2025-05-27T17:19:56.546578902Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:56.549769 containerd[2002]: time="2025-05-27T17:19:56.549715426Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=69230163" May 27 17:19:56.552445 containerd[2002]: time="2025-05-27T17:19:56.552393574Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:56.557835 containerd[2002]: time="2025-05-27T17:19:56.557757382Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:19:56.560606 containerd[2002]: time="2025-05-27T17:19:56.559954678Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 4.068849621s" May 27 17:19:56.560606 containerd[2002]: time="2025-05-27T17:19:56.560013766Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" May 27 17:20:00.305326 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 27 17:20:00.310560 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:20:00.645464 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:20:00.657688 (kubelet)[2773]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:20:00.737061 kubelet[2773]: E0527 17:20:00.736999 2773 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:20:00.742734 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:20:00.743199 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:20:00.744079 systemd[1]: kubelet.service: Consumed 285ms CPU time, 107.2M memory peak. May 27 17:20:05.511071 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:20:05.511485 systemd[1]: kubelet.service: Consumed 285ms CPU time, 107.2M memory peak. May 27 17:20:05.515321 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:20:05.565041 systemd[1]: Reload requested from client PID 2788 ('systemctl') (unit session-9.scope)... May 27 17:20:05.565272 systemd[1]: Reloading... May 27 17:20:05.811295 zram_generator::config[2835]: No configuration found. May 27 17:20:06.006346 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:20:06.265331 systemd[1]: Reloading finished in 699 ms. 
May 27 17:20:06.362156 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 27 17:20:06.362442 systemd[1]: kubelet.service: Failed with result 'signal'. May 27 17:20:06.362933 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:20:06.363004 systemd[1]: kubelet.service: Consumed 214ms CPU time, 95M memory peak. May 27 17:20:06.367862 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:20:06.712907 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:20:06.725765 (kubelet)[2896]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 17:20:06.797203 kubelet[2896]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 17:20:06.797203 kubelet[2896]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 27 17:20:06.797203 kubelet[2896]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 17:20:06.797203 kubelet[2896]: I0527 17:20:06.797067 2896 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 17:20:07.569349 update_engine[1978]: I20250527 17:20:07.569262 1978 update_attempter.cc:509] Updating boot flags... May 27 17:20:07.830336 kubelet[2896]: I0527 17:20:07.829625 2896 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" May 27 17:20:07.830336 kubelet[2896]: I0527 17:20:07.829677 2896 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 17:20:07.830336 kubelet[2896]: I0527 17:20:07.830101 2896 server.go:956] "Client rotation is on, will bootstrap in background" May 27 17:20:07.907352 kubelet[2896]: E0527 17:20:07.905341 2896 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://172.31.16.30:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.31.16.30:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" May 27 17:20:07.922822 kubelet[2896]: I0527 17:20:07.922609 2896 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 17:20:07.973064 kubelet[2896]: I0527 17:20:07.973013 2896 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 17:20:07.989681 kubelet[2896]: I0527 17:20:07.989626 2896 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 27 17:20:07.992472 kubelet[2896]: I0527 17:20:07.991003 2896 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 17:20:07.992472 kubelet[2896]: I0527 17:20:07.991067 2896 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-16-30","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 17:20:07.992472 kubelet[2896]: I0527 17:20:07.991438 2896 topology_manager.go:138] "Creating topology manager with none policy" May 27 17:20:07.992472 kubelet[2896]: I0527 17:20:07.991491 2896 container_manager_linux.go:303] "Creating device plugin manager" May 27 17:20:07.993212 kubelet[2896]: I0527 17:20:07.993154 2896 state_mem.go:36] "Initialized new in-memory state store" May 27 17:20:08.006047 kubelet[2896]: I0527 17:20:08.005988 2896 kubelet.go:480] "Attempting to sync node with API server" May 27 17:20:08.006047 kubelet[2896]: I0527 17:20:08.006037 2896 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 17:20:08.006260 kubelet[2896]: I0527 17:20:08.006085 2896 kubelet.go:386] "Adding apiserver pod source" May 27 17:20:08.006260 kubelet[2896]: I0527 17:20:08.006109 2896 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 17:20:08.015617 kubelet[2896]: E0527 17:20:08.015557 2896 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://172.31.16.30:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.16.30:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" May 27 17:20:08.019860 kubelet[2896]: E0527 17:20:08.016156 2896 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://172.31.16.30:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-16-30&limit=500&resourceVersion=0\": dial tcp 172.31.16.30:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" 
May 27 17:20:08.020035 kubelet[2896]: I0527 17:20:08.020012 2896 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 17:20:08.021062 kubelet[2896]: I0527 17:20:08.021017 2896 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" May 27 17:20:08.021169 kubelet[2896]: W0527 17:20:08.021143 2896 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 27 17:20:08.038252 kubelet[2896]: I0527 17:20:08.036654 2896 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 27 17:20:08.038252 kubelet[2896]: I0527 17:20:08.036730 2896 server.go:1289] "Started kubelet" May 27 17:20:08.046256 kubelet[2896]: I0527 17:20:08.045163 2896 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 May 27 17:20:08.053805 kubelet[2896]: I0527 17:20:08.053767 2896 server.go:317] "Adding debug handlers to kubelet server" May 27 17:20:08.059012 kubelet[2896]: I0527 17:20:08.058265 2896 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 17:20:08.072111 kubelet[2896]: I0527 17:20:08.072058 2896 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 17:20:08.073465 kubelet[2896]: I0527 17:20:08.073423 2896 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 17:20:08.078259 kubelet[2896]: I0527 17:20:08.077123 2896 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 17:20:08.084060 kubelet[2896]: E0527 17:20:08.077658 2896 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.31.16.30:6443/api/v1/namespaces/default/events\": dial tcp 172.31.16.30:6443: connect: connection refused" event="&Event{ObjectMeta:{ip-172-31-16-30.184371fc20409393 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-16-30,UID:ip-172-31-16-30,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-16-30,},FirstTimestamp:2025-05-27 17:20:08.036684691 +0000 UTC m=+1.304166368,LastTimestamp:2025-05-27 17:20:08.036684691 +0000 UTC m=+1.304166368,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-16-30,}" May 27 17:20:08.088427 kubelet[2896]: I0527 17:20:08.088380 2896 volume_manager.go:297] "Starting Kubelet Volume Manager" May 27 17:20:08.088858 kubelet[2896]: E0527 17:20:08.088809 2896 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ip-172-31-16-30\" not found" May 27 17:20:08.100043 kubelet[2896]: I0527 17:20:08.099996 2896 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 27 17:20:08.100375 kubelet[2896]: I0527 17:20:08.100338 2896 reconciler.go:26] "Reconciler: start to sync state" May 27 17:20:08.103330 kubelet[2896]: E0527 17:20:08.101212 2896 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.16.30:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-16-30?timeout=10s\": dial tcp 172.31.16.30:6443: connect: connection 
refused" interval="200ms" May 27 17:20:08.108991 kubelet[2896]: I0527 17:20:08.108942 2896 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 17:20:08.112849 kubelet[2896]: E0527 17:20:08.112582 2896 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://172.31.16.30:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.16.30:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" May 27 17:20:08.121249 kubelet[2896]: I0527 17:20:08.121175 2896 factory.go:223] Registration of the containerd container factory successfully May 27 17:20:08.121392 kubelet[2896]: I0527 17:20:08.121218 2896 factory.go:223] Registration of the systemd container factory successfully May 27 17:20:08.129358 kubelet[2896]: E0527 17:20:08.129293 2896 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 27 17:20:08.189355 kubelet[2896]: E0527 17:20:08.189309 2896 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ip-172-31-16-30\" not found" May 27 17:20:08.209707 kubelet[2896]: I0527 17:20:08.209675 2896 cpu_manager.go:221] "Starting CPU manager" policy="none" May 27 17:20:08.210859 kubelet[2896]: I0527 17:20:08.210504 2896 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 27 17:20:08.210859 kubelet[2896]: I0527 17:20:08.210546 2896 state_mem.go:36] "Initialized new in-memory state store" May 27 17:20:08.213628 kubelet[2896]: I0527 17:20:08.213591 2896 policy_none.go:49] "None policy: Start" May 27 17:20:08.213793 kubelet[2896]: I0527 17:20:08.213773 2896 memory_manager.go:186] "Starting memorymanager" policy="None" May 27 17:20:08.213916 kubelet[2896]: I0527 17:20:08.213897 2896 state_mem.go:35] "Initializing new in-memory state store" May 27 17:20:08.216893 kubelet[2896]: I0527 17:20:08.216810 2896 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" May 27 17:20:08.223328 kubelet[2896]: I0527 17:20:08.222396 2896 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" May 27 17:20:08.223328 kubelet[2896]: I0527 17:20:08.222447 2896 status_manager.go:230] "Starting to sync pod status with apiserver" May 27 17:20:08.223328 kubelet[2896]: I0527 17:20:08.222481 2896 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." May 27 17:20:08.223328 kubelet[2896]: I0527 17:20:08.222496 2896 kubelet.go:2436] "Starting kubelet main sync loop" May 27 17:20:08.223328 kubelet[2896]: E0527 17:20:08.222560 2896 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 17:20:08.244270 kubelet[2896]: E0527 17:20:08.243596 2896 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://172.31.16.30:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.16.30:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" May 27 17:20:08.288941 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
May 27 17:20:08.290130 kubelet[2896]: E0527 17:20:08.289972 2896 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ip-172-31-16-30\" not found" May 27 17:20:08.305480 kubelet[2896]: E0527 17:20:08.305414 2896 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.16.30:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-16-30?timeout=10s\": dial tcp 172.31.16.30:6443: connect: connection refused" interval="400ms" May 27 17:20:08.327278 kubelet[2896]: E0527 17:20:08.323849 2896 kubelet.go:2460] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 27 17:20:08.390782 kubelet[2896]: E0527 17:20:08.390633 2896 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ip-172-31-16-30\" not found" May 27 17:20:08.395423 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 27 17:20:08.458356 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 27 17:20:08.479127 kubelet[2896]: E0527 17:20:08.478956 2896 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" May 27 17:20:08.480110 kubelet[2896]: I0527 17:20:08.480071 2896 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 17:20:08.480204 kubelet[2896]: I0527 17:20:08.480107 2896 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 17:20:08.481772 kubelet[2896]: I0527 17:20:08.481598 2896 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 17:20:08.484065 kubelet[2896]: E0527 17:20:08.484019 2896 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" May 27 17:20:08.484308 kubelet[2896]: E0527 17:20:08.484274 2896 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-172-31-16-30\" not found" May 27 17:20:08.545623 systemd[1]: Created slice kubepods-burstable-podc5423689eb325f6a5fd3f1e0726a8071.slice - libcontainer container kubepods-burstable-podc5423689eb325f6a5fd3f1e0726a8071.slice. May 27 17:20:08.559899 kubelet[2896]: E0527 17:20:08.559846 2896 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-16-30\" not found" node="ip-172-31-16-30" May 27 17:20:08.566995 systemd[1]: Created slice kubepods-burstable-poded11aebc94f6fbe99046160594c25bb9.slice - libcontainer container kubepods-burstable-poded11aebc94f6fbe99046160594c25bb9.slice. May 27 17:20:08.572114 kubelet[2896]: E0527 17:20:08.572066 2896 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-16-30\" not found" node="ip-172-31-16-30" May 27 17:20:08.577545 systemd[1]: Created slice kubepods-burstable-pod53d91a5f240f52b2a0174be8d6356f66.slice - libcontainer container kubepods-burstable-pod53d91a5f240f52b2a0174be8d6356f66.slice. 
May 27 17:20:08.581975 kubelet[2896]: E0527 17:20:08.581914 2896 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-16-30\" not found" node="ip-172-31-16-30" May 27 17:20:08.584022 kubelet[2896]: I0527 17:20:08.583562 2896 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-16-30" May 27 17:20:08.584458 kubelet[2896]: E0527 17:20:08.584400 2896 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.16.30:6443/api/v1/nodes\": dial tcp 172.31.16.30:6443: connect: connection refused" node="ip-172-31-16-30" May 27 17:20:08.604956 kubelet[2896]: I0527 17:20:08.604906 2896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c5423689eb325f6a5fd3f1e0726a8071-k8s-certs\") pod \"kube-apiserver-ip-172-31-16-30\" (UID: \"c5423689eb325f6a5fd3f1e0726a8071\") " pod="kube-system/kube-apiserver-ip-172-31-16-30" May 27 17:20:08.605062 kubelet[2896]: I0527 17:20:08.604974 2896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c5423689eb325f6a5fd3f1e0726a8071-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-16-30\" (UID: \"c5423689eb325f6a5fd3f1e0726a8071\") " pod="kube-system/kube-apiserver-ip-172-31-16-30" May 27 17:20:08.605062 kubelet[2896]: I0527 17:20:08.605016 2896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ed11aebc94f6fbe99046160594c25bb9-kubeconfig\") pod \"kube-controller-manager-ip-172-31-16-30\" (UID: \"ed11aebc94f6fbe99046160594c25bb9\") " pod="kube-system/kube-controller-manager-ip-172-31-16-30" May 27 17:20:08.605178 kubelet[2896]: I0527 17:20:08.605052 2896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ed11aebc94f6fbe99046160594c25bb9-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-16-30\" (UID: \"ed11aebc94f6fbe99046160594c25bb9\") " pod="kube-system/kube-controller-manager-ip-172-31-16-30" May 27 17:20:08.605178 kubelet[2896]: I0527 17:20:08.605095 2896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/53d91a5f240f52b2a0174be8d6356f66-kubeconfig\") pod \"kube-scheduler-ip-172-31-16-30\" (UID: \"53d91a5f240f52b2a0174be8d6356f66\") " pod="kube-system/kube-scheduler-ip-172-31-16-30" May 27 17:20:08.605178 kubelet[2896]: I0527 17:20:08.605137 2896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c5423689eb325f6a5fd3f1e0726a8071-ca-certs\") pod \"kube-apiserver-ip-172-31-16-30\" (UID: \"c5423689eb325f6a5fd3f1e0726a8071\") " pod="kube-system/kube-apiserver-ip-172-31-16-30" May 27 17:20:08.605178 kubelet[2896]: I0527 17:20:08.605170 2896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ed11aebc94f6fbe99046160594c25bb9-ca-certs\") pod \"kube-controller-manager-ip-172-31-16-30\" (UID: \"ed11aebc94f6fbe99046160594c25bb9\") " pod="kube-system/kube-controller-manager-ip-172-31-16-30" May 27 17:20:08.605400 kubelet[2896]: I0527 
17:20:08.605201 2896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ed11aebc94f6fbe99046160594c25bb9-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-16-30\" (UID: \"ed11aebc94f6fbe99046160594c25bb9\") " pod="kube-system/kube-controller-manager-ip-172-31-16-30" May 27 17:20:08.605400 kubelet[2896]: I0527 17:20:08.605269 2896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ed11aebc94f6fbe99046160594c25bb9-k8s-certs\") pod \"kube-controller-manager-ip-172-31-16-30\" (UID: \"ed11aebc94f6fbe99046160594c25bb9\") " pod="kube-system/kube-controller-manager-ip-172-31-16-30" May 27 17:20:08.708136 kubelet[2896]: E0527 17:20:08.707087 2896 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.16.30:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-16-30?timeout=10s\": dial tcp 172.31.16.30:6443: connect: connection refused" interval="800ms" May 27 17:20:08.788107 kubelet[2896]: I0527 17:20:08.787975 2896 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-16-30" May 27 17:20:08.788854 kubelet[2896]: E0527 17:20:08.788794 2896 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.16.30:6443/api/v1/nodes\": dial tcp 172.31.16.30:6443: connect: connection refused" node="ip-172-31-16-30" May 27 17:20:08.863831 containerd[2002]: time="2025-05-27T17:20:08.863462987Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-16-30,Uid:c5423689eb325f6a5fd3f1e0726a8071,Namespace:kube-system,Attempt:0,}" May 27 17:20:08.874291 containerd[2002]: time="2025-05-27T17:20:08.874215119Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-16-30,Uid:ed11aebc94f6fbe99046160594c25bb9,Namespace:kube-system,Attempt:0,}" May 27 17:20:08.883971 containerd[2002]: time="2025-05-27T17:20:08.883900211Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-16-30,Uid:53d91a5f240f52b2a0174be8d6356f66,Namespace:kube-system,Attempt:0,}" May 27 17:20:08.905514 kubelet[2896]: E0527 17:20:08.905434 2896 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://172.31.16.30:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.16.30:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" May 27 17:20:08.925865 containerd[2002]: time="2025-05-27T17:20:08.925805639Z" level=info msg="connecting to shim 6145441a6070f8a066e566489fc5836177f036c9d7547a86b2e2bb35249c91aa" address="unix:///run/containerd/s/4f5b8f94084f64964b5a4b0ec13ef6ce2e2572e6b2f2467922eba263d6b89da3" namespace=k8s.io protocol=ttrpc version=3 May 27 17:20:08.956854 containerd[2002]: time="2025-05-27T17:20:08.956794763Z" level=info msg="connecting to shim 3163af1c05f0369bde36bdfc0082f7d5b81bfb4122d6c94d3abda12ab1bd0157" address="unix:///run/containerd/s/884857095a10245ee19bae6880350278d2ee7a1b8d6a4d48e32cb8e03c7b9311" namespace=k8s.io protocol=ttrpc version=3 May 27 17:20:08.971842 containerd[2002]: time="2025-05-27T17:20:08.971432351Z" level=info msg="connecting to shim 87ee1565ad0095e8c9a0d16e0c0f8c6850921064d0b5d66bf768fa1adef10c84" 
address="unix:///run/containerd/s/ab5f99fc7ca4794a8b528b2235b7eb2ce93431916c775dcb31a23085b8c98b52" namespace=k8s.io protocol=ttrpc version=3 May 27 17:20:09.026701 systemd[1]: Started cri-containerd-3163af1c05f0369bde36bdfc0082f7d5b81bfb4122d6c94d3abda12ab1bd0157.scope - libcontainer container 3163af1c05f0369bde36bdfc0082f7d5b81bfb4122d6c94d3abda12ab1bd0157. May 27 17:20:09.046584 systemd[1]: Started cri-containerd-6145441a6070f8a066e566489fc5836177f036c9d7547a86b2e2bb35249c91aa.scope - libcontainer container 6145441a6070f8a066e566489fc5836177f036c9d7547a86b2e2bb35249c91aa. May 27 17:20:09.064790 systemd[1]: Started cri-containerd-87ee1565ad0095e8c9a0d16e0c0f8c6850921064d0b5d66bf768fa1adef10c84.scope - libcontainer container 87ee1565ad0095e8c9a0d16e0c0f8c6850921064d0b5d66bf768fa1adef10c84. May 27 17:20:09.183059 containerd[2002]: time="2025-05-27T17:20:09.182993252Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-16-30,Uid:ed11aebc94f6fbe99046160594c25bb9,Namespace:kube-system,Attempt:0,} returns sandbox id \"3163af1c05f0369bde36bdfc0082f7d5b81bfb4122d6c94d3abda12ab1bd0157\"" May 27 17:20:09.194609 kubelet[2896]: I0527 17:20:09.194572 2896 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-16-30" May 27 17:20:09.196086 kubelet[2896]: E0527 17:20:09.195885 2896 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.16.30:6443/api/v1/nodes\": dial tcp 172.31.16.30:6443: connect: connection refused" node="ip-172-31-16-30" May 27 17:20:09.197721 containerd[2002]: time="2025-05-27T17:20:09.197670368Z" level=info msg="CreateContainer within sandbox \"3163af1c05f0369bde36bdfc0082f7d5b81bfb4122d6c94d3abda12ab1bd0157\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 27 17:20:09.198183 containerd[2002]: time="2025-05-27T17:20:09.197795156Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-16-30,Uid:c5423689eb325f6a5fd3f1e0726a8071,Namespace:kube-system,Attempt:0,} returns sandbox id \"6145441a6070f8a066e566489fc5836177f036c9d7547a86b2e2bb35249c91aa\"" May 27 17:20:09.206430 containerd[2002]: time="2025-05-27T17:20:09.206370404Z" level=info msg="CreateContainer within sandbox \"6145441a6070f8a066e566489fc5836177f036c9d7547a86b2e2bb35249c91aa\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 27 17:20:09.216960 containerd[2002]: time="2025-05-27T17:20:09.216906332Z" level=info msg="Container 358f33eb7236eb13b0be134b69f73c3b735a6ea8ec9375cb258b454b7b463ac6: CDI devices from CRI Config.CDIDevices: []" May 27 17:20:09.221801 containerd[2002]: time="2025-05-27T17:20:09.221738732Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-16-30,Uid:53d91a5f240f52b2a0174be8d6356f66,Namespace:kube-system,Attempt:0,} returns sandbox id \"87ee1565ad0095e8c9a0d16e0c0f8c6850921064d0b5d66bf768fa1adef10c84\"" May 27 17:20:09.227737 containerd[2002]: time="2025-05-27T17:20:09.226605260Z" level=info msg="Container 82eabdb8ed39a54fb81df5732fd248ec4ea7754357287beb7302866bc029a46b: CDI devices from CRI Config.CDIDevices: []" May 27 17:20:09.233018 containerd[2002]: time="2025-05-27T17:20:09.232935273Z" level=info msg="CreateContainer within sandbox \"3163af1c05f0369bde36bdfc0082f7d5b81bfb4122d6c94d3abda12ab1bd0157\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"358f33eb7236eb13b0be134b69f73c3b735a6ea8ec9375cb258b454b7b463ac6\"" May 27 17:20:09.233542 
containerd[2002]: time="2025-05-27T17:20:09.233432673Z" level=info msg="CreateContainer within sandbox \"87ee1565ad0095e8c9a0d16e0c0f8c6850921064d0b5d66bf768fa1adef10c84\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 27 17:20:09.235118 containerd[2002]: time="2025-05-27T17:20:09.235059393Z" level=info msg="StartContainer for \"358f33eb7236eb13b0be134b69f73c3b735a6ea8ec9375cb258b454b7b463ac6\"" May 27 17:20:09.237113 containerd[2002]: time="2025-05-27T17:20:09.237056109Z" level=info msg="connecting to shim 358f33eb7236eb13b0be134b69f73c3b735a6ea8ec9375cb258b454b7b463ac6" address="unix:///run/containerd/s/884857095a10245ee19bae6880350278d2ee7a1b8d6a4d48e32cb8e03c7b9311" protocol=ttrpc version=3 May 27 17:20:09.242530 containerd[2002]: time="2025-05-27T17:20:09.242451213Z" level=info msg="CreateContainer within sandbox \"6145441a6070f8a066e566489fc5836177f036c9d7547a86b2e2bb35249c91aa\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"82eabdb8ed39a54fb81df5732fd248ec4ea7754357287beb7302866bc029a46b\"" May 27 17:20:09.244946 containerd[2002]: time="2025-05-27T17:20:09.244884009Z" level=info msg="StartContainer for \"82eabdb8ed39a54fb81df5732fd248ec4ea7754357287beb7302866bc029a46b\"" May 27 17:20:09.246904 containerd[2002]: time="2025-05-27T17:20:09.246828777Z" level=info msg="connecting to shim 82eabdb8ed39a54fb81df5732fd248ec4ea7754357287beb7302866bc029a46b" address="unix:///run/containerd/s/4f5b8f94084f64964b5a4b0ec13ef6ce2e2572e6b2f2467922eba263d6b89da3" protocol=ttrpc version=3 May 27 17:20:09.260723 kubelet[2896]: E0527 17:20:09.260666 2896 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://172.31.16.30:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-16-30&limit=500&resourceVersion=0\": dial tcp 172.31.16.30:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" May 27 17:20:09.267493 containerd[2002]: time="2025-05-27T17:20:09.267433029Z" level=info msg="Container d3bf5fb135e0c35cfebd366c59ee5340390003ef10054061df5be39cc26f4747: CDI devices from CRI Config.CDIDevices: []" May 27 17:20:09.284583 containerd[2002]: time="2025-05-27T17:20:09.284390049Z" level=info msg="CreateContainer within sandbox \"87ee1565ad0095e8c9a0d16e0c0f8c6850921064d0b5d66bf768fa1adef10c84\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"d3bf5fb135e0c35cfebd366c59ee5340390003ef10054061df5be39cc26f4747\"" May 27 17:20:09.289267 containerd[2002]: time="2025-05-27T17:20:09.287486961Z" level=info msg="StartContainer for \"d3bf5fb135e0c35cfebd366c59ee5340390003ef10054061df5be39cc26f4747\"" May 27 17:20:09.292212 containerd[2002]: time="2025-05-27T17:20:09.292041837Z" level=info msg="connecting to shim d3bf5fb135e0c35cfebd366c59ee5340390003ef10054061df5be39cc26f4747" address="unix:///run/containerd/s/ab5f99fc7ca4794a8b528b2235b7eb2ce93431916c775dcb31a23085b8c98b52" protocol=ttrpc version=3 May 27 17:20:09.295823 systemd[1]: Started cri-containerd-82eabdb8ed39a54fb81df5732fd248ec4ea7754357287beb7302866bc029a46b.scope - libcontainer container 82eabdb8ed39a54fb81df5732fd248ec4ea7754357287beb7302866bc029a46b. 
May 27 17:20:09.298605 kubelet[2896]: E0527 17:20:09.298531 2896 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://172.31.16.30:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.16.30:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" May 27 17:20:09.320327 systemd[1]: Started cri-containerd-358f33eb7236eb13b0be134b69f73c3b735a6ea8ec9375cb258b454b7b463ac6.scope - libcontainer container 358f33eb7236eb13b0be134b69f73c3b735a6ea8ec9375cb258b454b7b463ac6. May 27 17:20:09.353537 systemd[1]: Started cri-containerd-d3bf5fb135e0c35cfebd366c59ee5340390003ef10054061df5be39cc26f4747.scope - libcontainer container d3bf5fb135e0c35cfebd366c59ee5340390003ef10054061df5be39cc26f4747. May 27 17:20:09.473067 containerd[2002]: time="2025-05-27T17:20:09.472879714Z" level=info msg="StartContainer for \"82eabdb8ed39a54fb81df5732fd248ec4ea7754357287beb7302866bc029a46b\" returns successfully" May 27 17:20:09.505124 containerd[2002]: time="2025-05-27T17:20:09.504464386Z" level=info msg="StartContainer for \"358f33eb7236eb13b0be134b69f73c3b735a6ea8ec9375cb258b454b7b463ac6\" returns successfully" May 27 17:20:09.508214 kubelet[2896]: E0527 17:20:09.508131 2896 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.16.30:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-16-30?timeout=10s\": dial tcp 172.31.16.30:6443: connect: connection refused" interval="1.6s" May 27 17:20:09.525773 containerd[2002]: time="2025-05-27T17:20:09.525696502Z" level=info msg="StartContainer for \"d3bf5fb135e0c35cfebd366c59ee5340390003ef10054061df5be39cc26f4747\" returns successfully" May 27 17:20:10.000601 kubelet[2896]: I0527 17:20:09.999480 2896 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-16-30" May 27 17:20:10.284138 kubelet[2896]: E0527 17:20:10.283549 2896 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-16-30\" not found" node="ip-172-31-16-30" May 27 17:20:10.292200 kubelet[2896]: E0527 17:20:10.292151 2896 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-16-30\" not found" node="ip-172-31-16-30" May 27 17:20:10.297252 kubelet[2896]: E0527 17:20:10.297005 2896 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-16-30\" not found" node="ip-172-31-16-30" May 27 17:20:11.301259 kubelet[2896]: E0527 17:20:11.301181 2896 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-16-30\" not found" node="ip-172-31-16-30" May 27 17:20:11.303290 kubelet[2896]: E0527 17:20:11.302384 2896 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-16-30\" not found" node="ip-172-31-16-30" May 27 17:20:11.303290 kubelet[2896]: E0527 17:20:11.302991 2896 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-16-30\" not found" node="ip-172-31-16-30" May 27 17:20:12.304654 kubelet[2896]: E0527 17:20:12.304618 2896 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-16-30\" not found" node="ip-172-31-16-30" May 27 17:20:12.305844 kubelet[2896]: E0527 
17:20:12.305812 2896 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-16-30\" not found" node="ip-172-31-16-30" May 27 17:20:12.306250 kubelet[2896]: E0527 17:20:12.303224 2896 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-16-30\" not found" node="ip-172-31-16-30" May 27 17:20:12.919157 kubelet[2896]: E0527 17:20:12.919100 2896 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-172-31-16-30\" not found" node="ip-172-31-16-30" May 27 17:20:13.011282 kubelet[2896]: I0527 17:20:13.010654 2896 apiserver.go:52] "Watching apiserver" May 27 17:20:13.021165 kubelet[2896]: I0527 17:20:13.021110 2896 kubelet_node_status.go:78] "Successfully registered node" node="ip-172-31-16-30" May 27 17:20:13.095355 kubelet[2896]: I0527 17:20:13.095298 2896 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-16-30" May 27 17:20:13.106972 kubelet[2896]: I0527 17:20:13.106733 2896 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 27 17:20:13.153326 kubelet[2896]: E0527 17:20:13.153267 2896 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ip-172-31-16-30\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ip-172-31-16-30" May 27 17:20:13.153326 kubelet[2896]: I0527 17:20:13.153318 2896 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-16-30" May 27 17:20:13.171158 kubelet[2896]: E0527 17:20:13.170965 2896 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ip-172-31-16-30\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ip-172-31-16-30" May 27 17:20:13.171158 kubelet[2896]: I0527 17:20:13.171033 2896 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-16-30" May 27 17:20:13.193588 kubelet[2896]: E0527 17:20:13.193536 2896 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ip-172-31-16-30\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ip-172-31-16-30" May 27 17:20:13.894253 kubelet[2896]: I0527 17:20:13.894089 2896 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-16-30" May 27 17:20:16.422765 systemd[1]: Reload requested from client PID 3363 ('systemctl') (unit session-9.scope)... May 27 17:20:16.423280 systemd[1]: Reloading... May 27 17:20:16.612329 zram_generator::config[3410]: No configuration found. May 27 17:20:16.801186 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:20:17.099283 systemd[1]: Reloading finished in 675 ms. May 27 17:20:17.142491 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:20:17.159626 systemd[1]: kubelet.service: Deactivated successfully. May 27 17:20:17.161318 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:20:17.161412 systemd[1]: kubelet.service: Consumed 1.953s CPU time, 125.8M memory peak. 
May 27 17:20:17.165154 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:20:17.555444 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:20:17.571907 (kubelet)[3467]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 17:20:17.681980 kubelet[3467]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 17:20:17.681980 kubelet[3467]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 27 17:20:17.681980 kubelet[3467]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 17:20:17.683118 kubelet[3467]: I0527 17:20:17.682705 3467 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 17:20:17.702607 kubelet[3467]: I0527 17:20:17.702563 3467 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" May 27 17:20:17.703264 kubelet[3467]: I0527 17:20:17.702778 3467 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 17:20:17.703502 kubelet[3467]: I0527 17:20:17.703475 3467 server.go:956] "Client rotation is on, will bootstrap in background" May 27 17:20:17.706052 kubelet[3467]: I0527 17:20:17.706013 3467 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" May 27 17:20:17.711113 kubelet[3467]: I0527 17:20:17.711070 3467 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 17:20:17.738839 kubelet[3467]: I0527 17:20:17.738800 3467 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 17:20:17.746212 kubelet[3467]: I0527 17:20:17.746076 3467 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 27 17:20:17.746655 kubelet[3467]: I0527 17:20:17.746611 3467 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 17:20:17.747050 kubelet[3467]: I0527 17:20:17.746770 3467 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-16-30","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 17:20:17.747312 kubelet[3467]: I0527 17:20:17.747287 3467 topology_manager.go:138] "Creating topology manager with none policy" May 27 17:20:17.747419 kubelet[3467]: I0527 17:20:17.747402 3467 container_manager_linux.go:303] "Creating device plugin manager" May 27 17:20:17.747602 kubelet[3467]: I0527 17:20:17.747584 3467 state_mem.go:36] "Initialized new in-memory state store" May 27 17:20:17.747997 kubelet[3467]: I0527 17:20:17.747948 3467 kubelet.go:480] "Attempting to sync node with API server" May 27 17:20:17.748782 kubelet[3467]: I0527 17:20:17.748719 3467 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 17:20:17.748986 kubelet[3467]: I0527 17:20:17.748947 3467 kubelet.go:386] "Adding apiserver pod source" May 27 17:20:17.750310 kubelet[3467]: I0527 17:20:17.750276 3467 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 17:20:17.759271 kubelet[3467]: I0527 17:20:17.758644 3467 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 17:20:17.760675 kubelet[3467]: I0527 17:20:17.760637 3467 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" May 27 17:20:17.767818 kubelet[3467]: I0527 17:20:17.767743 3467 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 27 17:20:17.769481 kubelet[3467]: I0527 17:20:17.769440 3467 server.go:1289] "Started kubelet" May 27 17:20:17.776536 kubelet[3467]: I0527 17:20:17.776214 3467 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 17:20:17.794773 kubelet[3467]: I0527 
17:20:17.793290 3467 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 May 27 17:20:17.811876 kubelet[3467]: I0527 17:20:17.810586 3467 server.go:317] "Adding debug handlers to kubelet server" May 27 17:20:17.813639 kubelet[3467]: I0527 17:20:17.802789 3467 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 17:20:17.828745 kubelet[3467]: I0527 17:20:17.828711 3467 factory.go:223] Registration of the systemd container factory successfully May 27 17:20:17.829081 kubelet[3467]: I0527 17:20:17.829045 3467 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 17:20:17.845384 kubelet[3467]: I0527 17:20:17.805548 3467 volume_manager.go:297] "Starting Kubelet Volume Manager" May 27 17:20:17.851267 kubelet[3467]: I0527 17:20:17.805567 3467 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 27 17:20:17.853087 kubelet[3467]: I0527 17:20:17.794993 3467 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 17:20:17.853087 kubelet[3467]: I0527 17:20:17.852198 3467 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 17:20:17.853087 kubelet[3467]: I0527 17:20:17.852571 3467 reconciler.go:26] "Reconciler: start to sync state" May 27 17:20:17.853393 kubelet[3467]: E0527 17:20:17.805767 3467 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ip-172-31-16-30\" not found" May 27 17:20:17.865898 kubelet[3467]: E0527 17:20:17.865858 3467 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 27 17:20:17.872266 kubelet[3467]: I0527 17:20:17.872058 3467 factory.go:223] Registration of the containerd container factory successfully May 27 17:20:17.932488 kubelet[3467]: I0527 17:20:17.932342 3467 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" May 27 17:20:17.951114 kubelet[3467]: I0527 17:20:17.949181 3467 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" May 27 17:20:17.951114 kubelet[3467]: I0527 17:20:17.949647 3467 status_manager.go:230] "Starting to sync pod status with apiserver" May 27 17:20:17.951114 kubelet[3467]: I0527 17:20:17.949797 3467 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
May 27 17:20:17.951114 kubelet[3467]: I0527 17:20:17.949815 3467 kubelet.go:2436] "Starting kubelet main sync loop" May 27 17:20:17.956137 kubelet[3467]: E0527 17:20:17.954354 3467 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 17:20:18.061138 kubelet[3467]: E0527 17:20:18.059331 3467 kubelet.go:2460] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 27 17:20:18.096342 kubelet[3467]: I0527 17:20:18.096206 3467 cpu_manager.go:221] "Starting CPU manager" policy="none" May 27 17:20:18.096342 kubelet[3467]: I0527 17:20:18.096276 3467 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 27 17:20:18.096342 kubelet[3467]: I0527 17:20:18.096313 3467 state_mem.go:36] "Initialized new in-memory state store" May 27 17:20:18.098217 kubelet[3467]: I0527 17:20:18.096682 3467 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 27 17:20:18.098217 kubelet[3467]: I0527 17:20:18.096749 3467 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 27 17:20:18.098217 kubelet[3467]: I0527 17:20:18.096817 3467 policy_none.go:49] "None policy: Start" May 27 17:20:18.098217 kubelet[3467]: I0527 17:20:18.096845 3467 memory_manager.go:186] "Starting memorymanager" policy="None" May 27 17:20:18.098217 kubelet[3467]: I0527 17:20:18.096866 3467 state_mem.go:35] "Initializing new in-memory state store" May 27 17:20:18.098217 kubelet[3467]: I0527 17:20:18.097210 3467 state_mem.go:75] "Updated machine memory state" May 27 17:20:18.112799 kubelet[3467]: E0527 17:20:18.112758 3467 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" May 27 17:20:18.113075 kubelet[3467]: I0527 17:20:18.113044 3467 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 17:20:18.113188 kubelet[3467]: I0527 17:20:18.113074 3467 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 17:20:18.120332 kubelet[3467]: I0527 17:20:18.119387 3467 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 17:20:18.126815 kubelet[3467]: E0527 17:20:18.126763 3467 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" May 27 17:20:18.255763 kubelet[3467]: I0527 17:20:18.255717 3467 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-16-30" May 27 17:20:18.263490 kubelet[3467]: I0527 17:20:18.262968 3467 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-16-30" May 27 17:20:18.263490 kubelet[3467]: I0527 17:20:18.261548 3467 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-16-30" May 27 17:20:18.264047 kubelet[3467]: I0527 17:20:18.264002 3467 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-16-30" May 27 17:20:18.280420 kubelet[3467]: I0527 17:20:18.280354 3467 kubelet_node_status.go:124] "Node was previously registered" node="ip-172-31-16-30" May 27 17:20:18.281094 kubelet[3467]: I0527 17:20:18.280635 3467 kubelet_node_status.go:78] "Successfully registered node" node="ip-172-31-16-30" May 27 17:20:18.296578 kubelet[3467]: E0527 17:20:18.296526 3467 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ip-172-31-16-30\" already exists" pod="kube-system/kube-apiserver-ip-172-31-16-30" May 27 17:20:18.361258 kubelet[3467]: I0527 17:20:18.361088 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/53d91a5f240f52b2a0174be8d6356f66-kubeconfig\") pod \"kube-scheduler-ip-172-31-16-30\" (UID: \"53d91a5f240f52b2a0174be8d6356f66\") " pod="kube-system/kube-scheduler-ip-172-31-16-30" May 27 17:20:18.361258 kubelet[3467]: I0527 17:20:18.361162 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ed11aebc94f6fbe99046160594c25bb9-ca-certs\") pod \"kube-controller-manager-ip-172-31-16-30\" (UID: \"ed11aebc94f6fbe99046160594c25bb9\") " pod="kube-system/kube-controller-manager-ip-172-31-16-30" May 27 17:20:18.361456 kubelet[3467]: I0527 17:20:18.361210 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ed11aebc94f6fbe99046160594c25bb9-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-16-30\" (UID: \"ed11aebc94f6fbe99046160594c25bb9\") " pod="kube-system/kube-controller-manager-ip-172-31-16-30" May 27 17:20:18.361510 kubelet[3467]: I0527 17:20:18.361481 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ed11aebc94f6fbe99046160594c25bb9-k8s-certs\") pod \"kube-controller-manager-ip-172-31-16-30\" (UID: \"ed11aebc94f6fbe99046160594c25bb9\") " pod="kube-system/kube-controller-manager-ip-172-31-16-30" May 27 17:20:18.361560 kubelet[3467]: I0527 17:20:18.361520 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ed11aebc94f6fbe99046160594c25bb9-kubeconfig\") pod \"kube-controller-manager-ip-172-31-16-30\" (UID: \"ed11aebc94f6fbe99046160594c25bb9\") " pod="kube-system/kube-controller-manager-ip-172-31-16-30" May 27 17:20:18.361616 kubelet[3467]: I0527 17:20:18.361556 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/ed11aebc94f6fbe99046160594c25bb9-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-16-30\" (UID: \"ed11aebc94f6fbe99046160594c25bb9\") " pod="kube-system/kube-controller-manager-ip-172-31-16-30" May 27 17:20:18.361616 kubelet[3467]: I0527 17:20:18.361595 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c5423689eb325f6a5fd3f1e0726a8071-ca-certs\") pod \"kube-apiserver-ip-172-31-16-30\" (UID: \"c5423689eb325f6a5fd3f1e0726a8071\") " pod="kube-system/kube-apiserver-ip-172-31-16-30" May 27 17:20:18.361720 kubelet[3467]: I0527 17:20:18.361628 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c5423689eb325f6a5fd3f1e0726a8071-k8s-certs\") pod \"kube-apiserver-ip-172-31-16-30\" (UID: \"c5423689eb325f6a5fd3f1e0726a8071\") " pod="kube-system/kube-apiserver-ip-172-31-16-30" May 27 17:20:18.361720 kubelet[3467]: I0527 17:20:18.361666 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c5423689eb325f6a5fd3f1e0726a8071-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-16-30\" (UID: \"c5423689eb325f6a5fd3f1e0726a8071\") " pod="kube-system/kube-apiserver-ip-172-31-16-30" May 27 17:20:18.751860 kubelet[3467]: I0527 17:20:18.751681 3467 apiserver.go:52] "Watching apiserver" May 27 17:20:18.851485 kubelet[3467]: I0527 17:20:18.851405 3467 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 27 17:20:19.013370 kubelet[3467]: I0527 17:20:19.013081 3467 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-16-30" May 27 17:20:19.014486 kubelet[3467]: I0527 17:20:19.014435 3467 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-16-30" May 27 17:20:19.030655 kubelet[3467]: E0527 17:20:19.030597 3467 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ip-172-31-16-30\" already exists" pod="kube-system/kube-apiserver-ip-172-31-16-30" May 27 17:20:19.037824 kubelet[3467]: E0527 17:20:19.037772 3467 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ip-172-31-16-30\" already exists" pod="kube-system/kube-controller-manager-ip-172-31-16-30" May 27 17:20:19.084884 kubelet[3467]: I0527 17:20:19.084662 3467 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ip-172-31-16-30" podStartSLOduration=1.084617525 podStartE2EDuration="1.084617525s" podCreationTimestamp="2025-05-27 17:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:20:19.068370413 +0000 UTC m=+1.485250232" watchObservedRunningTime="2025-05-27 17:20:19.084617525 +0000 UTC m=+1.501497344" May 27 17:20:19.136613 kubelet[3467]: I0527 17:20:19.136367 3467 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ip-172-31-16-30" podStartSLOduration=1.136346874 podStartE2EDuration="1.136346874s" podCreationTimestamp="2025-05-27 17:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:20:19.087865745 
+0000 UTC m=+1.504745552" watchObservedRunningTime="2025-05-27 17:20:19.136346874 +0000 UTC m=+1.553226717" May 27 17:20:19.137210 kubelet[3467]: I0527 17:20:19.137002 3467 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ip-172-31-16-30" podStartSLOduration=6.136981866 podStartE2EDuration="6.136981866s" podCreationTimestamp="2025-05-27 17:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:20:19.136112694 +0000 UTC m=+1.552992513" watchObservedRunningTime="2025-05-27 17:20:19.136981866 +0000 UTC m=+1.553861781" May 27 17:20:22.519888 kubelet[3467]: I0527 17:20:22.519833 3467 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 27 17:20:22.522224 containerd[2002]: time="2025-05-27T17:20:22.520876127Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 27 17:20:22.524367 kubelet[3467]: I0527 17:20:22.521606 3467 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 27 17:20:23.449882 systemd[1]: Created slice kubepods-besteffort-podf42a994b_4261_420c_865e_87a75c2fa587.slice - libcontainer container kubepods-besteffort-podf42a994b_4261_420c_865e_87a75c2fa587.slice. May 27 17:20:23.490879 kubelet[3467]: I0527 17:20:23.490808 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/f42a994b-4261-420c-865e-87a75c2fa587-kube-proxy\") pod \"kube-proxy-hszck\" (UID: \"f42a994b-4261-420c-865e-87a75c2fa587\") " pod="kube-system/kube-proxy-hszck" May 27 17:20:23.490879 kubelet[3467]: I0527 17:20:23.490877 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f42a994b-4261-420c-865e-87a75c2fa587-xtables-lock\") pod \"kube-proxy-hszck\" (UID: \"f42a994b-4261-420c-865e-87a75c2fa587\") " pod="kube-system/kube-proxy-hszck" May 27 17:20:23.491093 kubelet[3467]: I0527 17:20:23.490922 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f42a994b-4261-420c-865e-87a75c2fa587-lib-modules\") pod \"kube-proxy-hszck\" (UID: \"f42a994b-4261-420c-865e-87a75c2fa587\") " pod="kube-system/kube-proxy-hszck" May 27 17:20:23.491093 kubelet[3467]: I0527 17:20:23.490958 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtvqp\" (UniqueName: \"kubernetes.io/projected/f42a994b-4261-420c-865e-87a75c2fa587-kube-api-access-wtvqp\") pod \"kube-proxy-hszck\" (UID: \"f42a994b-4261-420c-865e-87a75c2fa587\") " pod="kube-system/kube-proxy-hszck" May 27 17:20:23.715402 systemd[1]: Created slice kubepods-besteffort-pod5c044ac0_3299_4de0_bd19_31727d286198.slice - libcontainer container kubepods-besteffort-pod5c044ac0_3299_4de0_bd19_31727d286198.slice. 
May 27 17:20:23.763982 containerd[2002]: time="2025-05-27T17:20:23.763926409Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hszck,Uid:f42a994b-4261-420c-865e-87a75c2fa587,Namespace:kube-system,Attempt:0,}" May 27 17:20:23.792911 kubelet[3467]: I0527 17:20:23.792693 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/5c044ac0-3299-4de0-bd19-31727d286198-var-lib-calico\") pod \"tigera-operator-844669ff44-ztkbd\" (UID: \"5c044ac0-3299-4de0-bd19-31727d286198\") " pod="tigera-operator/tigera-operator-844669ff44-ztkbd" May 27 17:20:23.792911 kubelet[3467]: I0527 17:20:23.792763 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfzzv\" (UniqueName: \"kubernetes.io/projected/5c044ac0-3299-4de0-bd19-31727d286198-kube-api-access-gfzzv\") pod \"tigera-operator-844669ff44-ztkbd\" (UID: \"5c044ac0-3299-4de0-bd19-31727d286198\") " pod="tigera-operator/tigera-operator-844669ff44-ztkbd" May 27 17:20:23.802536 containerd[2002]: time="2025-05-27T17:20:23.802465825Z" level=info msg="connecting to shim 1787ec89962884ac5f414dc5a2477145c12f211273f7a839fc1e9222577a0855" address="unix:///run/containerd/s/50c3d3d5ac5bca06a5afe62593663287d7b99e3798d2777db6ca0130aa83c58c" namespace=k8s.io protocol=ttrpc version=3 May 27 17:20:23.850553 systemd[1]: Started cri-containerd-1787ec89962884ac5f414dc5a2477145c12f211273f7a839fc1e9222577a0855.scope - libcontainer container 1787ec89962884ac5f414dc5a2477145c12f211273f7a839fc1e9222577a0855. May 27 17:20:23.902206 containerd[2002]: time="2025-05-27T17:20:23.902110297Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hszck,Uid:f42a994b-4261-420c-865e-87a75c2fa587,Namespace:kube-system,Attempt:0,} returns sandbox id \"1787ec89962884ac5f414dc5a2477145c12f211273f7a839fc1e9222577a0855\"" May 27 17:20:23.915179 containerd[2002]: time="2025-05-27T17:20:23.915092941Z" level=info msg="CreateContainer within sandbox \"1787ec89962884ac5f414dc5a2477145c12f211273f7a839fc1e9222577a0855\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 27 17:20:23.941828 containerd[2002]: time="2025-05-27T17:20:23.941497994Z" level=info msg="Container e2a83ad8613a21b5ec2e76f5c9346fafa54c2a09d4b5f387610fe7a6f74d5aab: CDI devices from CRI Config.CDIDevices: []" May 27 17:20:23.958350 containerd[2002]: time="2025-05-27T17:20:23.958299878Z" level=info msg="CreateContainer within sandbox \"1787ec89962884ac5f414dc5a2477145c12f211273f7a839fc1e9222577a0855\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"e2a83ad8613a21b5ec2e76f5c9346fafa54c2a09d4b5f387610fe7a6f74d5aab\"" May 27 17:20:23.959532 containerd[2002]: time="2025-05-27T17:20:23.959475506Z" level=info msg="StartContainer for \"e2a83ad8613a21b5ec2e76f5c9346fafa54c2a09d4b5f387610fe7a6f74d5aab\"" May 27 17:20:23.967291 containerd[2002]: time="2025-05-27T17:20:23.967177766Z" level=info msg="connecting to shim e2a83ad8613a21b5ec2e76f5c9346fafa54c2a09d4b5f387610fe7a6f74d5aab" address="unix:///run/containerd/s/50c3d3d5ac5bca06a5afe62593663287d7b99e3798d2777db6ca0130aa83c58c" protocol=ttrpc version=3 May 27 17:20:24.002964 systemd[1]: Started cri-containerd-e2a83ad8613a21b5ec2e76f5c9346fafa54c2a09d4b5f387610fe7a6f74d5aab.scope - libcontainer container e2a83ad8613a21b5ec2e76f5c9346fafa54c2a09d4b5f387610fe7a6f74d5aab. 
May 27 17:20:24.027576 containerd[2002]: time="2025-05-27T17:20:24.027516034Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-ztkbd,Uid:5c044ac0-3299-4de0-bd19-31727d286198,Namespace:tigera-operator,Attempt:0,}" May 27 17:20:24.080291 containerd[2002]: time="2025-05-27T17:20:24.079581862Z" level=info msg="connecting to shim 83dc96036e9dc5457340d2afc1212cc6b1f10d6e8741e90ecaec62348ea50197" address="unix:///run/containerd/s/59beb8f324b51684c6071245244943b74c246f9f0e5df3bd4267649537e2acd6" namespace=k8s.io protocol=ttrpc version=3 May 27 17:20:24.111851 containerd[2002]: time="2025-05-27T17:20:24.111777910Z" level=info msg="StartContainer for \"e2a83ad8613a21b5ec2e76f5c9346fafa54c2a09d4b5f387610fe7a6f74d5aab\" returns successfully" May 27 17:20:24.134732 systemd[1]: Started cri-containerd-83dc96036e9dc5457340d2afc1212cc6b1f10d6e8741e90ecaec62348ea50197.scope - libcontainer container 83dc96036e9dc5457340d2afc1212cc6b1f10d6e8741e90ecaec62348ea50197. May 27 17:20:24.231184 containerd[2002]: time="2025-05-27T17:20:24.231063599Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-ztkbd,Uid:5c044ac0-3299-4de0-bd19-31727d286198,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"83dc96036e9dc5457340d2afc1212cc6b1f10d6e8741e90ecaec62348ea50197\"" May 27 17:20:24.235958 containerd[2002]: time="2025-05-27T17:20:24.235877099Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\"" May 27 17:20:24.615888 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2573009943.mount: Deactivated successfully. May 27 17:20:25.839579 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1207131890.mount: Deactivated successfully. May 27 17:20:26.972095 containerd[2002]: time="2025-05-27T17:20:26.972029489Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:20:26.973497 containerd[2002]: time="2025-05-27T17:20:26.973430537Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.0: active requests=0, bytes read=22143480" May 27 17:20:26.974837 containerd[2002]: time="2025-05-27T17:20:26.974761553Z" level=info msg="ImageCreate event name:\"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:20:26.978269 containerd[2002]: time="2025-05-27T17:20:26.978172781Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:20:26.979961 containerd[2002]: time="2025-05-27T17:20:26.979786769Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.0\" with image id \"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\", repo tag \"quay.io/tigera/operator:v1.38.0\", repo digest \"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\", size \"22139475\" in 2.743826798s" May 27 17:20:26.979961 containerd[2002]: time="2025-05-27T17:20:26.979836353Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\" returns image reference \"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\"" May 27 17:20:26.988631 containerd[2002]: time="2025-05-27T17:20:26.988525553Z" level=info msg="CreateContainer within sandbox \"83dc96036e9dc5457340d2afc1212cc6b1f10d6e8741e90ecaec62348ea50197\" for container 
&ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 27 17:20:27.001320 containerd[2002]: time="2025-05-27T17:20:27.001052269Z" level=info msg="Container 168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556: CDI devices from CRI Config.CDIDevices: []" May 27 17:20:27.015073 containerd[2002]: time="2025-05-27T17:20:27.014989765Z" level=info msg="CreateContainer within sandbox \"83dc96036e9dc5457340d2afc1212cc6b1f10d6e8741e90ecaec62348ea50197\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556\"" May 27 17:20:27.018670 containerd[2002]: time="2025-05-27T17:20:27.018547513Z" level=info msg="StartContainer for \"168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556\"" May 27 17:20:27.021642 containerd[2002]: time="2025-05-27T17:20:27.021477013Z" level=info msg="connecting to shim 168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556" address="unix:///run/containerd/s/59beb8f324b51684c6071245244943b74c246f9f0e5df3bd4267649537e2acd6" protocol=ttrpc version=3 May 27 17:20:27.060547 systemd[1]: Started cri-containerd-168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556.scope - libcontainer container 168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556. May 27 17:20:27.122186 containerd[2002]: time="2025-05-27T17:20:27.122096545Z" level=info msg="StartContainer for \"168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556\" returns successfully" May 27 17:20:27.454481 kubelet[3467]: I0527 17:20:27.454396 3467 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-hszck" podStartSLOduration=4.454372995 podStartE2EDuration="4.454372995s" podCreationTimestamp="2025-05-27 17:20:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:20:25.066672167 +0000 UTC m=+7.483551986" watchObservedRunningTime="2025-05-27 17:20:27.454372995 +0000 UTC m=+9.871252814" May 27 17:20:28.102763 kubelet[3467]: I0527 17:20:28.102493 3467 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-844669ff44-ztkbd" podStartSLOduration=2.354909528 podStartE2EDuration="5.102469598s" podCreationTimestamp="2025-05-27 17:20:23 +0000 UTC" firstStartedPulling="2025-05-27 17:20:24.234154379 +0000 UTC m=+6.651034198" lastFinishedPulling="2025-05-27 17:20:26.981714449 +0000 UTC m=+9.398594268" observedRunningTime="2025-05-27 17:20:28.101704322 +0000 UTC m=+10.518584153" watchObservedRunningTime="2025-05-27 17:20:28.102469598 +0000 UTC m=+10.519349417" May 27 17:20:35.857109 sudo[2352]: pam_unix(sudo:session): session closed for user root May 27 17:20:35.882365 sshd[2351]: Connection closed by 139.178.68.195 port 55736 May 27 17:20:35.883161 sshd-session[2349]: pam_unix(sshd:session): session closed for user core May 27 17:20:35.894151 systemd[1]: sshd@8-172.31.16.30:22-139.178.68.195:55736.service: Deactivated successfully. May 27 17:20:35.902965 systemd[1]: session-9.scope: Deactivated successfully. May 27 17:20:35.906423 systemd[1]: session-9.scope: Consumed 12.622s CPU time, 232.4M memory peak. May 27 17:20:35.909300 systemd-logind[1976]: Session 9 logged out. Waiting for processes to exit. May 27 17:20:35.913879 systemd-logind[1976]: Removed session 9. 
May 27 17:20:46.830222 systemd[1]: Created slice kubepods-besteffort-podb60c37f9_398b_40e1_8334_c803e0fe28f0.slice - libcontainer container kubepods-besteffort-podb60c37f9_398b_40e1_8334_c803e0fe28f0.slice. May 27 17:20:46.856306 kubelet[3467]: I0527 17:20:46.856197 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9kzx\" (UniqueName: \"kubernetes.io/projected/b60c37f9-398b-40e1-8334-c803e0fe28f0-kube-api-access-m9kzx\") pod \"calico-typha-564c76db59-xwvmv\" (UID: \"b60c37f9-398b-40e1-8334-c803e0fe28f0\") " pod="calico-system/calico-typha-564c76db59-xwvmv" May 27 17:20:46.856306 kubelet[3467]: I0527 17:20:46.856292 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b60c37f9-398b-40e1-8334-c803e0fe28f0-tigera-ca-bundle\") pod \"calico-typha-564c76db59-xwvmv\" (UID: \"b60c37f9-398b-40e1-8334-c803e0fe28f0\") " pod="calico-system/calico-typha-564c76db59-xwvmv" May 27 17:20:46.856306 kubelet[3467]: I0527 17:20:46.856332 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/b60c37f9-398b-40e1-8334-c803e0fe28f0-typha-certs\") pod \"calico-typha-564c76db59-xwvmv\" (UID: \"b60c37f9-398b-40e1-8334-c803e0fe28f0\") " pod="calico-system/calico-typha-564c76db59-xwvmv" May 27 17:20:47.140401 containerd[2002]: time="2025-05-27T17:20:47.139721601Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-564c76db59-xwvmv,Uid:b60c37f9-398b-40e1-8334-c803e0fe28f0,Namespace:calico-system,Attempt:0,}" May 27 17:20:47.144581 systemd[1]: Created slice kubepods-besteffort-pod9e17904c_3c69_4e93_b60f_bacb8464260b.slice - libcontainer container kubepods-besteffort-pod9e17904c_3c69_4e93_b60f_bacb8464260b.slice. 
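Note on the runs of "VerifyControllerAttachedVolume started for volume …" entries above (and the longer run for calico-node that follows): they are easier to read grouped per pod. A small, self-contained helper for doing that over journal text in the escaped form shown here; the function name and the example file name are illustrative only:

```python
import re
from collections import defaultdict

# Volume names appear as \"name\" inside the kubelet message; the namespaced pod
# name appears as pod="namespace/name" at the end of each entry.
ENTRY = re.compile(
    r'started for volume \\"(?P<volume>[^\\]+)\\"'
    r'.*?pod="(?P<pod>[^"]+)"',
    re.DOTALL,
)

def volumes_by_pod(journal_text: str) -> dict[str, list[str]]:
    grouped: dict[str, list[str]] = defaultdict(list)
    for match in ENTRY.finditer(journal_text):
        grouped[match.group("pod")].append(match.group("volume"))
    return dict(grouped)

# Example (hypothetical file name):
# volumes_by_pod(open("kubelet.journal").read())
# -> {"calico-system/calico-typha-564c76db59-xwvmv": ["kube-api-access-m9kzx", "tigera-ca-bundle", "typha-certs"], ...}
```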
May 27 17:20:47.159067 kubelet[3467]: I0527 17:20:47.159015 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/9e17904c-3c69-4e93-b60f-bacb8464260b-var-run-calico\") pod \"calico-node-44h7h\" (UID: \"9e17904c-3c69-4e93-b60f-bacb8464260b\") " pod="calico-system/calico-node-44h7h" May 27 17:20:47.161362 kubelet[3467]: I0527 17:20:47.159345 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/9e17904c-3c69-4e93-b60f-bacb8464260b-cni-bin-dir\") pod \"calico-node-44h7h\" (UID: \"9e17904c-3c69-4e93-b60f-bacb8464260b\") " pod="calico-system/calico-node-44h7h" May 27 17:20:47.161663 kubelet[3467]: I0527 17:20:47.161636 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/9e17904c-3c69-4e93-b60f-bacb8464260b-cni-log-dir\") pod \"calico-node-44h7h\" (UID: \"9e17904c-3c69-4e93-b60f-bacb8464260b\") " pod="calico-system/calico-node-44h7h" May 27 17:20:47.161864 kubelet[3467]: I0527 17:20:47.161814 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e17904c-3c69-4e93-b60f-bacb8464260b-tigera-ca-bundle\") pod \"calico-node-44h7h\" (UID: \"9e17904c-3c69-4e93-b60f-bacb8464260b\") " pod="calico-system/calico-node-44h7h" May 27 17:20:47.162072 kubelet[3467]: I0527 17:20:47.162029 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/9e17904c-3c69-4e93-b60f-bacb8464260b-policysync\") pod \"calico-node-44h7h\" (UID: \"9e17904c-3c69-4e93-b60f-bacb8464260b\") " pod="calico-system/calico-node-44h7h" May 27 17:20:47.162250 kubelet[3467]: I0527 17:20:47.162203 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/9e17904c-3c69-4e93-b60f-bacb8464260b-flexvol-driver-host\") pod \"calico-node-44h7h\" (UID: \"9e17904c-3c69-4e93-b60f-bacb8464260b\") " pod="calico-system/calico-node-44h7h" May 27 17:20:47.162472 kubelet[3467]: I0527 17:20:47.162437 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrnrn\" (UniqueName: \"kubernetes.io/projected/9e17904c-3c69-4e93-b60f-bacb8464260b-kube-api-access-zrnrn\") pod \"calico-node-44h7h\" (UID: \"9e17904c-3c69-4e93-b60f-bacb8464260b\") " pod="calico-system/calico-node-44h7h" May 27 17:20:47.162635 kubelet[3467]: I0527 17:20:47.162600 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/9e17904c-3c69-4e93-b60f-bacb8464260b-cni-net-dir\") pod \"calico-node-44h7h\" (UID: \"9e17904c-3c69-4e93-b60f-bacb8464260b\") " pod="calico-system/calico-node-44h7h" May 27 17:20:47.162771 kubelet[3467]: I0527 17:20:47.162748 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9e17904c-3c69-4e93-b60f-bacb8464260b-var-lib-calico\") pod \"calico-node-44h7h\" (UID: \"9e17904c-3c69-4e93-b60f-bacb8464260b\") " pod="calico-system/calico-node-44h7h" May 27 17:20:47.163072 kubelet[3467]: I0527 17:20:47.162924 3467 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9e17904c-3c69-4e93-b60f-bacb8464260b-xtables-lock\") pod \"calico-node-44h7h\" (UID: \"9e17904c-3c69-4e93-b60f-bacb8464260b\") " pod="calico-system/calico-node-44h7h" May 27 17:20:47.163373 kubelet[3467]: I0527 17:20:47.163200 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9e17904c-3c69-4e93-b60f-bacb8464260b-lib-modules\") pod \"calico-node-44h7h\" (UID: \"9e17904c-3c69-4e93-b60f-bacb8464260b\") " pod="calico-system/calico-node-44h7h" May 27 17:20:47.163532 kubelet[3467]: I0527 17:20:47.163481 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/9e17904c-3c69-4e93-b60f-bacb8464260b-node-certs\") pod \"calico-node-44h7h\" (UID: \"9e17904c-3c69-4e93-b60f-bacb8464260b\") " pod="calico-system/calico-node-44h7h" May 27 17:20:47.216941 containerd[2002]: time="2025-05-27T17:20:47.216862617Z" level=info msg="connecting to shim 1b1e3cd95507a28aee2eeb97f4dddb87905bd1dc3ad3e386d0c3784ac01ee2cd" address="unix:///run/containerd/s/322ad6d38e11237cd8b7a15045c2b4902fb91ef205a37208b12776b42e566552" namespace=k8s.io protocol=ttrpc version=3 May 27 17:20:47.268374 kubelet[3467]: E0527 17:20:47.268203 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.268374 kubelet[3467]: W0527 17:20:47.268287 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.268374 kubelet[3467]: E0527 17:20:47.268340 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.272048 kubelet[3467]: E0527 17:20:47.270731 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.272048 kubelet[3467]: W0527 17:20:47.270845 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.272048 kubelet[3467]: E0527 17:20:47.270920 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.273257 kubelet[3467]: E0527 17:20:47.273195 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.273662 kubelet[3467]: W0527 17:20:47.273503 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.273662 kubelet[3467]: E0527 17:20:47.273547 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.274267 kubelet[3467]: E0527 17:20:47.274197 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.274473 kubelet[3467]: W0527 17:20:47.274400 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.274473 kubelet[3467]: E0527 17:20:47.274438 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.275387 kubelet[3467]: E0527 17:20:47.275324 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.275728 kubelet[3467]: W0527 17:20:47.275680 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.275934 kubelet[3467]: E0527 17:20:47.275726 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.280869 kubelet[3467]: E0527 17:20:47.280730 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.280869 kubelet[3467]: W0527 17:20:47.280765 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.280869 kubelet[3467]: E0527 17:20:47.280811 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.281334 systemd[1]: Started cri-containerd-1b1e3cd95507a28aee2eeb97f4dddb87905bd1dc3ad3e386d0c3784ac01ee2cd.scope - libcontainer container 1b1e3cd95507a28aee2eeb97f4dddb87905bd1dc3ad3e386d0c3784ac01ee2cd. May 27 17:20:47.284726 kubelet[3467]: E0527 17:20:47.283625 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.284726 kubelet[3467]: W0527 17:20:47.283666 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.284726 kubelet[3467]: E0527 17:20:47.283699 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.286445 kubelet[3467]: E0527 17:20:47.286160 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.286445 kubelet[3467]: W0527 17:20:47.286255 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.286445 kubelet[3467]: E0527 17:20:47.286290 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.286972 kubelet[3467]: E0527 17:20:47.286939 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.286972 kubelet[3467]: W0527 17:20:47.286968 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.287314 kubelet[3467]: E0527 17:20:47.286992 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.288585 kubelet[3467]: E0527 17:20:47.288550 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.288585 kubelet[3467]: W0527 17:20:47.288579 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.288976 kubelet[3467]: E0527 17:20:47.288606 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.290272 kubelet[3467]: E0527 17:20:47.289221 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.290272 kubelet[3467]: W0527 17:20:47.290263 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.290534 kubelet[3467]: E0527 17:20:47.290303 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.290933 kubelet[3467]: E0527 17:20:47.290897 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.290933 kubelet[3467]: W0527 17:20:47.290927 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.291267 kubelet[3467]: E0527 17:20:47.290953 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.291676 kubelet[3467]: E0527 17:20:47.291630 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.291676 kubelet[3467]: W0527 17:20:47.291659 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.292176 kubelet[3467]: E0527 17:20:47.291684 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.293331 kubelet[3467]: E0527 17:20:47.293274 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.293331 kubelet[3467]: W0527 17:20:47.293311 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.293501 kubelet[3467]: E0527 17:20:47.293343 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.295169 kubelet[3467]: E0527 17:20:47.295114 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.295406 kubelet[3467]: W0527 17:20:47.295168 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.295406 kubelet[3467]: E0527 17:20:47.295217 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.296622 kubelet[3467]: E0527 17:20:47.296575 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.296622 kubelet[3467]: W0527 17:20:47.296610 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.297034 kubelet[3467]: E0527 17:20:47.296642 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.300381 kubelet[3467]: E0527 17:20:47.300199 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.300381 kubelet[3467]: W0527 17:20:47.300272 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.300381 kubelet[3467]: E0527 17:20:47.300320 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.301851 kubelet[3467]: E0527 17:20:47.301649 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.301851 kubelet[3467]: W0527 17:20:47.301691 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.301851 kubelet[3467]: E0527 17:20:47.301723 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.336553 kubelet[3467]: E0527 17:20:47.336084 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.336951 kubelet[3467]: W0527 17:20:47.336814 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.336951 kubelet[3467]: E0527 17:20:47.336862 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.342850 kubelet[3467]: E0527 17:20:47.342813 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.343150 kubelet[3467]: W0527 17:20:47.343067 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.343324 kubelet[3467]: E0527 17:20:47.343109 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.399320 kubelet[3467]: E0527 17:20:47.398710 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5rhxb" podUID="0b9f82fa-5aa9-4828-ae1b-71591df99003" May 27 17:20:47.432550 kubelet[3467]: E0527 17:20:47.432482 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.432801 kubelet[3467]: W0527 17:20:47.432728 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.433538 kubelet[3467]: E0527 17:20:47.432769 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.434290 kubelet[3467]: E0527 17:20:47.434157 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.434690 kubelet[3467]: W0527 17:20:47.434190 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.435257 kubelet[3467]: E0527 17:20:47.434814 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.435844 kubelet[3467]: E0527 17:20:47.435784 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.435844 kubelet[3467]: W0527 17:20:47.435811 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.436479 kubelet[3467]: E0527 17:20:47.436354 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.436971 kubelet[3467]: E0527 17:20:47.436949 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.437307 kubelet[3467]: W0527 17:20:47.437071 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.437307 kubelet[3467]: E0527 17:20:47.437100 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.437928 kubelet[3467]: E0527 17:20:47.437772 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.438155 kubelet[3467]: W0527 17:20:47.438045 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.438155 kubelet[3467]: E0527 17:20:47.438081 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.439835 kubelet[3467]: E0527 17:20:47.439670 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.439835 kubelet[3467]: W0527 17:20:47.439704 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.439835 kubelet[3467]: E0527 17:20:47.439735 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.443622 kubelet[3467]: E0527 17:20:47.443367 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.443622 kubelet[3467]: W0527 17:20:47.443405 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.443622 kubelet[3467]: E0527 17:20:47.443435 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.445093 kubelet[3467]: E0527 17:20:47.444750 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.445093 kubelet[3467]: W0527 17:20:47.444845 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.445093 kubelet[3467]: E0527 17:20:47.444879 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.446288 kubelet[3467]: E0527 17:20:47.446184 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.447723 kubelet[3467]: W0527 17:20:47.447601 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.448402 kubelet[3467]: E0527 17:20:47.448345 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.449873 kubelet[3467]: E0527 17:20:47.449825 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.450151 kubelet[3467]: W0527 17:20:47.450014 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.450151 kubelet[3467]: E0527 17:20:47.450058 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.450704 kubelet[3467]: E0527 17:20:47.450682 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.451079 kubelet[3467]: W0527 17:20:47.450877 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.451079 kubelet[3467]: E0527 17:20:47.450983 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.452245 kubelet[3467]: E0527 17:20:47.452166 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.452507 kubelet[3467]: W0527 17:20:47.452364 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.452507 kubelet[3467]: E0527 17:20:47.452402 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.453428 kubelet[3467]: E0527 17:20:47.453260 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.453428 kubelet[3467]: W0527 17:20:47.453310 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.453428 kubelet[3467]: E0527 17:20:47.453340 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.455285 kubelet[3467]: E0527 17:20:47.454473 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.455285 kubelet[3467]: W0527 17:20:47.454506 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.455285 kubelet[3467]: E0527 17:20:47.454537 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.456262 kubelet[3467]: E0527 17:20:47.456145 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.456262 kubelet[3467]: W0527 17:20:47.456177 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.456262 kubelet[3467]: E0527 17:20:47.456208 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.457513 kubelet[3467]: E0527 17:20:47.457482 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.457834 kubelet[3467]: W0527 17:20:47.457755 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.457834 kubelet[3467]: E0527 17:20:47.457793 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.460836 kubelet[3467]: E0527 17:20:47.460354 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.460836 kubelet[3467]: W0527 17:20:47.460586 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.460836 kubelet[3467]: E0527 17:20:47.460623 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.462640 kubelet[3467]: E0527 17:20:47.462557 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.463107 kubelet[3467]: W0527 17:20:47.462592 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.463107 kubelet[3467]: E0527 17:20:47.462846 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.464714 kubelet[3467]: E0527 17:20:47.464005 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.465104 kubelet[3467]: W0527 17:20:47.464899 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.465526 kubelet[3467]: E0527 17:20:47.465160 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.466823 kubelet[3467]: E0527 17:20:47.466761 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.467097 kubelet[3467]: W0527 17:20:47.466905 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.467426 kubelet[3467]: E0527 17:20:47.466938 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.468793 containerd[2002]: time="2025-05-27T17:20:47.468731254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-44h7h,Uid:9e17904c-3c69-4e93-b60f-bacb8464260b,Namespace:calico-system,Attempt:0,}" May 27 17:20:47.470012 kubelet[3467]: E0527 17:20:47.469889 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.470012 kubelet[3467]: W0527 17:20:47.469960 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.470380 kubelet[3467]: E0527 17:20:47.470177 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.470701 kubelet[3467]: I0527 17:20:47.470597 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0b9f82fa-5aa9-4828-ae1b-71591df99003-registration-dir\") pod \"csi-node-driver-5rhxb\" (UID: \"0b9f82fa-5aa9-4828-ae1b-71591df99003\") " pod="calico-system/csi-node-driver-5rhxb" May 27 17:20:47.471545 kubelet[3467]: E0527 17:20:47.471407 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.471545 kubelet[3467]: W0527 17:20:47.471486 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.472469 kubelet[3467]: E0527 17:20:47.471655 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.473150 kubelet[3467]: E0527 17:20:47.473025 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.473521 kubelet[3467]: W0527 17:20:47.473109 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.473772 kubelet[3467]: E0527 17:20:47.473600 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.474992 kubelet[3467]: E0527 17:20:47.474926 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.475423 kubelet[3467]: W0527 17:20:47.475085 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.475423 kubelet[3467]: E0527 17:20:47.475139 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.476208 kubelet[3467]: I0527 17:20:47.475605 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0b9f82fa-5aa9-4828-ae1b-71591df99003-socket-dir\") pod \"csi-node-driver-5rhxb\" (UID: \"0b9f82fa-5aa9-4828-ae1b-71591df99003\") " pod="calico-system/csi-node-driver-5rhxb" May 27 17:20:47.476647 kubelet[3467]: E0527 17:20:47.476551 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.476647 kubelet[3467]: W0527 17:20:47.476579 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.476647 kubelet[3467]: E0527 17:20:47.476607 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.477429 kubelet[3467]: E0527 17:20:47.477351 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.477429 kubelet[3467]: W0527 17:20:47.477379 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.477429 kubelet[3467]: E0527 17:20:47.477403 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.478066 kubelet[3467]: E0527 17:20:47.477992 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.478066 kubelet[3467]: W0527 17:20:47.478015 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.478066 kubelet[3467]: E0527 17:20:47.478036 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.478544 kubelet[3467]: I0527 17:20:47.478515 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfv8l\" (UniqueName: \"kubernetes.io/projected/0b9f82fa-5aa9-4828-ae1b-71591df99003-kube-api-access-qfv8l\") pod \"csi-node-driver-5rhxb\" (UID: \"0b9f82fa-5aa9-4828-ae1b-71591df99003\") " pod="calico-system/csi-node-driver-5rhxb" May 27 17:20:47.479423 kubelet[3467]: E0527 17:20:47.479349 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.480052 kubelet[3467]: W0527 17:20:47.479618 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.480052 kubelet[3467]: E0527 17:20:47.479662 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.482177 kubelet[3467]: E0527 17:20:47.481463 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.482177 kubelet[3467]: W0527 17:20:47.481491 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.482177 kubelet[3467]: E0527 17:20:47.481522 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.483889 kubelet[3467]: E0527 17:20:47.483622 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.483889 kubelet[3467]: W0527 17:20:47.483659 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.483889 kubelet[3467]: E0527 17:20:47.483690 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.483889 kubelet[3467]: I0527 17:20:47.483741 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b9f82fa-5aa9-4828-ae1b-71591df99003-kubelet-dir\") pod \"csi-node-driver-5rhxb\" (UID: \"0b9f82fa-5aa9-4828-ae1b-71591df99003\") " pod="calico-system/csi-node-driver-5rhxb" May 27 17:20:47.485722 kubelet[3467]: E0527 17:20:47.485678 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.486047 kubelet[3467]: W0527 17:20:47.485941 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.486047 kubelet[3467]: E0527 17:20:47.485984 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.486411 kubelet[3467]: I0527 17:20:47.486380 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/0b9f82fa-5aa9-4828-ae1b-71591df99003-varrun\") pod \"csi-node-driver-5rhxb\" (UID: \"0b9f82fa-5aa9-4828-ae1b-71591df99003\") " pod="calico-system/csi-node-driver-5rhxb" May 27 17:20:47.486974 kubelet[3467]: E0527 17:20:47.486882 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.486974 kubelet[3467]: W0527 17:20:47.486912 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.486974 kubelet[3467]: E0527 17:20:47.486945 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.488469 kubelet[3467]: E0527 17:20:47.488371 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.488469 kubelet[3467]: W0527 17:20:47.488404 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.488469 kubelet[3467]: E0527 17:20:47.488438 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.489412 kubelet[3467]: E0527 17:20:47.489376 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.489703 kubelet[3467]: W0527 17:20:47.489533 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.489703 kubelet[3467]: E0527 17:20:47.489568 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.490760 kubelet[3467]: E0527 17:20:47.490456 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.490760 kubelet[3467]: W0527 17:20:47.490601 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.491819 kubelet[3467]: E0527 17:20:47.490634 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.526527 containerd[2002]: time="2025-05-27T17:20:47.526440407Z" level=info msg="connecting to shim 83b3a950cb9e5108b0bba2cb297e7fb3d9af37dcb5cd5396a5ab0ed77d411a35" address="unix:///run/containerd/s/6d97480859b37c937efcea624ccf3d660ccc981017c2ca54b01c2fcb8088f6e7" namespace=k8s.io protocol=ttrpc version=3 May 27 17:20:47.587538 kubelet[3467]: E0527 17:20:47.587437 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.587538 kubelet[3467]: W0527 17:20:47.587471 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.587538 kubelet[3467]: E0527 17:20:47.587500 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.588587 kubelet[3467]: E0527 17:20:47.588478 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.588587 kubelet[3467]: W0527 17:20:47.588525 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.588587 kubelet[3467]: E0527 17:20:47.588557 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.590361 kubelet[3467]: E0527 17:20:47.590261 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.590361 kubelet[3467]: W0527 17:20:47.590295 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.590361 kubelet[3467]: E0527 17:20:47.590326 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.591571 kubelet[3467]: E0527 17:20:47.591537 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.592113 kubelet[3467]: W0527 17:20:47.591712 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.592113 kubelet[3467]: E0527 17:20:47.591751 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.594054 kubelet[3467]: E0527 17:20:47.593831 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.594054 kubelet[3467]: W0527 17:20:47.593988 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.594054 kubelet[3467]: E0527 17:20:47.594019 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.595494 kubelet[3467]: E0527 17:20:47.595220 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.595494 kubelet[3467]: W0527 17:20:47.595295 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.595494 kubelet[3467]: E0527 17:20:47.595326 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.596182 kubelet[3467]: E0527 17:20:47.596149 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.596535 kubelet[3467]: W0527 17:20:47.596361 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.596535 kubelet[3467]: E0527 17:20:47.596403 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.597401 kubelet[3467]: E0527 17:20:47.597333 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.597726 kubelet[3467]: W0527 17:20:47.597556 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.597726 kubelet[3467]: E0527 17:20:47.597596 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.599215 kubelet[3467]: E0527 17:20:47.599177 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.599663 kubelet[3467]: W0527 17:20:47.599465 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.599663 kubelet[3467]: E0527 17:20:47.599508 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.600444 kubelet[3467]: E0527 17:20:47.600415 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.600816 kubelet[3467]: W0527 17:20:47.600569 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.600816 kubelet[3467]: E0527 17:20:47.600617 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.603767 kubelet[3467]: E0527 17:20:47.603014 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.603767 kubelet[3467]: W0527 17:20:47.603047 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.603767 kubelet[3467]: E0527 17:20:47.603080 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.604540 kubelet[3467]: E0527 17:20:47.604280 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.604540 kubelet[3467]: W0527 17:20:47.604314 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.604540 kubelet[3467]: E0527 17:20:47.604345 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.605768 containerd[2002]: time="2025-05-27T17:20:47.605599739Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-564c76db59-xwvmv,Uid:b60c37f9-398b-40e1-8334-c803e0fe28f0,Namespace:calico-system,Attempt:0,} returns sandbox id \"1b1e3cd95507a28aee2eeb97f4dddb87905bd1dc3ad3e386d0c3784ac01ee2cd\"" May 27 17:20:47.606197 kubelet[3467]: E0527 17:20:47.606168 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.607400 kubelet[3467]: W0527 17:20:47.607346 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.608077 kubelet[3467]: E0527 17:20:47.607891 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.608983 kubelet[3467]: E0527 17:20:47.608908 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.608983 kubelet[3467]: W0527 17:20:47.608936 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.609521 kubelet[3467]: E0527 17:20:47.609199 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.611515 kubelet[3467]: E0527 17:20:47.611393 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.611515 kubelet[3467]: W0527 17:20:47.611426 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.612377 kubelet[3467]: E0527 17:20:47.612302 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.614007 kubelet[3467]: E0527 17:20:47.613682 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.614007 kubelet[3467]: W0527 17:20:47.613927 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.614007 kubelet[3467]: E0527 17:20:47.613965 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.615957 kubelet[3467]: E0527 17:20:47.615835 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.615957 kubelet[3467]: W0527 17:20:47.615869 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.615957 kubelet[3467]: E0527 17:20:47.615923 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.616813 containerd[2002]: time="2025-05-27T17:20:47.616198247Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\"" May 27 17:20:47.618395 kubelet[3467]: E0527 17:20:47.618014 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.618395 kubelet[3467]: W0527 17:20:47.618166 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.618395 kubelet[3467]: E0527 17:20:47.618201 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.620979 kubelet[3467]: E0527 17:20:47.620941 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.620979 kubelet[3467]: W0527 17:20:47.621027 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.620979 kubelet[3467]: E0527 17:20:47.621063 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.623073 kubelet[3467]: E0527 17:20:47.622769 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.623073 kubelet[3467]: W0527 17:20:47.622815 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.623073 kubelet[3467]: E0527 17:20:47.622849 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.624620 kubelet[3467]: E0527 17:20:47.624576 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.624898 kubelet[3467]: W0527 17:20:47.624826 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.624898 kubelet[3467]: E0527 17:20:47.624868 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.626378 kubelet[3467]: E0527 17:20:47.626322 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.626952 kubelet[3467]: W0527 17:20:47.626514 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.627315 kubelet[3467]: E0527 17:20:47.626791 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.629845 kubelet[3467]: E0527 17:20:47.629806 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.630556 kubelet[3467]: W0527 17:20:47.630151 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.630556 kubelet[3467]: E0527 17:20:47.630198 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.632302 kubelet[3467]: E0527 17:20:47.632105 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.633116 kubelet[3467]: W0527 17:20:47.633075 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.634955 kubelet[3467]: E0527 17:20:47.634815 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.643437 kubelet[3467]: E0527 17:20:47.642189 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.643437 kubelet[3467]: W0527 17:20:47.642275 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.643437 kubelet[3467]: E0527 17:20:47.642318 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:47.670537 systemd[1]: Started cri-containerd-83b3a950cb9e5108b0bba2cb297e7fb3d9af37dcb5cd5396a5ab0ed77d411a35.scope - libcontainer container 83b3a950cb9e5108b0bba2cb297e7fb3d9af37dcb5cd5396a5ab0ed77d411a35. May 27 17:20:47.707150 kubelet[3467]: E0527 17:20:47.707091 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:47.707150 kubelet[3467]: W0527 17:20:47.707141 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:47.707150 kubelet[3467]: E0527 17:20:47.707175 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:47.811097 containerd[2002]: time="2025-05-27T17:20:47.811027344Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-44h7h,Uid:9e17904c-3c69-4e93-b60f-bacb8464260b,Namespace:calico-system,Attempt:0,} returns sandbox id \"83b3a950cb9e5108b0bba2cb297e7fb3d9af37dcb5cd5396a5ab0ed77d411a35\"" May 27 17:20:48.950978 kubelet[3467]: E0527 17:20:48.950619 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5rhxb" podUID="0b9f82fa-5aa9-4828-ae1b-71591df99003" May 27 17:20:49.147441 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4195889436.mount: Deactivated successfully. May 27 17:20:50.052660 containerd[2002]: time="2025-05-27T17:20:50.052595147Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:20:50.053880 containerd[2002]: time="2025-05-27T17:20:50.053808071Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.0: active requests=0, bytes read=33020269" May 27 17:20:50.055002 containerd[2002]: time="2025-05-27T17:20:50.054929627Z" level=info msg="ImageCreate event name:\"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:20:50.058154 containerd[2002]: time="2025-05-27T17:20:50.058072151Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:20:50.059644 containerd[2002]: time="2025-05-27T17:20:50.059417459Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.0\" with image id \"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\", size \"33020123\" in 2.442595428s" May 27 17:20:50.059644 containerd[2002]: time="2025-05-27T17:20:50.059471375Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\" returns image reference \"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\"" May 27 17:20:50.063140 containerd[2002]: time="2025-05-27T17:20:50.062617871Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\"" May 27 17:20:50.100719 containerd[2002]: time="2025-05-27T17:20:50.100659132Z" level=info msg="CreateContainer within sandbox \"1b1e3cd95507a28aee2eeb97f4dddb87905bd1dc3ad3e386d0c3784ac01ee2cd\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 27 17:20:50.114541 containerd[2002]: time="2025-05-27T17:20:50.114469188Z" level=info msg="Container 395f34962471b2496bd1339fbfdb455b58a7cb37ba96b79f670332879418d25e: CDI devices from CRI Config.CDIDevices: []" May 27 17:20:50.120032 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2860927566.mount: Deactivated successfully. May 27 17:20:50.132262 containerd[2002]: time="2025-05-27T17:20:50.132186060Z" level=info msg="CreateContainer within sandbox \"1b1e3cd95507a28aee2eeb97f4dddb87905bd1dc3ad3e386d0c3784ac01ee2cd\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"395f34962471b2496bd1339fbfdb455b58a7cb37ba96b79f670332879418d25e\"" May 27 17:20:50.134561 containerd[2002]: time="2025-05-27T17:20:50.134474688Z" level=info msg="StartContainer for \"395f34962471b2496bd1339fbfdb455b58a7cb37ba96b79f670332879418d25e\"" May 27 17:20:50.136921 containerd[2002]: time="2025-05-27T17:20:50.136756656Z" level=info msg="connecting to shim 395f34962471b2496bd1339fbfdb455b58a7cb37ba96b79f670332879418d25e" address="unix:///run/containerd/s/322ad6d38e11237cd8b7a15045c2b4902fb91ef205a37208b12776b42e566552" protocol=ttrpc version=3 May 27 17:20:50.179898 systemd[1]: Started cri-containerd-395f34962471b2496bd1339fbfdb455b58a7cb37ba96b79f670332879418d25e.scope - libcontainer container 395f34962471b2496bd1339fbfdb455b58a7cb37ba96b79f670332879418d25e. May 27 17:20:50.264073 containerd[2002]: time="2025-05-27T17:20:50.263826540Z" level=info msg="StartContainer for \"395f34962471b2496bd1339fbfdb455b58a7cb37ba96b79f670332879418d25e\" returns successfully" May 27 17:20:50.950326 kubelet[3467]: E0527 17:20:50.950255 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5rhxb" podUID="0b9f82fa-5aa9-4828-ae1b-71591df99003" May 27 17:20:51.193909 kubelet[3467]: E0527 17:20:51.193862 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.194435 kubelet[3467]: W0527 17:20:51.194215 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.194435 kubelet[3467]: E0527 17:20:51.194276 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:51.195007 kubelet[3467]: E0527 17:20:51.194972 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.195484 kubelet[3467]: W0527 17:20:51.195164 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.195484 kubelet[3467]: E0527 17:20:51.195278 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.197179 kubelet[3467]: E0527 17:20:51.196049 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.197179 kubelet[3467]: W0527 17:20:51.196173 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.197179 kubelet[3467]: E0527 17:20:51.196204 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.198284 kubelet[3467]: E0527 17:20:51.197854 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.199272 kubelet[3467]: W0527 17:20:51.198484 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.199272 kubelet[3467]: E0527 17:20:51.198533 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.203664 kubelet[3467]: E0527 17:20:51.203375 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.203664 kubelet[3467]: W0527 17:20:51.203410 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.203664 kubelet[3467]: E0527 17:20:51.203572 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.205893 kubelet[3467]: E0527 17:20:51.205598 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.206501 kubelet[3467]: W0527 17:20:51.206140 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.207102 kubelet[3467]: E0527 17:20:51.206185 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:51.208448 kubelet[3467]: E0527 17:20:51.208380 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.208448 kubelet[3467]: W0527 17:20:51.208412 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.208762 kubelet[3467]: E0527 17:20:51.208650 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.209464 kubelet[3467]: E0527 17:20:51.209424 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.209708 kubelet[3467]: W0527 17:20:51.209599 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.209708 kubelet[3467]: E0527 17:20:51.209634 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.210453 kubelet[3467]: E0527 17:20:51.210399 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.210649 kubelet[3467]: W0527 17:20:51.210428 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.210649 kubelet[3467]: E0527 17:20:51.210597 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.211159 kubelet[3467]: E0527 17:20:51.211104 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.211494 kubelet[3467]: W0527 17:20:51.211316 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.211494 kubelet[3467]: E0527 17:20:51.211349 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.212010 kubelet[3467]: E0527 17:20:51.211986 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.212294 kubelet[3467]: W0527 17:20:51.212076 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.212294 kubelet[3467]: E0527 17:20:51.212103 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:51.212808 kubelet[3467]: E0527 17:20:51.212677 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.213066 kubelet[3467]: W0527 17:20:51.212942 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.213066 kubelet[3467]: E0527 17:20:51.212975 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.213735 kubelet[3467]: E0527 17:20:51.213470 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.213735 kubelet[3467]: W0527 17:20:51.213581 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.213735 kubelet[3467]: E0527 17:20:51.213608 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.214464 kubelet[3467]: E0527 17:20:51.214267 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.214464 kubelet[3467]: W0527 17:20:51.214293 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.214464 kubelet[3467]: E0527 17:20:51.214317 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.215080 kubelet[3467]: E0527 17:20:51.214936 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.215080 kubelet[3467]: W0527 17:20:51.214961 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.215080 kubelet[3467]: E0527 17:20:51.214986 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.237356 kubelet[3467]: E0527 17:20:51.237127 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.237356 kubelet[3467]: W0527 17:20:51.237161 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.237356 kubelet[3467]: E0527 17:20:51.237191 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:51.238440 kubelet[3467]: E0527 17:20:51.238284 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.238676 kubelet[3467]: W0527 17:20:51.238567 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.238676 kubelet[3467]: E0527 17:20:51.238608 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.239648 kubelet[3467]: E0527 17:20:51.239617 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.239902 kubelet[3467]: W0527 17:20:51.239795 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.239902 kubelet[3467]: E0527 17:20:51.239831 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.240946 kubelet[3467]: E0527 17:20:51.240906 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.241386 kubelet[3467]: W0527 17:20:51.241104 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.241386 kubelet[3467]: E0527 17:20:51.241139 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.242089 kubelet[3467]: E0527 17:20:51.242063 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.242510 kubelet[3467]: W0527 17:20:51.242371 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.242704 kubelet[3467]: E0527 17:20:51.242610 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.243734 kubelet[3467]: E0527 17:20:51.243643 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.243734 kubelet[3467]: W0527 17:20:51.243678 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.243734 kubelet[3467]: E0527 17:20:51.243706 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:51.245838 kubelet[3467]: E0527 17:20:51.245665 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.246177 kubelet[3467]: W0527 17:20:51.246014 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.246177 kubelet[3467]: E0527 17:20:51.246058 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.247403 kubelet[3467]: E0527 17:20:51.247255 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.247403 kubelet[3467]: W0527 17:20:51.247326 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.247403 kubelet[3467]: E0527 17:20:51.247370 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.248923 kubelet[3467]: E0527 17:20:51.248752 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.248923 kubelet[3467]: W0527 17:20:51.248782 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.248923 kubelet[3467]: E0527 17:20:51.248810 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.249827 kubelet[3467]: E0527 17:20:51.249752 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.250086 kubelet[3467]: W0527 17:20:51.249930 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.250086 kubelet[3467]: E0527 17:20:51.249979 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.250925 kubelet[3467]: E0527 17:20:51.250856 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.251363 kubelet[3467]: W0527 17:20:51.251145 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.251363 kubelet[3467]: E0527 17:20:51.251184 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:51.252337 kubelet[3467]: E0527 17:20:51.251998 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.252337 kubelet[3467]: W0527 17:20:51.252081 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.252337 kubelet[3467]: E0527 17:20:51.252109 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.252899 kubelet[3467]: E0527 17:20:51.252827 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.252899 kubelet[3467]: W0527 17:20:51.252852 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.252899 kubelet[3467]: E0527 17:20:51.252875 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.254148 kubelet[3467]: E0527 17:20:51.254012 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.254148 kubelet[3467]: W0527 17:20:51.254039 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.254148 kubelet[3467]: E0527 17:20:51.254065 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.255032 kubelet[3467]: E0527 17:20:51.254761 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.255032 kubelet[3467]: W0527 17:20:51.254788 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.255032 kubelet[3467]: E0527 17:20:51.254819 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.255582 kubelet[3467]: E0527 17:20:51.255513 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.255582 kubelet[3467]: W0527 17:20:51.255537 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.255582 kubelet[3467]: E0527 17:20:51.255558 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:20:51.256441 kubelet[3467]: E0527 17:20:51.256412 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.257068 kubelet[3467]: W0527 17:20:51.256885 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.257068 kubelet[3467]: E0527 17:20:51.256919 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.258014 kubelet[3467]: E0527 17:20:51.257985 3467 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:20:51.258223 kubelet[3467]: W0527 17:20:51.258134 3467 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:20:51.258223 kubelet[3467]: E0527 17:20:51.258168 3467 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:20:51.312997 containerd[2002]: time="2025-05-27T17:20:51.312906350Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:20:51.315347 containerd[2002]: time="2025-05-27T17:20:51.315270398Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0: active requests=0, bytes read=4264304" May 27 17:20:51.317484 containerd[2002]: time="2025-05-27T17:20:51.317377850Z" level=info msg="ImageCreate event name:\"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:20:51.321371 containerd[2002]: time="2025-05-27T17:20:51.321286070Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:20:51.323158 containerd[2002]: time="2025-05-27T17:20:51.322345622Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" with image id \"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\", size \"5633505\" in 1.259671507s" May 27 17:20:51.323158 containerd[2002]: time="2025-05-27T17:20:51.322401134Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" returns image reference \"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\"" May 27 17:20:51.331475 containerd[2002]: time="2025-05-27T17:20:51.331425194Z" level=info msg="CreateContainer within sandbox \"83b3a950cb9e5108b0bba2cb297e7fb3d9af37dcb5cd5396a5ab0ed77d411a35\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 27 17:20:51.350539 containerd[2002]: time="2025-05-27T17:20:51.350473046Z" level=info msg="Container a5994399f15fdaa35f0ed62fe4e711bcf55046613d2dbf9bf67c9a6a75d23cc9: 
CDI devices from CRI Config.CDIDevices: []" May 27 17:20:51.372502 containerd[2002]: time="2025-05-27T17:20:51.372450098Z" level=info msg="CreateContainer within sandbox \"83b3a950cb9e5108b0bba2cb297e7fb3d9af37dcb5cd5396a5ab0ed77d411a35\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"a5994399f15fdaa35f0ed62fe4e711bcf55046613d2dbf9bf67c9a6a75d23cc9\"" May 27 17:20:51.375635 containerd[2002]: time="2025-05-27T17:20:51.375572450Z" level=info msg="StartContainer for \"a5994399f15fdaa35f0ed62fe4e711bcf55046613d2dbf9bf67c9a6a75d23cc9\"" May 27 17:20:51.380172 containerd[2002]: time="2025-05-27T17:20:51.380072090Z" level=info msg="connecting to shim a5994399f15fdaa35f0ed62fe4e711bcf55046613d2dbf9bf67c9a6a75d23cc9" address="unix:///run/containerd/s/6d97480859b37c937efcea624ccf3d660ccc981017c2ca54b01c2fcb8088f6e7" protocol=ttrpc version=3 May 27 17:20:51.423555 systemd[1]: Started cri-containerd-a5994399f15fdaa35f0ed62fe4e711bcf55046613d2dbf9bf67c9a6a75d23cc9.scope - libcontainer container a5994399f15fdaa35f0ed62fe4e711bcf55046613d2dbf9bf67c9a6a75d23cc9. May 27 17:20:51.501002 containerd[2002]: time="2025-05-27T17:20:51.500150330Z" level=info msg="StartContainer for \"a5994399f15fdaa35f0ed62fe4e711bcf55046613d2dbf9bf67c9a6a75d23cc9\" returns successfully" May 27 17:20:51.525071 systemd[1]: cri-containerd-a5994399f15fdaa35f0ed62fe4e711bcf55046613d2dbf9bf67c9a6a75d23cc9.scope: Deactivated successfully. May 27 17:20:51.532816 containerd[2002]: time="2025-05-27T17:20:51.532707879Z" level=info msg="received exit event container_id:\"a5994399f15fdaa35f0ed62fe4e711bcf55046613d2dbf9bf67c9a6a75d23cc9\" id:\"a5994399f15fdaa35f0ed62fe4e711bcf55046613d2dbf9bf67c9a6a75d23cc9\" pid:4162 exited_at:{seconds:1748366451 nanos:531819927}" May 27 17:20:51.533790 containerd[2002]: time="2025-05-27T17:20:51.533736207Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a5994399f15fdaa35f0ed62fe4e711bcf55046613d2dbf9bf67c9a6a75d23cc9\" id:\"a5994399f15fdaa35f0ed62fe4e711bcf55046613d2dbf9bf67c9a6a75d23cc9\" pid:4162 exited_at:{seconds:1748366451 nanos:531819927}" May 27 17:20:51.573092 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a5994399f15fdaa35f0ed62fe4e711bcf55046613d2dbf9bf67c9a6a75d23cc9-rootfs.mount: Deactivated successfully. 
May 27 17:20:52.184254 kubelet[3467]: I0527 17:20:52.184142 3467 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:20:52.189961 containerd[2002]: time="2025-05-27T17:20:52.189062954Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\"" May 27 17:20:52.217338 kubelet[3467]: I0527 17:20:52.216816 3467 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-564c76db59-xwvmv" podStartSLOduration=3.77046335 podStartE2EDuration="6.216795566s" podCreationTimestamp="2025-05-27 17:20:46 +0000 UTC" firstStartedPulling="2025-05-27 17:20:47.614835851 +0000 UTC m=+30.031715670" lastFinishedPulling="2025-05-27 17:20:50.061168055 +0000 UTC m=+32.478047886" observedRunningTime="2025-05-27 17:20:51.202591621 +0000 UTC m=+33.619471452" watchObservedRunningTime="2025-05-27 17:20:52.216795566 +0000 UTC m=+34.633675385" May 27 17:20:52.951036 kubelet[3467]: E0527 17:20:52.950962 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5rhxb" podUID="0b9f82fa-5aa9-4828-ae1b-71591df99003" May 27 17:20:54.951821 kubelet[3467]: E0527 17:20:54.951543 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5rhxb" podUID="0b9f82fa-5aa9-4828-ae1b-71591df99003" May 27 17:20:55.158304 containerd[2002]: time="2025-05-27T17:20:55.157439201Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:20:55.159783 containerd[2002]: time="2025-05-27T17:20:55.159725501Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.0: active requests=0, bytes read=65748976" May 27 17:20:55.161248 containerd[2002]: time="2025-05-27T17:20:55.161181341Z" level=info msg="ImageCreate event name:\"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:20:55.164687 containerd[2002]: time="2025-05-27T17:20:55.164623985Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:20:55.166218 containerd[2002]: time="2025-05-27T17:20:55.166162673Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.0\" with image id \"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\", size \"67118217\" in 2.977037379s" May 27 17:20:55.166345 containerd[2002]: time="2025-05-27T17:20:55.166215629Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\" returns image reference \"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\"" May 27 17:20:55.173946 containerd[2002]: time="2025-05-27T17:20:55.173759873Z" level=info msg="CreateContainer within sandbox \"83b3a950cb9e5108b0bba2cb297e7fb3d9af37dcb5cd5396a5ab0ed77d411a35\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 
27 17:20:55.188283 containerd[2002]: time="2025-05-27T17:20:55.187494953Z" level=info msg="Container baf5a349ba9c5d661403dd5a12c8f5a07742c6b56bff292c56e882d0e6779485: CDI devices from CRI Config.CDIDevices: []" May 27 17:20:55.196267 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3462529710.mount: Deactivated successfully. May 27 17:20:55.211787 containerd[2002]: time="2025-05-27T17:20:55.210979901Z" level=info msg="CreateContainer within sandbox \"83b3a950cb9e5108b0bba2cb297e7fb3d9af37dcb5cd5396a5ab0ed77d411a35\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"baf5a349ba9c5d661403dd5a12c8f5a07742c6b56bff292c56e882d0e6779485\"" May 27 17:20:55.212504 containerd[2002]: time="2025-05-27T17:20:55.212428625Z" level=info msg="StartContainer for \"baf5a349ba9c5d661403dd5a12c8f5a07742c6b56bff292c56e882d0e6779485\"" May 27 17:20:55.216008 containerd[2002]: time="2025-05-27T17:20:55.215872697Z" level=info msg="connecting to shim baf5a349ba9c5d661403dd5a12c8f5a07742c6b56bff292c56e882d0e6779485" address="unix:///run/containerd/s/6d97480859b37c937efcea624ccf3d660ccc981017c2ca54b01c2fcb8088f6e7" protocol=ttrpc version=3 May 27 17:20:55.256547 systemd[1]: Started cri-containerd-baf5a349ba9c5d661403dd5a12c8f5a07742c6b56bff292c56e882d0e6779485.scope - libcontainer container baf5a349ba9c5d661403dd5a12c8f5a07742c6b56bff292c56e882d0e6779485. May 27 17:20:55.342303 containerd[2002]: time="2025-05-27T17:20:55.342196362Z" level=info msg="StartContainer for \"baf5a349ba9c5d661403dd5a12c8f5a07742c6b56bff292c56e882d0e6779485\" returns successfully" May 27 17:20:56.223021 systemd[1]: cri-containerd-baf5a349ba9c5d661403dd5a12c8f5a07742c6b56bff292c56e882d0e6779485.scope: Deactivated successfully. May 27 17:20:56.226412 systemd[1]: cri-containerd-baf5a349ba9c5d661403dd5a12c8f5a07742c6b56bff292c56e882d0e6779485.scope: Consumed 900ms CPU time, 186.2M memory peak, 165.5M written to disk. May 27 17:20:56.230481 containerd[2002]: time="2025-05-27T17:20:56.230372514Z" level=info msg="received exit event container_id:\"baf5a349ba9c5d661403dd5a12c8f5a07742c6b56bff292c56e882d0e6779485\" id:\"baf5a349ba9c5d661403dd5a12c8f5a07742c6b56bff292c56e882d0e6779485\" pid:4225 exited_at:{seconds:1748366456 nanos:229748682}" May 27 17:20:56.231189 containerd[2002]: time="2025-05-27T17:20:56.230656242Z" level=info msg="TaskExit event in podsandbox handler container_id:\"baf5a349ba9c5d661403dd5a12c8f5a07742c6b56bff292c56e882d0e6779485\" id:\"baf5a349ba9c5d661403dd5a12c8f5a07742c6b56bff292c56e882d0e6779485\" pid:4225 exited_at:{seconds:1748366456 nanos:229748682}" May 27 17:20:56.260206 kubelet[3467]: I0527 17:20:56.260129 3467 kubelet_node_status.go:501] "Fast updating node status as it just became ready" May 27 17:20:56.295078 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-baf5a349ba9c5d661403dd5a12c8f5a07742c6b56bff292c56e882d0e6779485-rootfs.mount: Deactivated successfully. 
May 27 17:20:56.393475 kubelet[3467]: I0527 17:20:56.388202 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/897829e3-eb94-4c94-8bd8-d6fd7e2f0124-config-volume\") pod \"coredns-674b8bbfcf-srvks\" (UID: \"897829e3-eb94-4c94-8bd8-d6fd7e2f0124\") " pod="kube-system/coredns-674b8bbfcf-srvks" May 27 17:20:56.393475 kubelet[3467]: I0527 17:20:56.391636 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9qsj\" (UniqueName: \"kubernetes.io/projected/897829e3-eb94-4c94-8bd8-d6fd7e2f0124-kube-api-access-n9qsj\") pod \"coredns-674b8bbfcf-srvks\" (UID: \"897829e3-eb94-4c94-8bd8-d6fd7e2f0124\") " pod="kube-system/coredns-674b8bbfcf-srvks" May 27 17:20:56.415383 systemd[1]: Created slice kubepods-burstable-pode542b2ab_32f6_486f_a8f7_2027e557168c.slice - libcontainer container kubepods-burstable-pode542b2ab_32f6_486f_a8f7_2027e557168c.slice. May 27 17:20:56.446655 systemd[1]: Created slice kubepods-besteffort-podee528c19_273e_40e5_843b_77df0ad9a5c2.slice - libcontainer container kubepods-besteffort-podee528c19_273e_40e5_843b_77df0ad9a5c2.slice. May 27 17:20:56.461260 systemd[1]: Created slice kubepods-burstable-pod897829e3_eb94_4c94_8bd8_d6fd7e2f0124.slice - libcontainer container kubepods-burstable-pod897829e3_eb94_4c94_8bd8_d6fd7e2f0124.slice. May 27 17:20:56.484060 systemd[1]: Created slice kubepods-besteffort-podeb001738_46fa_4acd_b2cc_f746f3102293.slice - libcontainer container kubepods-besteffort-podeb001738_46fa_4acd_b2cc_f746f3102293.slice. May 27 17:20:56.509357 kubelet[3467]: I0527 17:20:56.492545 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt4hj\" (UniqueName: \"kubernetes.io/projected/e542b2ab-32f6-486f-a8f7-2027e557168c-kube-api-access-vt4hj\") pod \"coredns-674b8bbfcf-8r7td\" (UID: \"e542b2ab-32f6-486f-a8f7-2027e557168c\") " pod="kube-system/coredns-674b8bbfcf-8r7td" May 27 17:20:56.509357 kubelet[3467]: I0527 17:20:56.492701 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee528c19-273e-40e5-843b-77df0ad9a5c2-tigera-ca-bundle\") pod \"calico-kube-controllers-7b9788fff8-dw2zq\" (UID: \"ee528c19-273e-40e5-843b-77df0ad9a5c2\") " pod="calico-system/calico-kube-controllers-7b9788fff8-dw2zq" May 27 17:20:56.509357 kubelet[3467]: I0527 17:20:56.492770 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5cdt\" (UniqueName: \"kubernetes.io/projected/ee528c19-273e-40e5-843b-77df0ad9a5c2-kube-api-access-f5cdt\") pod \"calico-kube-controllers-7b9788fff8-dw2zq\" (UID: \"ee528c19-273e-40e5-843b-77df0ad9a5c2\") " pod="calico-system/calico-kube-controllers-7b9788fff8-dw2zq" May 27 17:20:56.509357 kubelet[3467]: I0527 17:20:56.492847 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e542b2ab-32f6-486f-a8f7-2027e557168c-config-volume\") pod \"coredns-674b8bbfcf-8r7td\" (UID: \"e542b2ab-32f6-486f-a8f7-2027e557168c\") " pod="kube-system/coredns-674b8bbfcf-8r7td" May 27 17:20:56.509357 kubelet[3467]: I0527 17:20:56.492900 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/ab88850e-bafb-4e09-9926-6e07964ce97b-calico-apiserver-certs\") pod \"calico-apiserver-58f774c4bf-692vm\" (UID: \"ab88850e-bafb-4e09-9926-6e07964ce97b\") " pod="calico-apiserver/calico-apiserver-58f774c4bf-692vm" May 27 17:20:56.509691 kubelet[3467]: I0527 17:20:56.493014 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptvz2\" (UniqueName: \"kubernetes.io/projected/ab88850e-bafb-4e09-9926-6e07964ce97b-kube-api-access-ptvz2\") pod \"calico-apiserver-58f774c4bf-692vm\" (UID: \"ab88850e-bafb-4e09-9926-6e07964ce97b\") " pod="calico-apiserver/calico-apiserver-58f774c4bf-692vm" May 27 17:20:56.556856 systemd[1]: Created slice kubepods-besteffort-podab88850e_bafb_4e09_9926_6e07964ce97b.slice - libcontainer container kubepods-besteffort-podab88850e_bafb_4e09_9926_6e07964ce97b.slice. May 27 17:20:56.593469 kubelet[3467]: I0527 17:20:56.593424 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/eb001738-46fa-4acd-b2cc-f746f3102293-whisker-backend-key-pair\") pod \"whisker-5cc64d8466-gcgh9\" (UID: \"eb001738-46fa-4acd-b2cc-f746f3102293\") " pod="calico-system/whisker-5cc64d8466-gcgh9" May 27 17:20:56.596138 kubelet[3467]: I0527 17:20:56.595553 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzmdd\" (UniqueName: \"kubernetes.io/projected/eb001738-46fa-4acd-b2cc-f746f3102293-kube-api-access-kzmdd\") pod \"whisker-5cc64d8466-gcgh9\" (UID: \"eb001738-46fa-4acd-b2cc-f746f3102293\") " pod="calico-system/whisker-5cc64d8466-gcgh9" May 27 17:20:56.596138 kubelet[3467]: I0527 17:20:56.595626 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/46b9e02b-c554-45f6-ba93-5fe36b0b7377-calico-apiserver-certs\") pod \"calico-apiserver-58f774c4bf-tszqg\" (UID: \"46b9e02b-c554-45f6-ba93-5fe36b0b7377\") " pod="calico-apiserver/calico-apiserver-58f774c4bf-tszqg" May 27 17:20:56.596138 kubelet[3467]: I0527 17:20:56.595664 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rjzq\" (UniqueName: \"kubernetes.io/projected/46b9e02b-c554-45f6-ba93-5fe36b0b7377-kube-api-access-6rjzq\") pod \"calico-apiserver-58f774c4bf-tszqg\" (UID: \"46b9e02b-c554-45f6-ba93-5fe36b0b7377\") " pod="calico-apiserver/calico-apiserver-58f774c4bf-tszqg" May 27 17:20:56.596138 kubelet[3467]: I0527 17:20:56.595708 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb001738-46fa-4acd-b2cc-f746f3102293-whisker-ca-bundle\") pod \"whisker-5cc64d8466-gcgh9\" (UID: \"eb001738-46fa-4acd-b2cc-f746f3102293\") " pod="calico-system/whisker-5cc64d8466-gcgh9" May 27 17:20:56.641032 systemd[1]: Created slice kubepods-besteffort-podbbacbf79_f2c8_4feb_9c01_4ff16f7741d3.slice - libcontainer container kubepods-besteffort-podbbacbf79_f2c8_4feb_9c01_4ff16f7741d3.slice. May 27 17:20:56.648913 systemd[1]: Created slice kubepods-besteffort-pod46b9e02b_c554_45f6_ba93_5fe36b0b7377.slice - libcontainer container kubepods-besteffort-pod46b9e02b_c554_45f6_ba93_5fe36b0b7377.slice. 
May 27 17:20:56.696708 kubelet[3467]: I0527 17:20:56.696637 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/bbacbf79-f2c8-4feb-9c01-4ff16f7741d3-goldmane-key-pair\") pod \"goldmane-78d55f7ddc-5lg4z\" (UID: \"bbacbf79-f2c8-4feb-9c01-4ff16f7741d3\") " pod="calico-system/goldmane-78d55f7ddc-5lg4z" May 27 17:20:56.697127 kubelet[3467]: I0527 17:20:56.697030 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbacbf79-f2c8-4feb-9c01-4ff16f7741d3-config\") pod \"goldmane-78d55f7ddc-5lg4z\" (UID: \"bbacbf79-f2c8-4feb-9c01-4ff16f7741d3\") " pod="calico-system/goldmane-78d55f7ddc-5lg4z" May 27 17:20:56.697487 kubelet[3467]: I0527 17:20:56.697392 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbacbf79-f2c8-4feb-9c01-4ff16f7741d3-goldmane-ca-bundle\") pod \"goldmane-78d55f7ddc-5lg4z\" (UID: \"bbacbf79-f2c8-4feb-9c01-4ff16f7741d3\") " pod="calico-system/goldmane-78d55f7ddc-5lg4z" May 27 17:20:56.697672 kubelet[3467]: I0527 17:20:56.697622 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkjj2\" (UniqueName: \"kubernetes.io/projected/bbacbf79-f2c8-4feb-9c01-4ff16f7741d3-kube-api-access-vkjj2\") pod \"goldmane-78d55f7ddc-5lg4z\" (UID: \"bbacbf79-f2c8-4feb-9c01-4ff16f7741d3\") " pod="calico-system/goldmane-78d55f7ddc-5lg4z" May 27 17:20:56.751120 containerd[2002]: time="2025-05-27T17:20:56.751038369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-8r7td,Uid:e542b2ab-32f6-486f-a8f7-2027e557168c,Namespace:kube-system,Attempt:0,}" May 27 17:20:56.761986 containerd[2002]: time="2025-05-27T17:20:56.761629413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b9788fff8-dw2zq,Uid:ee528c19-273e-40e5-843b-77df0ad9a5c2,Namespace:calico-system,Attempt:0,}" May 27 17:20:56.772471 containerd[2002]: time="2025-05-27T17:20:56.772348833Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-srvks,Uid:897829e3-eb94-4c94-8bd8-d6fd7e2f0124,Namespace:kube-system,Attempt:0,}" May 27 17:20:56.811321 containerd[2002]: time="2025-05-27T17:20:56.811204869Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5cc64d8466-gcgh9,Uid:eb001738-46fa-4acd-b2cc-f746f3102293,Namespace:calico-system,Attempt:0,}" May 27 17:20:56.883633 containerd[2002]: time="2025-05-27T17:20:56.883560669Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58f774c4bf-692vm,Uid:ab88850e-bafb-4e09-9926-6e07964ce97b,Namespace:calico-apiserver,Attempt:0,}" May 27 17:20:56.967745 systemd[1]: Created slice kubepods-besteffort-pod0b9f82fa_5aa9_4828_ae1b_71591df99003.slice - libcontainer container kubepods-besteffort-pod0b9f82fa_5aa9_4828_ae1b_71591df99003.slice. 
May 27 17:20:56.977072 containerd[2002]: time="2025-05-27T17:20:56.977002846Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5rhxb,Uid:0b9f82fa-5aa9-4828-ae1b-71591df99003,Namespace:calico-system,Attempt:0,}" May 27 17:20:56.993593 containerd[2002]: time="2025-05-27T17:20:56.988760326Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-5lg4z,Uid:bbacbf79-f2c8-4feb-9c01-4ff16f7741d3,Namespace:calico-system,Attempt:0,}" May 27 17:20:56.997159 containerd[2002]: time="2025-05-27T17:20:56.997103554Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58f774c4bf-tszqg,Uid:46b9e02b-c554-45f6-ba93-5fe36b0b7377,Namespace:calico-apiserver,Attempt:0,}" May 27 17:20:57.255532 containerd[2002]: time="2025-05-27T17:20:57.255473215Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\"" May 27 17:20:57.440549 containerd[2002]: time="2025-05-27T17:20:57.440446184Z" level=error msg="Failed to destroy network for sandbox \"43a3677e4364a25dace51839a4e6a03dfecd69918bac00be26a3dbf61cd7c615\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.447252 systemd[1]: run-netns-cni\x2d8094ebed\x2d702b\x2d389e\x2df8b8\x2d1ab64965e4d7.mount: Deactivated successfully. May 27 17:20:57.455516 containerd[2002]: time="2025-05-27T17:20:57.455349416Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58f774c4bf-692vm,Uid:ab88850e-bafb-4e09-9926-6e07964ce97b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"43a3677e4364a25dace51839a4e6a03dfecd69918bac00be26a3dbf61cd7c615\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.456360 containerd[2002]: time="2025-05-27T17:20:57.455902364Z" level=error msg="Failed to destroy network for sandbox \"9f1969d9a4b3e6ba71576049d0fd38da6ef82b6b203ed2249f86304452f3d205\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.456469 kubelet[3467]: E0527 17:20:57.456130 3467 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43a3677e4364a25dace51839a4e6a03dfecd69918bac00be26a3dbf61cd7c615\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.458351 kubelet[3467]: E0527 17:20:57.456217 3467 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43a3677e4364a25dace51839a4e6a03dfecd69918bac00be26a3dbf61cd7c615\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-58f774c4bf-692vm" May 27 17:20:57.458351 kubelet[3467]: E0527 17:20:57.457787 3467 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"43a3677e4364a25dace51839a4e6a03dfecd69918bac00be26a3dbf61cd7c615\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-58f774c4bf-692vm" May 27 17:20:57.458351 kubelet[3467]: E0527 17:20:57.457956 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-58f774c4bf-692vm_calico-apiserver(ab88850e-bafb-4e09-9926-6e07964ce97b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-58f774c4bf-692vm_calico-apiserver(ab88850e-bafb-4e09-9926-6e07964ce97b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"43a3677e4364a25dace51839a4e6a03dfecd69918bac00be26a3dbf61cd7c615\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-58f774c4bf-692vm" podUID="ab88850e-bafb-4e09-9926-6e07964ce97b" May 27 17:20:57.469996 systemd[1]: run-netns-cni\x2d2cb0d3f0\x2d5d21\x2d0cf6\x2d34d7\x2d102667ecd45f.mount: Deactivated successfully. May 27 17:20:57.473519 containerd[2002]: time="2025-05-27T17:20:57.473429336Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5rhxb,Uid:0b9f82fa-5aa9-4828-ae1b-71591df99003,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9f1969d9a4b3e6ba71576049d0fd38da6ef82b6b203ed2249f86304452f3d205\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.475628 kubelet[3467]: E0527 17:20:57.475535 3467 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9f1969d9a4b3e6ba71576049d0fd38da6ef82b6b203ed2249f86304452f3d205\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.476204 kubelet[3467]: E0527 17:20:57.475636 3467 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9f1969d9a4b3e6ba71576049d0fd38da6ef82b6b203ed2249f86304452f3d205\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5rhxb" May 27 17:20:57.476204 kubelet[3467]: E0527 17:20:57.475673 3467 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9f1969d9a4b3e6ba71576049d0fd38da6ef82b6b203ed2249f86304452f3d205\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5rhxb" May 27 17:20:57.476204 kubelet[3467]: E0527 17:20:57.475763 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-5rhxb_calico-system(0b9f82fa-5aa9-4828-ae1b-71591df99003)\" with CreatePodSandboxError: \"Failed to 
create sandbox for pod \\\"csi-node-driver-5rhxb_calico-system(0b9f82fa-5aa9-4828-ae1b-71591df99003)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9f1969d9a4b3e6ba71576049d0fd38da6ef82b6b203ed2249f86304452f3d205\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-5rhxb" podUID="0b9f82fa-5aa9-4828-ae1b-71591df99003" May 27 17:20:57.497750 containerd[2002]: time="2025-05-27T17:20:57.497676092Z" level=error msg="Failed to destroy network for sandbox \"11c01140d1f3d4423e7f990d1669d1ef6753bcd24576814e60edb8875f595e8c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.501270 containerd[2002]: time="2025-05-27T17:20:57.500483348Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5cc64d8466-gcgh9,Uid:eb001738-46fa-4acd-b2cc-f746f3102293,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"11c01140d1f3d4423e7f990d1669d1ef6753bcd24576814e60edb8875f595e8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.504940 kubelet[3467]: E0527 17:20:57.503573 3467 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"11c01140d1f3d4423e7f990d1669d1ef6753bcd24576814e60edb8875f595e8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.504940 kubelet[3467]: E0527 17:20:57.503683 3467 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"11c01140d1f3d4423e7f990d1669d1ef6753bcd24576814e60edb8875f595e8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5cc64d8466-gcgh9" May 27 17:20:57.504940 kubelet[3467]: E0527 17:20:57.504263 3467 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"11c01140d1f3d4423e7f990d1669d1ef6753bcd24576814e60edb8875f595e8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5cc64d8466-gcgh9" May 27 17:20:57.503753 systemd[1]: run-netns-cni\x2d2d518838\x2dc7eb\x2d07c0\x2d3c7a\x2df7576a2c654a.mount: Deactivated successfully. 
May 27 17:20:57.505351 kubelet[3467]: E0527 17:20:57.504418 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5cc64d8466-gcgh9_calico-system(eb001738-46fa-4acd-b2cc-f746f3102293)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5cc64d8466-gcgh9_calico-system(eb001738-46fa-4acd-b2cc-f746f3102293)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"11c01140d1f3d4423e7f990d1669d1ef6753bcd24576814e60edb8875f595e8c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5cc64d8466-gcgh9" podUID="eb001738-46fa-4acd-b2cc-f746f3102293" May 27 17:20:57.506870 containerd[2002]: time="2025-05-27T17:20:57.505753376Z" level=error msg="Failed to destroy network for sandbox \"fce8fb0fe25d087ad0a517777a34035c7eb01e0ab336b8947ca6d52c9cacd5ac\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.514222 containerd[2002]: time="2025-05-27T17:20:57.513773876Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-8r7td,Uid:e542b2ab-32f6-486f-a8f7-2027e557168c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fce8fb0fe25d087ad0a517777a34035c7eb01e0ab336b8947ca6d52c9cacd5ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.515721 kubelet[3467]: E0527 17:20:57.515502 3467 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fce8fb0fe25d087ad0a517777a34035c7eb01e0ab336b8947ca6d52c9cacd5ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.515721 kubelet[3467]: E0527 17:20:57.515632 3467 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fce8fb0fe25d087ad0a517777a34035c7eb01e0ab336b8947ca6d52c9cacd5ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-8r7td" May 27 17:20:57.515721 kubelet[3467]: E0527 17:20:57.515671 3467 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fce8fb0fe25d087ad0a517777a34035c7eb01e0ab336b8947ca6d52c9cacd5ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-8r7td" May 27 17:20:57.516335 kubelet[3467]: E0527 17:20:57.516272 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-8r7td_kube-system(e542b2ab-32f6-486f-a8f7-2027e557168c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"coredns-674b8bbfcf-8r7td_kube-system(e542b2ab-32f6-486f-a8f7-2027e557168c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fce8fb0fe25d087ad0a517777a34035c7eb01e0ab336b8947ca6d52c9cacd5ac\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-8r7td" podUID="e542b2ab-32f6-486f-a8f7-2027e557168c" May 27 17:20:57.516667 systemd[1]: run-netns-cni\x2d6ab892e1\x2d5560\x2d8fdf\x2de079\x2dba475c3caa3a.mount: Deactivated successfully. May 27 17:20:57.520782 containerd[2002]: time="2025-05-27T17:20:57.520617188Z" level=error msg="Failed to destroy network for sandbox \"c06877fb87ceed70d9c9c74927f16e1de8fd4b79fbd961138d0bb6981e658c45\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.521760 containerd[2002]: time="2025-05-27T17:20:57.521694884Z" level=error msg="Failed to destroy network for sandbox \"cf3b6bb4226d66cbb3258a58b9eb62ab8cc5dc0add1940f94ce02cdfe178b220\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.529001 containerd[2002]: time="2025-05-27T17:20:57.527599556Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-srvks,Uid:897829e3-eb94-4c94-8bd8-d6fd7e2f0124,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c06877fb87ceed70d9c9c74927f16e1de8fd4b79fbd961138d0bb6981e658c45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.529219 kubelet[3467]: E0527 17:20:57.529109 3467 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c06877fb87ceed70d9c9c74927f16e1de8fd4b79fbd961138d0bb6981e658c45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.529219 kubelet[3467]: E0527 17:20:57.529183 3467 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c06877fb87ceed70d9c9c74927f16e1de8fd4b79fbd961138d0bb6981e658c45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-srvks" May 27 17:20:57.530906 kubelet[3467]: E0527 17:20:57.530284 3467 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c06877fb87ceed70d9c9c74927f16e1de8fd4b79fbd961138d0bb6981e658c45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-srvks" May 27 17:20:57.530906 kubelet[3467]: E0527 17:20:57.530396 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"coredns-674b8bbfcf-srvks_kube-system(897829e3-eb94-4c94-8bd8-d6fd7e2f0124)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-srvks_kube-system(897829e3-eb94-4c94-8bd8-d6fd7e2f0124)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c06877fb87ceed70d9c9c74927f16e1de8fd4b79fbd961138d0bb6981e658c45\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-srvks" podUID="897829e3-eb94-4c94-8bd8-d6fd7e2f0124" May 27 17:20:57.531486 containerd[2002]: time="2025-05-27T17:20:57.530703512Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-5lg4z,Uid:bbacbf79-f2c8-4feb-9c01-4ff16f7741d3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf3b6bb4226d66cbb3258a58b9eb62ab8cc5dc0add1940f94ce02cdfe178b220\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.533766 kubelet[3467]: E0527 17:20:57.531108 3467 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf3b6bb4226d66cbb3258a58b9eb62ab8cc5dc0add1940f94ce02cdfe178b220\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.533766 kubelet[3467]: E0527 17:20:57.531357 3467 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf3b6bb4226d66cbb3258a58b9eb62ab8cc5dc0add1940f94ce02cdfe178b220\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-5lg4z" May 27 17:20:57.533766 kubelet[3467]: E0527 17:20:57.531394 3467 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf3b6bb4226d66cbb3258a58b9eb62ab8cc5dc0add1940f94ce02cdfe178b220\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-5lg4z" May 27 17:20:57.533976 kubelet[3467]: E0527 17:20:57.531465 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-78d55f7ddc-5lg4z_calico-system(bbacbf79-f2c8-4feb-9c01-4ff16f7741d3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-78d55f7ddc-5lg4z_calico-system(bbacbf79-f2c8-4feb-9c01-4ff16f7741d3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cf3b6bb4226d66cbb3258a58b9eb62ab8cc5dc0add1940f94ce02cdfe178b220\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-78d55f7ddc-5lg4z" podUID="bbacbf79-f2c8-4feb-9c01-4ff16f7741d3" May 27 17:20:57.536103 containerd[2002]: 
time="2025-05-27T17:20:57.536028608Z" level=error msg="Failed to destroy network for sandbox \"66a738df0159d1fc394c43e554bc1fc5081c5e1d1b26c329f59fcee5f8eada56\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.539211 containerd[2002]: time="2025-05-27T17:20:57.538835924Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b9788fff8-dw2zq,Uid:ee528c19-273e-40e5-843b-77df0ad9a5c2,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"66a738df0159d1fc394c43e554bc1fc5081c5e1d1b26c329f59fcee5f8eada56\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.541407 kubelet[3467]: E0527 17:20:57.540615 3467 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"66a738df0159d1fc394c43e554bc1fc5081c5e1d1b26c329f59fcee5f8eada56\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.541407 kubelet[3467]: E0527 17:20:57.540692 3467 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"66a738df0159d1fc394c43e554bc1fc5081c5e1d1b26c329f59fcee5f8eada56\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b9788fff8-dw2zq" May 27 17:20:57.541407 kubelet[3467]: E0527 17:20:57.540727 3467 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"66a738df0159d1fc394c43e554bc1fc5081c5e1d1b26c329f59fcee5f8eada56\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b9788fff8-dw2zq" May 27 17:20:57.541673 kubelet[3467]: E0527 17:20:57.540801 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7b9788fff8-dw2zq_calico-system(ee528c19-273e-40e5-843b-77df0ad9a5c2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7b9788fff8-dw2zq_calico-system(ee528c19-273e-40e5-843b-77df0ad9a5c2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"66a738df0159d1fc394c43e554bc1fc5081c5e1d1b26c329f59fcee5f8eada56\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7b9788fff8-dw2zq" podUID="ee528c19-273e-40e5-843b-77df0ad9a5c2" May 27 17:20:57.549366 containerd[2002]: time="2025-05-27T17:20:57.549178173Z" level=error msg="Failed to destroy network for sandbox \"d3e520de9491f9984edeed0f58df85235f9346a7cce0658c3614c4bba0cfcb6d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.551953 containerd[2002]: time="2025-05-27T17:20:57.551764749Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58f774c4bf-tszqg,Uid:46b9e02b-c554-45f6-ba93-5fe36b0b7377,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d3e520de9491f9984edeed0f58df85235f9346a7cce0658c3614c4bba0cfcb6d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.552619 kubelet[3467]: E0527 17:20:57.552564 3467 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d3e520de9491f9984edeed0f58df85235f9346a7cce0658c3614c4bba0cfcb6d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:20:57.552737 kubelet[3467]: E0527 17:20:57.552646 3467 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d3e520de9491f9984edeed0f58df85235f9346a7cce0658c3614c4bba0cfcb6d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-58f774c4bf-tszqg" May 27 17:20:57.552737 kubelet[3467]: E0527 17:20:57.552700 3467 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d3e520de9491f9984edeed0f58df85235f9346a7cce0658c3614c4bba0cfcb6d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-58f774c4bf-tszqg" May 27 17:20:57.552861 kubelet[3467]: E0527 17:20:57.552794 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-58f774c4bf-tszqg_calico-apiserver(46b9e02b-c554-45f6-ba93-5fe36b0b7377)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-58f774c4bf-tszqg_calico-apiserver(46b9e02b-c554-45f6-ba93-5fe36b0b7377)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d3e520de9491f9984edeed0f58df85235f9346a7cce0658c3614c4bba0cfcb6d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-58f774c4bf-tszqg" podUID="46b9e02b-c554-45f6-ba93-5fe36b0b7377" May 27 17:20:58.293276 systemd[1]: run-netns-cni\x2dce0a30b4\x2d2a34\x2de2f1\x2df069\x2d14e63b70b839.mount: Deactivated successfully. May 27 17:20:58.293463 systemd[1]: run-netns-cni\x2d46d68f7a\x2d7cff\x2deef2\x2d167d\x2d9c27c2a07446.mount: Deactivated successfully. May 27 17:20:58.293590 systemd[1]: run-netns-cni\x2d0ff0445c\x2d4d99\x2df7fe\x2dc4f1\x2dd3f9b2dcc9f2.mount: Deactivated successfully. May 27 17:20:58.293716 systemd[1]: run-netns-cni\x2d1dae645f\x2d4223\x2d72a0\x2dfb6e\x2db3cf19de4bba.mount: Deactivated successfully. 
May 27 17:21:00.515908 kubelet[3467]: I0527 17:21:00.515459 3467 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:21:03.516142 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1491420001.mount: Deactivated successfully. May 27 17:21:03.560953 containerd[2002]: time="2025-05-27T17:21:03.560886206Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:21:03.562525 containerd[2002]: time="2025-05-27T17:21:03.562473746Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.0: active requests=0, bytes read=150465379" May 27 17:21:03.565263 containerd[2002]: time="2025-05-27T17:21:03.564098846Z" level=info msg="ImageCreate event name:\"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:21:03.567554 containerd[2002]: time="2025-05-27T17:21:03.567455294Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:21:03.569040 containerd[2002]: time="2025-05-27T17:21:03.568993070Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.0\" with image id \"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\", size \"150465241\" in 6.312693355s" May 27 17:21:03.569220 containerd[2002]: time="2025-05-27T17:21:03.569190674Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\" returns image reference \"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\"" May 27 17:21:03.617572 containerd[2002]: time="2025-05-27T17:21:03.617513079Z" level=info msg="CreateContainer within sandbox \"83b3a950cb9e5108b0bba2cb297e7fb3d9af37dcb5cd5396a5ab0ed77d411a35\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 27 17:21:03.634623 containerd[2002]: time="2025-05-27T17:21:03.634558143Z" level=info msg="Container bc57aca4ee67e30223eeeb61cb885001e8bda4a9be75ea156911daab274a2278: CDI devices from CRI Config.CDIDevices: []" May 27 17:21:03.656155 containerd[2002]: time="2025-05-27T17:21:03.656078583Z" level=info msg="CreateContainer within sandbox \"83b3a950cb9e5108b0bba2cb297e7fb3d9af37dcb5cd5396a5ab0ed77d411a35\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"bc57aca4ee67e30223eeeb61cb885001e8bda4a9be75ea156911daab274a2278\"" May 27 17:21:03.657687 containerd[2002]: time="2025-05-27T17:21:03.657637443Z" level=info msg="StartContainer for \"bc57aca4ee67e30223eeeb61cb885001e8bda4a9be75ea156911daab274a2278\"" May 27 17:21:03.661191 containerd[2002]: time="2025-05-27T17:21:03.661066563Z" level=info msg="connecting to shim bc57aca4ee67e30223eeeb61cb885001e8bda4a9be75ea156911daab274a2278" address="unix:///run/containerd/s/6d97480859b37c937efcea624ccf3d660ccc981017c2ca54b01c2fcb8088f6e7" protocol=ttrpc version=3 May 27 17:21:03.701543 systemd[1]: Started cri-containerd-bc57aca4ee67e30223eeeb61cb885001e8bda4a9be75ea156911daab274a2278.scope - libcontainer container bc57aca4ee67e30223eeeb61cb885001e8bda4a9be75ea156911daab274a2278. 
May 27 17:21:03.807079 containerd[2002]: time="2025-05-27T17:21:03.806972932Z" level=info msg="StartContainer for \"bc57aca4ee67e30223eeeb61cb885001e8bda4a9be75ea156911daab274a2278\" returns successfully" May 27 17:21:03.949075 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 27 17:21:03.949214 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. May 27 17:21:04.262734 kubelet[3467]: I0527 17:21:04.262318 3467 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/eb001738-46fa-4acd-b2cc-f746f3102293-whisker-backend-key-pair\") pod \"eb001738-46fa-4acd-b2cc-f746f3102293\" (UID: \"eb001738-46fa-4acd-b2cc-f746f3102293\") " May 27 17:21:04.266080 kubelet[3467]: I0527 17:21:04.263826 3467 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzmdd\" (UniqueName: \"kubernetes.io/projected/eb001738-46fa-4acd-b2cc-f746f3102293-kube-api-access-kzmdd\") pod \"eb001738-46fa-4acd-b2cc-f746f3102293\" (UID: \"eb001738-46fa-4acd-b2cc-f746f3102293\") " May 27 17:21:04.266080 kubelet[3467]: I0527 17:21:04.263889 3467 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb001738-46fa-4acd-b2cc-f746f3102293-whisker-ca-bundle\") pod \"eb001738-46fa-4acd-b2cc-f746f3102293\" (UID: \"eb001738-46fa-4acd-b2cc-f746f3102293\") " May 27 17:21:04.266080 kubelet[3467]: I0527 17:21:04.264626 3467 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb001738-46fa-4acd-b2cc-f746f3102293-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "eb001738-46fa-4acd-b2cc-f746f3102293" (UID: "eb001738-46fa-4acd-b2cc-f746f3102293"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 27 17:21:04.271550 kubelet[3467]: I0527 17:21:04.271467 3467 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb001738-46fa-4acd-b2cc-f746f3102293-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "eb001738-46fa-4acd-b2cc-f746f3102293" (UID: "eb001738-46fa-4acd-b2cc-f746f3102293"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 27 17:21:04.276785 kubelet[3467]: I0527 17:21:04.276729 3467 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb001738-46fa-4acd-b2cc-f746f3102293-kube-api-access-kzmdd" (OuterVolumeSpecName: "kube-api-access-kzmdd") pod "eb001738-46fa-4acd-b2cc-f746f3102293" (UID: "eb001738-46fa-4acd-b2cc-f746f3102293"). InnerVolumeSpecName "kube-api-access-kzmdd". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 27 17:21:04.314963 systemd[1]: Removed slice kubepods-besteffort-podeb001738_46fa_4acd_b2cc_f746f3102293.slice - libcontainer container kubepods-besteffort-podeb001738_46fa_4acd_b2cc_f746f3102293.slice.
May 27 17:21:04.343639 kubelet[3467]: I0527 17:21:04.343443 3467 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-44h7h" podStartSLOduration=1.587844228 podStartE2EDuration="17.34341881s" podCreationTimestamp="2025-05-27 17:20:47 +0000 UTC" firstStartedPulling="2025-05-27 17:20:47.814618692 +0000 UTC m=+30.231498523" lastFinishedPulling="2025-05-27 17:21:03.570193298 +0000 UTC m=+45.987073105" observedRunningTime="2025-05-27 17:21:04.341767238 +0000 UTC m=+46.758647069" watchObservedRunningTime="2025-05-27 17:21:04.34341881 +0000 UTC m=+46.760298629" May 27 17:21:04.364644 kubelet[3467]: I0527 17:21:04.364573 3467 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb001738-46fa-4acd-b2cc-f746f3102293-whisker-ca-bundle\") on node \"ip-172-31-16-30\" DevicePath \"\"" May 27 17:21:04.364644 kubelet[3467]: I0527 17:21:04.364627 3467 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/eb001738-46fa-4acd-b2cc-f746f3102293-whisker-backend-key-pair\") on node \"ip-172-31-16-30\" DevicePath \"\"" May 27 17:21:04.364644 kubelet[3467]: I0527 17:21:04.364652 3467 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kzmdd\" (UniqueName: \"kubernetes.io/projected/eb001738-46fa-4acd-b2cc-f746f3102293-kube-api-access-kzmdd\") on node \"ip-172-31-16-30\" DevicePath \"\"" May 27 17:21:04.480302 systemd[1]: Created slice kubepods-besteffort-pod07c2d869_b946_458d_a95a_719eae16bc54.slice - libcontainer container kubepods-besteffort-pod07c2d869_b946_458d_a95a_719eae16bc54.slice. May 27 17:21:04.517534 systemd[1]: var-lib-kubelet-pods-eb001738\x2d46fa\x2d4acd\x2db2cc\x2df746f3102293-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dkzmdd.mount: Deactivated successfully. May 27 17:21:04.518272 systemd[1]: var-lib-kubelet-pods-eb001738\x2d46fa\x2d4acd\x2db2cc\x2df746f3102293-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
May 27 17:21:04.565762 kubelet[3467]: I0527 17:21:04.565643 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-678pc\" (UniqueName: \"kubernetes.io/projected/07c2d869-b946-458d-a95a-719eae16bc54-kube-api-access-678pc\") pod \"whisker-79c644867b-j6zz7\" (UID: \"07c2d869-b946-458d-a95a-719eae16bc54\") " pod="calico-system/whisker-79c644867b-j6zz7" May 27 17:21:04.565762 kubelet[3467]: I0527 17:21:04.565734 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/07c2d869-b946-458d-a95a-719eae16bc54-whisker-backend-key-pair\") pod \"whisker-79c644867b-j6zz7\" (UID: \"07c2d869-b946-458d-a95a-719eae16bc54\") " pod="calico-system/whisker-79c644867b-j6zz7" May 27 17:21:04.567075 kubelet[3467]: I0527 17:21:04.565782 3467 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07c2d869-b946-458d-a95a-719eae16bc54-whisker-ca-bundle\") pod \"whisker-79c644867b-j6zz7\" (UID: \"07c2d869-b946-458d-a95a-719eae16bc54\") " pod="calico-system/whisker-79c644867b-j6zz7" May 27 17:21:04.715977 containerd[2002]: time="2025-05-27T17:21:04.715822336Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bc57aca4ee67e30223eeeb61cb885001e8bda4a9be75ea156911daab274a2278\" id:\"047af541e74905aa7c659ce75380a832bf0496d5b2dea32954aca2ab2f73da8b\" pid:4550 exit_status:1 exited_at:{seconds:1748366464 nanos:714824284}" May 27 17:21:04.792724 containerd[2002]: time="2025-05-27T17:21:04.791626696Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79c644867b-j6zz7,Uid:07c2d869-b946-458d-a95a-719eae16bc54,Namespace:calico-system,Attempt:0,}" May 27 17:21:05.060605 (udev-worker)[4526]: Network interface NamePolicy= disabled on kernel command line. 
May 27 17:21:05.065965 systemd-networkd[1901]: cali2d02cb2bcff: Link UP May 27 17:21:05.067821 systemd-networkd[1901]: cali2d02cb2bcff: Gained carrier May 27 17:21:05.105101 containerd[2002]: 2025-05-27 17:21:04.850 [INFO][4568] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 17:21:05.105101 containerd[2002]: 2025-05-27 17:21:04.925 [INFO][4568] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--16--30-k8s-whisker--79c644867b--j6zz7-eth0 whisker-79c644867b- calico-system 07c2d869-b946-458d-a95a-719eae16bc54 926 0 2025-05-27 17:21:04 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:79c644867b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ip-172-31-16-30 whisker-79c644867b-j6zz7 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali2d02cb2bcff [] [] }} ContainerID="00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" Namespace="calico-system" Pod="whisker-79c644867b-j6zz7" WorkloadEndpoint="ip--172--31--16--30-k8s-whisker--79c644867b--j6zz7-" May 27 17:21:05.105101 containerd[2002]: 2025-05-27 17:21:04.925 [INFO][4568] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" Namespace="calico-system" Pod="whisker-79c644867b-j6zz7" WorkloadEndpoint="ip--172--31--16--30-k8s-whisker--79c644867b--j6zz7-eth0" May 27 17:21:05.105101 containerd[2002]: 2025-05-27 17:21:04.981 [INFO][4588] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" HandleID="k8s-pod-network.00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" Workload="ip--172--31--16--30-k8s-whisker--79c644867b--j6zz7-eth0" May 27 17:21:05.106242 containerd[2002]: 2025-05-27 17:21:04.981 [INFO][4588] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" HandleID="k8s-pod-network.00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" Workload="ip--172--31--16--30-k8s-whisker--79c644867b--j6zz7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cd020), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-16-30", "pod":"whisker-79c644867b-j6zz7", "timestamp":"2025-05-27 17:21:04.981540425 +0000 UTC"}, Hostname:"ip-172-31-16-30", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:21:05.106242 containerd[2002]: 2025-05-27 17:21:04.981 [INFO][4588] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:21:05.106242 containerd[2002]: 2025-05-27 17:21:04.982 [INFO][4588] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:21:05.106242 containerd[2002]: 2025-05-27 17:21:04.982 [INFO][4588] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-16-30' May 27 17:21:05.106242 containerd[2002]: 2025-05-27 17:21:04.997 [INFO][4588] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" host="ip-172-31-16-30" May 27 17:21:05.106242 containerd[2002]: 2025-05-27 17:21:05.005 [INFO][4588] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-16-30" May 27 17:21:05.106242 containerd[2002]: 2025-05-27 17:21:05.014 [INFO][4588] ipam/ipam.go 511: Trying affinity for 192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:05.106242 containerd[2002]: 2025-05-27 17:21:05.017 [INFO][4588] ipam/ipam.go 158: Attempting to load block cidr=192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:05.106242 containerd[2002]: 2025-05-27 17:21:05.021 [INFO][4588] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:05.106242 containerd[2002]: 2025-05-27 17:21:05.021 [INFO][4588] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.44.64/26 handle="k8s-pod-network.00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" host="ip-172-31-16-30" May 27 17:21:05.106861 containerd[2002]: 2025-05-27 17:21:05.024 [INFO][4588] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62 May 27 17:21:05.106861 containerd[2002]: 2025-05-27 17:21:05.032 [INFO][4588] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.44.64/26 handle="k8s-pod-network.00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" host="ip-172-31-16-30" May 27 17:21:05.106861 containerd[2002]: 2025-05-27 17:21:05.040 [INFO][4588] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.44.65/26] block=192.168.44.64/26 handle="k8s-pod-network.00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" host="ip-172-31-16-30" May 27 17:21:05.106861 containerd[2002]: 2025-05-27 17:21:05.040 [INFO][4588] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.44.65/26] handle="k8s-pod-network.00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" host="ip-172-31-16-30" May 27 17:21:05.106861 containerd[2002]: 2025-05-27 17:21:05.040 [INFO][4588] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:21:05.106861 containerd[2002]: 2025-05-27 17:21:05.040 [INFO][4588] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.44.65/26] IPv6=[] ContainerID="00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" HandleID="k8s-pod-network.00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" Workload="ip--172--31--16--30-k8s-whisker--79c644867b--j6zz7-eth0" May 27 17:21:05.107162 containerd[2002]: 2025-05-27 17:21:05.047 [INFO][4568] cni-plugin/k8s.go 418: Populated endpoint ContainerID="00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" Namespace="calico-system" Pod="whisker-79c644867b-j6zz7" WorkloadEndpoint="ip--172--31--16--30-k8s-whisker--79c644867b--j6zz7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--16--30-k8s-whisker--79c644867b--j6zz7-eth0", GenerateName:"whisker-79c644867b-", Namespace:"calico-system", SelfLink:"", UID:"07c2d869-b946-458d-a95a-719eae16bc54", ResourceVersion:"926", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 21, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"79c644867b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-16-30", ContainerID:"", Pod:"whisker-79c644867b-j6zz7", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.44.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2d02cb2bcff", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:21:05.107162 containerd[2002]: 2025-05-27 17:21:05.048 [INFO][4568] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.65/32] ContainerID="00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" Namespace="calico-system" Pod="whisker-79c644867b-j6zz7" WorkloadEndpoint="ip--172--31--16--30-k8s-whisker--79c644867b--j6zz7-eth0" May 27 17:21:05.107375 containerd[2002]: 2025-05-27 17:21:05.048 [INFO][4568] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2d02cb2bcff ContainerID="00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" Namespace="calico-system" Pod="whisker-79c644867b-j6zz7" WorkloadEndpoint="ip--172--31--16--30-k8s-whisker--79c644867b--j6zz7-eth0" May 27 17:21:05.107375 containerd[2002]: 2025-05-27 17:21:05.068 [INFO][4568] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" Namespace="calico-system" Pod="whisker-79c644867b-j6zz7" WorkloadEndpoint="ip--172--31--16--30-k8s-whisker--79c644867b--j6zz7-eth0" May 27 17:21:05.107476 containerd[2002]: 2025-05-27 17:21:05.069 [INFO][4568] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" Namespace="calico-system" Pod="whisker-79c644867b-j6zz7" 
WorkloadEndpoint="ip--172--31--16--30-k8s-whisker--79c644867b--j6zz7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--16--30-k8s-whisker--79c644867b--j6zz7-eth0", GenerateName:"whisker-79c644867b-", Namespace:"calico-system", SelfLink:"", UID:"07c2d869-b946-458d-a95a-719eae16bc54", ResourceVersion:"926", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 21, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"79c644867b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-16-30", ContainerID:"00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62", Pod:"whisker-79c644867b-j6zz7", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.44.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2d02cb2bcff", MAC:"6e:27:ef:bb:a2:27", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:21:05.107597 containerd[2002]: 2025-05-27 17:21:05.097 [INFO][4568] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" Namespace="calico-system" Pod="whisker-79c644867b-j6zz7" WorkloadEndpoint="ip--172--31--16--30-k8s-whisker--79c644867b--j6zz7-eth0" May 27 17:21:05.141463 containerd[2002]: time="2025-05-27T17:21:05.141322778Z" level=info msg="connecting to shim 00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62" address="unix:///run/containerd/s/624f67efcb8b5749749c689ef56cbe335a8c71bba7f50a4e9f0130c318461038" namespace=k8s.io protocol=ttrpc version=3 May 27 17:21:05.183542 systemd[1]: Started cri-containerd-00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62.scope - libcontainer container 00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62. 
May 27 17:21:05.257389 containerd[2002]: time="2025-05-27T17:21:05.257206263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79c644867b-j6zz7,Uid:07c2d869-b946-458d-a95a-719eae16bc54,Namespace:calico-system,Attempt:0,} returns sandbox id \"00bb9e5a44cb37040690472820cfa15566c634164e5cd42f4d04555488796c62\"" May 27 17:21:05.262758 containerd[2002]: time="2025-05-27T17:21:05.262638075Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:21:05.441981 containerd[2002]: time="2025-05-27T17:21:05.441635584Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bc57aca4ee67e30223eeeb61cb885001e8bda4a9be75ea156911daab274a2278\" id:\"f09cd8d9e9faec220b591ea215fb418252f53b51fbe05e60b9440699cd8a0718\" pid:4659 exit_status:1 exited_at:{seconds:1748366465 nanos:441190144}" May 27 17:21:05.444461 containerd[2002]: time="2025-05-27T17:21:05.444375436Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:21:05.448016 containerd[2002]: time="2025-05-27T17:21:05.447935548Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:21:05.448634 containerd[2002]: time="2025-05-27T17:21:05.447969772Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:21:05.448722 kubelet[3467]: E0527 17:21:05.448266 3467 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:21:05.448722 kubelet[3467]: E0527 17:21:05.448325 3467 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:21:05.450726 kubelet[3467]: E0527 17:21:05.450645 3467 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:5deabb2af4d14acab0df7cb3df816526,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-678pc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79c644867b-j6zz7_calico-system(07c2d869-b946-458d-a95a-719eae16bc54): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:21:05.454427 containerd[2002]: time="2025-05-27T17:21:05.454374904Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:21:05.658763 containerd[2002]: time="2025-05-27T17:21:05.658661633Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:21:05.659924 containerd[2002]: time="2025-05-27T17:21:05.659819765Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:21:05.659924 containerd[2002]: time="2025-05-27T17:21:05.659884421Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:21:05.660930 kubelet[3467]: E0527 17:21:05.660853 3467 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:21:05.661362 kubelet[3467]: E0527 17:21:05.660930 3467 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:21:05.661468 kubelet[3467]: E0527 17:21:05.661111 3467 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-678pc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79c644867b-j6zz7_calico-system(07c2d869-b946-458d-a95a-719eae16bc54): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:21:05.663300 kubelet[3467]: E0527 17:21:05.662797 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79c644867b-j6zz7" podUID="07c2d869-b946-458d-a95a-719eae16bc54" May 27 17:21:05.959865 kubelet[3467]: I0527 17:21:05.959787 3467 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb001738-46fa-4acd-b2cc-f746f3102293" path="/var/lib/kubelet/pods/eb001738-46fa-4acd-b2cc-f746f3102293/volumes" May 27 17:21:06.304538 kubelet[3467]: E0527 17:21:06.304458 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79c644867b-j6zz7" podUID="07c2d869-b946-458d-a95a-719eae16bc54" May 27 17:21:06.723637 systemd-networkd[1901]: cali2d02cb2bcff: Gained IPv6LL May 27 17:21:06.931772 systemd-networkd[1901]: vxlan.calico: Link UP May 27 17:21:06.931796 systemd-networkd[1901]: vxlan.calico: Gained carrier May 27 17:21:06.979792 (udev-worker)[4521]: Network interface NamePolicy= disabled on kernel command line. 
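Both whisker images fail to pull because ghcr.io answers containerd's anonymous token request with 403 Forbidden, so no bearer token is ever issued for the pull. The failing request can be replayed outside containerd using the exact URL from the error above; a minimal Go sketch (the status you get back depends on the registry's current policy for that repository):

    package main

    import (
        "fmt"
        "io"
        "net/http"
    )

    func main() {
        // Replay the anonymous token request that containerd reports as failing.
        // URL copied verbatim from the PullImage error above.
        url := "https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io"
        resp, err := http.Get(url)
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()
        body, _ := io.ReadAll(resp.Body)
        fmt.Println(resp.Status) // 403 Forbidden at the time of this log
        fmt.Println(string(body))
    }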
May 27 17:21:07.953861 containerd[2002]: time="2025-05-27T17:21:07.953510648Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-5lg4z,Uid:bbacbf79-f2c8-4feb-9c01-4ff16f7741d3,Namespace:calico-system,Attempt:0,}" May 27 17:21:08.003705 systemd-networkd[1901]: vxlan.calico: Gained IPv6LL May 27 17:21:08.194026 systemd-networkd[1901]: cali82df9c8dd66: Link UP May 27 17:21:08.196149 systemd-networkd[1901]: cali82df9c8dd66: Gained carrier May 27 17:21:08.223671 containerd[2002]: 2025-05-27 17:21:08.046 [INFO][4863] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--16--30-k8s-goldmane--78d55f7ddc--5lg4z-eth0 goldmane-78d55f7ddc- calico-system bbacbf79-f2c8-4feb-9c01-4ff16f7741d3 865 0 2025-05-27 17:20:46 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:78d55f7ddc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ip-172-31-16-30 goldmane-78d55f7ddc-5lg4z eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali82df9c8dd66 [] [] }} ContainerID="b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" Namespace="calico-system" Pod="goldmane-78d55f7ddc-5lg4z" WorkloadEndpoint="ip--172--31--16--30-k8s-goldmane--78d55f7ddc--5lg4z-" May 27 17:21:08.223671 containerd[2002]: 2025-05-27 17:21:08.046 [INFO][4863] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" Namespace="calico-system" Pod="goldmane-78d55f7ddc-5lg4z" WorkloadEndpoint="ip--172--31--16--30-k8s-goldmane--78d55f7ddc--5lg4z-eth0" May 27 17:21:08.223671 containerd[2002]: 2025-05-27 17:21:08.097 [INFO][4876] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" HandleID="k8s-pod-network.b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" Workload="ip--172--31--16--30-k8s-goldmane--78d55f7ddc--5lg4z-eth0" May 27 17:21:08.223967 containerd[2002]: 2025-05-27 17:21:08.098 [INFO][4876] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" HandleID="k8s-pod-network.b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" Workload="ip--172--31--16--30-k8s-goldmane--78d55f7ddc--5lg4z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d7050), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-16-30", "pod":"goldmane-78d55f7ddc-5lg4z", "timestamp":"2025-05-27 17:21:08.097951049 +0000 UTC"}, Hostname:"ip-172-31-16-30", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:21:08.223967 containerd[2002]: 2025-05-27 17:21:08.098 [INFO][4876] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:21:08.223967 containerd[2002]: 2025-05-27 17:21:08.098 [INFO][4876] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:21:08.223967 containerd[2002]: 2025-05-27 17:21:08.098 [INFO][4876] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-16-30' May 27 17:21:08.223967 containerd[2002]: 2025-05-27 17:21:08.113 [INFO][4876] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" host="ip-172-31-16-30" May 27 17:21:08.223967 containerd[2002]: 2025-05-27 17:21:08.124 [INFO][4876] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-16-30" May 27 17:21:08.223967 containerd[2002]: 2025-05-27 17:21:08.136 [INFO][4876] ipam/ipam.go 511: Trying affinity for 192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:08.223967 containerd[2002]: 2025-05-27 17:21:08.141 [INFO][4876] ipam/ipam.go 158: Attempting to load block cidr=192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:08.223967 containerd[2002]: 2025-05-27 17:21:08.149 [INFO][4876] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:08.223967 containerd[2002]: 2025-05-27 17:21:08.149 [INFO][4876] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.44.64/26 handle="k8s-pod-network.b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" host="ip-172-31-16-30" May 27 17:21:08.225756 containerd[2002]: 2025-05-27 17:21:08.154 [INFO][4876] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0 May 27 17:21:08.225756 containerd[2002]: 2025-05-27 17:21:08.164 [INFO][4876] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.44.64/26 handle="k8s-pod-network.b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" host="ip-172-31-16-30" May 27 17:21:08.225756 containerd[2002]: 2025-05-27 17:21:08.177 [INFO][4876] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.44.66/26] block=192.168.44.64/26 handle="k8s-pod-network.b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" host="ip-172-31-16-30" May 27 17:21:08.225756 containerd[2002]: 2025-05-27 17:21:08.177 [INFO][4876] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.44.66/26] handle="k8s-pod-network.b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" host="ip-172-31-16-30" May 27 17:21:08.225756 containerd[2002]: 2025-05-27 17:21:08.177 [INFO][4876] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:21:08.225756 containerd[2002]: 2025-05-27 17:21:08.177 [INFO][4876] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.44.66/26] IPv6=[] ContainerID="b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" HandleID="k8s-pod-network.b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" Workload="ip--172--31--16--30-k8s-goldmane--78d55f7ddc--5lg4z-eth0" May 27 17:21:08.227062 containerd[2002]: 2025-05-27 17:21:08.181 [INFO][4863] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" Namespace="calico-system" Pod="goldmane-78d55f7ddc-5lg4z" WorkloadEndpoint="ip--172--31--16--30-k8s-goldmane--78d55f7ddc--5lg4z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--16--30-k8s-goldmane--78d55f7ddc--5lg4z-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"bbacbf79-f2c8-4feb-9c01-4ff16f7741d3", ResourceVersion:"865", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 20, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-16-30", ContainerID:"", Pod:"goldmane-78d55f7ddc-5lg4z", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.44.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali82df9c8dd66", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:21:08.227062 containerd[2002]: 2025-05-27 17:21:08.182 [INFO][4863] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.66/32] ContainerID="b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" Namespace="calico-system" Pod="goldmane-78d55f7ddc-5lg4z" WorkloadEndpoint="ip--172--31--16--30-k8s-goldmane--78d55f7ddc--5lg4z-eth0" May 27 17:21:08.227306 containerd[2002]: 2025-05-27 17:21:08.182 [INFO][4863] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali82df9c8dd66 ContainerID="b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" Namespace="calico-system" Pod="goldmane-78d55f7ddc-5lg4z" WorkloadEndpoint="ip--172--31--16--30-k8s-goldmane--78d55f7ddc--5lg4z-eth0" May 27 17:21:08.227306 containerd[2002]: 2025-05-27 17:21:08.198 [INFO][4863] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" Namespace="calico-system" Pod="goldmane-78d55f7ddc-5lg4z" WorkloadEndpoint="ip--172--31--16--30-k8s-goldmane--78d55f7ddc--5lg4z-eth0" May 27 17:21:08.227398 containerd[2002]: 2025-05-27 17:21:08.199 [INFO][4863] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" Namespace="calico-system" Pod="goldmane-78d55f7ddc-5lg4z" 
WorkloadEndpoint="ip--172--31--16--30-k8s-goldmane--78d55f7ddc--5lg4z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--16--30-k8s-goldmane--78d55f7ddc--5lg4z-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"bbacbf79-f2c8-4feb-9c01-4ff16f7741d3", ResourceVersion:"865", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 20, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-16-30", ContainerID:"b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0", Pod:"goldmane-78d55f7ddc-5lg4z", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.44.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali82df9c8dd66", MAC:"42:55:95:6f:01:39", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:21:08.227520 containerd[2002]: 2025-05-27 17:21:08.218 [INFO][4863] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" Namespace="calico-system" Pod="goldmane-78d55f7ddc-5lg4z" WorkloadEndpoint="ip--172--31--16--30-k8s-goldmane--78d55f7ddc--5lg4z-eth0" May 27 17:21:08.268272 containerd[2002]: time="2025-05-27T17:21:08.267208206Z" level=info msg="connecting to shim b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0" address="unix:///run/containerd/s/3fb2371a6a7a4cc4202e00e61294880db4534c5927c27e298f7a4aa3f74d2174" namespace=k8s.io protocol=ttrpc version=3 May 27 17:21:08.324865 systemd[1]: Started cri-containerd-b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0.scope - libcontainer container b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0. 
May 27 17:21:08.425577 containerd[2002]: time="2025-05-27T17:21:08.425359459Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-5lg4z,Uid:bbacbf79-f2c8-4feb-9c01-4ff16f7741d3,Namespace:calico-system,Attempt:0,} returns sandbox id \"b418746874885e3d1d4c679b2e49a066cf605b15186790c283b1738e4e6bd3b0\"" May 27 17:21:08.430970 containerd[2002]: time="2025-05-27T17:21:08.430688119Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:21:08.642344 containerd[2002]: time="2025-05-27T17:21:08.642272252Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:21:08.643610 containerd[2002]: time="2025-05-27T17:21:08.643547504Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:21:08.643737 containerd[2002]: time="2025-05-27T17:21:08.643672568Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:21:08.643993 kubelet[3467]: E0527 17:21:08.643936 3467 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:21:08.644555 kubelet[3467]: E0527 17:21:08.644023 3467 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:21:08.644555 kubelet[3467]: E0527 17:21:08.644325 3467 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vkjj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-5lg4z_calico-system(bbacbf79-f2c8-4feb-9c01-4ff16f7741d3): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:21:08.646171 kubelet[3467]: E0527 17:21:08.646061 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-5lg4z" podUID="bbacbf79-f2c8-4feb-9c01-4ff16f7741d3" May 27 17:21:08.952740 containerd[2002]: time="2025-05-27T17:21:08.951875265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58f774c4bf-tszqg,Uid:46b9e02b-c554-45f6-ba93-5fe36b0b7377,Namespace:calico-apiserver,Attempt:0,}" May 27 17:21:08.952740 containerd[2002]: time="2025-05-27T17:21:08.952269453Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b9788fff8-dw2zq,Uid:ee528c19-273e-40e5-843b-77df0ad9a5c2,Namespace:calico-system,Attempt:0,}" May 27 17:21:08.952740 containerd[2002]: time="2025-05-27T17:21:08.952524273Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58f774c4bf-692vm,Uid:ab88850e-bafb-4e09-9926-6e07964ce97b,Namespace:calico-apiserver,Attempt:0,}" May 27 17:21:09.326015 kubelet[3467]: E0527 17:21:09.325934 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-5lg4z" podUID="bbacbf79-f2c8-4feb-9c01-4ff16f7741d3" May 27 17:21:09.445570 systemd-networkd[1901]: caliab59884af06: Link UP May 27 17:21:09.448419 systemd-networkd[1901]: caliab59884af06: Gained carrier May 27 17:21:09.494340 containerd[2002]: 2025-05-27 17:21:09.182 [INFO][4943] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--16--30-k8s-calico--kube--controllers--7b9788fff8--dw2zq-eth0 calico-kube-controllers-7b9788fff8- calico-system ee528c19-273e-40e5-843b-77df0ad9a5c2 860 0 2025-05-27 17:20:47 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7b9788fff8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ip-172-31-16-30 calico-kube-controllers-7b9788fff8-dw2zq eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] caliab59884af06 [] [] }} ContainerID="bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" Namespace="calico-system" Pod="calico-kube-controllers-7b9788fff8-dw2zq" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--kube--controllers--7b9788fff8--dw2zq-" May 27 17:21:09.494340 containerd[2002]: 2025-05-27 17:21:09.183 [INFO][4943] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" Namespace="calico-system" Pod="calico-kube-controllers-7b9788fff8-dw2zq" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--kube--controllers--7b9788fff8--dw2zq-eth0" May 27 17:21:09.494340 containerd[2002]: 2025-05-27 17:21:09.291 [INFO][4978] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" 
HandleID="k8s-pod-network.bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" Workload="ip--172--31--16--30-k8s-calico--kube--controllers--7b9788fff8--dw2zq-eth0" May 27 17:21:09.495684 containerd[2002]: 2025-05-27 17:21:09.291 [INFO][4978] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" HandleID="k8s-pod-network.bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" Workload="ip--172--31--16--30-k8s-calico--kube--controllers--7b9788fff8--dw2zq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000100310), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-16-30", "pod":"calico-kube-controllers-7b9788fff8-dw2zq", "timestamp":"2025-05-27 17:21:09.291153991 +0000 UTC"}, Hostname:"ip-172-31-16-30", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:21:09.495684 containerd[2002]: 2025-05-27 17:21:09.291 [INFO][4978] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:21:09.495684 containerd[2002]: 2025-05-27 17:21:09.291 [INFO][4978] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:21:09.495684 containerd[2002]: 2025-05-27 17:21:09.291 [INFO][4978] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-16-30' May 27 17:21:09.495684 containerd[2002]: 2025-05-27 17:21:09.329 [INFO][4978] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" host="ip-172-31-16-30" May 27 17:21:09.495684 containerd[2002]: 2025-05-27 17:21:09.349 [INFO][4978] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-16-30" May 27 17:21:09.495684 containerd[2002]: 2025-05-27 17:21:09.365 [INFO][4978] ipam/ipam.go 511: Trying affinity for 192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:09.495684 containerd[2002]: 2025-05-27 17:21:09.376 [INFO][4978] ipam/ipam.go 158: Attempting to load block cidr=192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:09.495684 containerd[2002]: 2025-05-27 17:21:09.391 [INFO][4978] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:09.496169 containerd[2002]: 2025-05-27 17:21:09.391 [INFO][4978] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.44.64/26 handle="k8s-pod-network.bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" host="ip-172-31-16-30" May 27 17:21:09.496169 containerd[2002]: 2025-05-27 17:21:09.398 [INFO][4978] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab May 27 17:21:09.496169 containerd[2002]: 2025-05-27 17:21:09.410 [INFO][4978] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.44.64/26 handle="k8s-pod-network.bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" host="ip-172-31-16-30" May 27 17:21:09.496169 containerd[2002]: 2025-05-27 17:21:09.426 [INFO][4978] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.44.67/26] block=192.168.44.64/26 handle="k8s-pod-network.bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" host="ip-172-31-16-30" May 27 17:21:09.496169 containerd[2002]: 2025-05-27 17:21:09.426 [INFO][4978] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: 
[192.168.44.67/26] handle="k8s-pod-network.bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" host="ip-172-31-16-30" May 27 17:21:09.496169 containerd[2002]: 2025-05-27 17:21:09.426 [INFO][4978] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:21:09.496169 containerd[2002]: 2025-05-27 17:21:09.427 [INFO][4978] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.44.67/26] IPv6=[] ContainerID="bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" HandleID="k8s-pod-network.bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" Workload="ip--172--31--16--30-k8s-calico--kube--controllers--7b9788fff8--dw2zq-eth0" May 27 17:21:09.498970 containerd[2002]: 2025-05-27 17:21:09.436 [INFO][4943] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" Namespace="calico-system" Pod="calico-kube-controllers-7b9788fff8-dw2zq" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--kube--controllers--7b9788fff8--dw2zq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--16--30-k8s-calico--kube--controllers--7b9788fff8--dw2zq-eth0", GenerateName:"calico-kube-controllers-7b9788fff8-", Namespace:"calico-system", SelfLink:"", UID:"ee528c19-273e-40e5-843b-77df0ad9a5c2", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 20, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7b9788fff8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-16-30", ContainerID:"", Pod:"calico-kube-controllers-7b9788fff8-dw2zq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.44.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"caliab59884af06", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:21:09.499134 containerd[2002]: 2025-05-27 17:21:09.437 [INFO][4943] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.67/32] ContainerID="bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" Namespace="calico-system" Pod="calico-kube-controllers-7b9788fff8-dw2zq" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--kube--controllers--7b9788fff8--dw2zq-eth0" May 27 17:21:09.499134 containerd[2002]: 2025-05-27 17:21:09.437 [INFO][4943] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliab59884af06 ContainerID="bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" Namespace="calico-system" Pod="calico-kube-controllers-7b9788fff8-dw2zq" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--kube--controllers--7b9788fff8--dw2zq-eth0" May 27 17:21:09.499134 containerd[2002]: 2025-05-27 17:21:09.451 [INFO][4943] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" Namespace="calico-system" Pod="calico-kube-controllers-7b9788fff8-dw2zq" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--kube--controllers--7b9788fff8--dw2zq-eth0" May 27 17:21:09.500383 containerd[2002]: 2025-05-27 17:21:09.455 [INFO][4943] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" Namespace="calico-system" Pod="calico-kube-controllers-7b9788fff8-dw2zq" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--kube--controllers--7b9788fff8--dw2zq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--16--30-k8s-calico--kube--controllers--7b9788fff8--dw2zq-eth0", GenerateName:"calico-kube-controllers-7b9788fff8-", Namespace:"calico-system", SelfLink:"", UID:"ee528c19-273e-40e5-843b-77df0ad9a5c2", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 20, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7b9788fff8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-16-30", ContainerID:"bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab", Pod:"calico-kube-controllers-7b9788fff8-dw2zq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.44.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"caliab59884af06", MAC:"aa:80:db:6d:9f:07", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:21:09.501578 containerd[2002]: 2025-05-27 17:21:09.487 [INFO][4943] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" Namespace="calico-system" Pod="calico-kube-controllers-7b9788fff8-dw2zq" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--kube--controllers--7b9788fff8--dw2zq-eth0" May 27 17:21:09.571713 containerd[2002]: time="2025-05-27T17:21:09.571593788Z" level=info msg="connecting to shim bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab" address="unix:///run/containerd/s/6f1af3cd3f4ee6a028ebd9259c2fd786f621b782caf527f5810e08062f434a0c" namespace=k8s.io protocol=ttrpc version=3 May 27 17:21:09.608508 systemd-networkd[1901]: cali4f940b5f7d2: Link UP May 27 17:21:09.612625 systemd-networkd[1901]: cali4f940b5f7d2: Gained carrier May 27 17:21:09.693894 containerd[2002]: 2025-05-27 17:21:09.190 [INFO][4947] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--692vm-eth0 calico-apiserver-58f774c4bf- calico-apiserver ab88850e-bafb-4e09-9926-6e07964ce97b 863 0 2025-05-27 17:20:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver 
k8s-app:calico-apiserver pod-template-hash:58f774c4bf projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-16-30 calico-apiserver-58f774c4bf-692vm eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali4f940b5f7d2 [] [] }} ContainerID="065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" Namespace="calico-apiserver" Pod="calico-apiserver-58f774c4bf-692vm" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--692vm-" May 27 17:21:09.693894 containerd[2002]: 2025-05-27 17:21:09.190 [INFO][4947] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" Namespace="calico-apiserver" Pod="calico-apiserver-58f774c4bf-692vm" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--692vm-eth0" May 27 17:21:09.693894 containerd[2002]: 2025-05-27 17:21:09.394 [INFO][4980] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" HandleID="k8s-pod-network.065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" Workload="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--692vm-eth0" May 27 17:21:09.694207 containerd[2002]: 2025-05-27 17:21:09.394 [INFO][4980] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" HandleID="k8s-pod-network.065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" Workload="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--692vm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000330930), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-16-30", "pod":"calico-apiserver-58f774c4bf-692vm", "timestamp":"2025-05-27 17:21:09.389197051 +0000 UTC"}, Hostname:"ip-172-31-16-30", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:21:09.694207 containerd[2002]: 2025-05-27 17:21:09.395 [INFO][4980] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:21:09.694207 containerd[2002]: 2025-05-27 17:21:09.426 [INFO][4980] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
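The ipam_plugin.go lines dump each assignment request as an ipam.AutoAssignArgs literal, which is hard to read inline. Restated with a stand-in struct that mirrors only the fields visible in the log (a local type for readability, not Calico's own definition), the request for calico-apiserver-58f774c4bf-692vm looks like:

    package main

    import "fmt"

    // autoAssignArgs mirrors the fields printed in the log above.
    type autoAssignArgs struct {
        Num4        int               // IPv4 addresses requested (1 per pod)
        Num6        int               // IPv6 addresses requested (0, IPv6 unused here)
        HandleID    string            // k8s-pod-network.<sandbox container ID>
        Attrs       map[string]string // namespace, node, pod, timestamp
        Hostname    string
        IntendedUse string
    }

    func main() {
        req := autoAssignArgs{
            Num4:     1,
            Num6:     0,
            HandleID: "k8s-pod-network.065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf",
            Attrs: map[string]string{
                "namespace": "calico-apiserver",
                "node":      "ip-172-31-16-30",
                "pod":       "calico-apiserver-58f774c4bf-692vm",
            },
            Hostname:    "ip-172-31-16-30",
            IntendedUse: "Workload",
        }
        fmt.Printf("%+v\n", req)
    }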
May 27 17:21:09.694207 containerd[2002]: 2025-05-27 17:21:09.427 [INFO][4980] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-16-30' May 27 17:21:09.694207 containerd[2002]: 2025-05-27 17:21:09.472 [INFO][4980] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" host="ip-172-31-16-30" May 27 17:21:09.694207 containerd[2002]: 2025-05-27 17:21:09.490 [INFO][4980] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-16-30" May 27 17:21:09.694207 containerd[2002]: 2025-05-27 17:21:09.519 [INFO][4980] ipam/ipam.go 511: Trying affinity for 192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:09.694207 containerd[2002]: 2025-05-27 17:21:09.525 [INFO][4980] ipam/ipam.go 158: Attempting to load block cidr=192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:09.694207 containerd[2002]: 2025-05-27 17:21:09.541 [INFO][4980] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:09.694664 containerd[2002]: 2025-05-27 17:21:09.541 [INFO][4980] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.44.64/26 handle="k8s-pod-network.065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" host="ip-172-31-16-30" May 27 17:21:09.694664 containerd[2002]: 2025-05-27 17:21:09.547 [INFO][4980] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf May 27 17:21:09.694664 containerd[2002]: 2025-05-27 17:21:09.558 [INFO][4980] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.44.64/26 handle="k8s-pod-network.065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" host="ip-172-31-16-30" May 27 17:21:09.694664 containerd[2002]: 2025-05-27 17:21:09.575 [INFO][4980] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.44.68/26] block=192.168.44.64/26 handle="k8s-pod-network.065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" host="ip-172-31-16-30" May 27 17:21:09.694664 containerd[2002]: 2025-05-27 17:21:09.576 [INFO][4980] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.44.68/26] handle="k8s-pod-network.065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" host="ip-172-31-16-30" May 27 17:21:09.694664 containerd[2002]: 2025-05-27 17:21:09.577 [INFO][4980] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:21:09.694664 containerd[2002]: 2025-05-27 17:21:09.577 [INFO][4980] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.44.68/26] IPv6=[] ContainerID="065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" HandleID="k8s-pod-network.065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" Workload="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--692vm-eth0" May 27 17:21:09.695070 containerd[2002]: 2025-05-27 17:21:09.585 [INFO][4947] cni-plugin/k8s.go 418: Populated endpoint ContainerID="065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" Namespace="calico-apiserver" Pod="calico-apiserver-58f774c4bf-692vm" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--692vm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--692vm-eth0", GenerateName:"calico-apiserver-58f774c4bf-", Namespace:"calico-apiserver", SelfLink:"", UID:"ab88850e-bafb-4e09-9926-6e07964ce97b", ResourceVersion:"863", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 20, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"58f774c4bf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-16-30", ContainerID:"", Pod:"calico-apiserver-58f774c4bf-692vm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.44.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4f940b5f7d2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:21:09.695204 containerd[2002]: 2025-05-27 17:21:09.586 [INFO][4947] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.68/32] ContainerID="065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" Namespace="calico-apiserver" Pod="calico-apiserver-58f774c4bf-692vm" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--692vm-eth0" May 27 17:21:09.695204 containerd[2002]: 2025-05-27 17:21:09.586 [INFO][4947] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4f940b5f7d2 ContainerID="065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" Namespace="calico-apiserver" Pod="calico-apiserver-58f774c4bf-692vm" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--692vm-eth0" May 27 17:21:09.695204 containerd[2002]: 2025-05-27 17:21:09.616 [INFO][4947] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" Namespace="calico-apiserver" Pod="calico-apiserver-58f774c4bf-692vm" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--692vm-eth0" May 27 17:21:09.701043 containerd[2002]: 2025-05-27 17:21:09.631 [INFO][4947] cni-plugin/k8s.go 446: Added Mac, interface name, and 
active container ID to endpoint ContainerID="065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" Namespace="calico-apiserver" Pod="calico-apiserver-58f774c4bf-692vm" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--692vm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--692vm-eth0", GenerateName:"calico-apiserver-58f774c4bf-", Namespace:"calico-apiserver", SelfLink:"", UID:"ab88850e-bafb-4e09-9926-6e07964ce97b", ResourceVersion:"863", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 20, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"58f774c4bf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-16-30", ContainerID:"065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf", Pod:"calico-apiserver-58f774c4bf-692vm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.44.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4f940b5f7d2", MAC:"62:7d:0e:ad:52:56", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:21:09.701201 containerd[2002]: 2025-05-27 17:21:09.686 [INFO][4947] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" Namespace="calico-apiserver" Pod="calico-apiserver-58f774c4bf-692vm" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--692vm-eth0" May 27 17:21:09.707664 systemd[1]: Started cri-containerd-bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab.scope - libcontainer container bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab. 
May 27 17:21:09.771441 containerd[2002]: time="2025-05-27T17:21:09.771349545Z" level=info msg="connecting to shim 065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf" address="unix:///run/containerd/s/c9622e9489ad86e34b894e85910a4f08040fb9fe118165aeb2e70df086bd89ab" namespace=k8s.io protocol=ttrpc version=3 May 27 17:21:09.827187 systemd-networkd[1901]: cali8239f18a545: Link UP May 27 17:21:09.830910 systemd-networkd[1901]: cali8239f18a545: Gained carrier May 27 17:21:09.889326 containerd[2002]: 2025-05-27 17:21:09.177 [INFO][4940] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--tszqg-eth0 calico-apiserver-58f774c4bf- calico-apiserver 46b9e02b-c554-45f6-ba93-5fe36b0b7377 864 0 2025-05-27 17:20:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:58f774c4bf projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-16-30 calico-apiserver-58f774c4bf-tszqg eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali8239f18a545 [] [] }} ContainerID="8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" Namespace="calico-apiserver" Pod="calico-apiserver-58f774c4bf-tszqg" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--tszqg-" May 27 17:21:09.889326 containerd[2002]: 2025-05-27 17:21:09.177 [INFO][4940] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" Namespace="calico-apiserver" Pod="calico-apiserver-58f774c4bf-tszqg" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--tszqg-eth0" May 27 17:21:09.889326 containerd[2002]: 2025-05-27 17:21:09.395 [INFO][4976] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" HandleID="k8s-pod-network.8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" Workload="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--tszqg-eth0" May 27 17:21:09.889987 containerd[2002]: 2025-05-27 17:21:09.397 [INFO][4976] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" HandleID="k8s-pod-network.8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" Workload="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--tszqg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d7740), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-16-30", "pod":"calico-apiserver-58f774c4bf-tszqg", "timestamp":"2025-05-27 17:21:09.395206375 +0000 UTC"}, Hostname:"ip-172-31-16-30", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:21:09.889987 containerd[2002]: 2025-05-27 17:21:09.397 [INFO][4976] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:21:09.889987 containerd[2002]: 2025-05-27 17:21:09.577 [INFO][4976] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:21:09.889987 containerd[2002]: 2025-05-27 17:21:09.577 [INFO][4976] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-16-30' May 27 17:21:09.889987 containerd[2002]: 2025-05-27 17:21:09.644 [INFO][4976] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" host="ip-172-31-16-30" May 27 17:21:09.889987 containerd[2002]: 2025-05-27 17:21:09.667 [INFO][4976] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-16-30" May 27 17:21:09.889987 containerd[2002]: 2025-05-27 17:21:09.703 [INFO][4976] ipam/ipam.go 511: Trying affinity for 192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:09.889987 containerd[2002]: 2025-05-27 17:21:09.717 [INFO][4976] ipam/ipam.go 158: Attempting to load block cidr=192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:09.889987 containerd[2002]: 2025-05-27 17:21:09.727 [INFO][4976] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:09.894955 containerd[2002]: 2025-05-27 17:21:09.730 [INFO][4976] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.44.64/26 handle="k8s-pod-network.8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" host="ip-172-31-16-30" May 27 17:21:09.894955 containerd[2002]: 2025-05-27 17:21:09.753 [INFO][4976] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942 May 27 17:21:09.894955 containerd[2002]: 2025-05-27 17:21:09.776 [INFO][4976] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.44.64/26 handle="k8s-pod-network.8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" host="ip-172-31-16-30" May 27 17:21:09.894955 containerd[2002]: 2025-05-27 17:21:09.794 [INFO][4976] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.44.69/26] block=192.168.44.64/26 handle="k8s-pod-network.8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" host="ip-172-31-16-30" May 27 17:21:09.894955 containerd[2002]: 2025-05-27 17:21:09.795 [INFO][4976] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.44.69/26] handle="k8s-pod-network.8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" host="ip-172-31-16-30" May 27 17:21:09.894955 containerd[2002]: 2025-05-27 17:21:09.795 [INFO][4976] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:21:09.894955 containerd[2002]: 2025-05-27 17:21:09.795 [INFO][4976] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.44.69/26] IPv6=[] ContainerID="8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" HandleID="k8s-pod-network.8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" Workload="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--tszqg-eth0" May 27 17:21:09.890682 systemd[1]: Started cri-containerd-065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf.scope - libcontainer container 065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf. 
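Two CNI ADD operations run concurrently here and their records interleave: [4947]/[4980] belong to the 692vm pod and [4940]/[4976] to tszqg. The second bracketed number appears to identify the plugin process that wrote the record, so grouping on it separates the interleaved streams. A rough, stdlib-only sketch with abbreviated sample lines:

package main

import (
	"fmt"
	"regexp"
)

// Sketch: group Calico CNI journal records by the second bracketed number,
// which appears to identify the plugin invocation that emitted them.
var idRe = regexp.MustCompile(`\[(INFO|WARNING|ERROR)\]\[(\d+)\]`)

func main() {
	lines := []string{
		`2025-05-27 17:21:09.586 [INFO][4947] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.68/32]`,
		`2025-05-27 17:21:09.397 [INFO][4976] ipam/ipam_plugin.go 265: Auto assigning IP`,
		`2025-05-27 17:21:09.631 [INFO][4947] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint`,
	}
	byID := map[string][]string{}
	for _, l := range lines {
		if m := idRe.FindStringSubmatch(l); m != nil {
			byID[m[2]] = append(byID[m[2]], l)
		}
	}
	for id, ls := range byID {
		fmt.Printf("[%s] %d record(s)\n", id, len(ls))
	}
}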
May 27 17:21:09.897001 containerd[2002]: 2025-05-27 17:21:09.811 [INFO][4940] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" Namespace="calico-apiserver" Pod="calico-apiserver-58f774c4bf-tszqg" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--tszqg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--tszqg-eth0", GenerateName:"calico-apiserver-58f774c4bf-", Namespace:"calico-apiserver", SelfLink:"", UID:"46b9e02b-c554-45f6-ba93-5fe36b0b7377", ResourceVersion:"864", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 20, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"58f774c4bf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-16-30", ContainerID:"", Pod:"calico-apiserver-58f774c4bf-tszqg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.44.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8239f18a545", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:21:09.897140 containerd[2002]: 2025-05-27 17:21:09.812 [INFO][4940] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.69/32] ContainerID="8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" Namespace="calico-apiserver" Pod="calico-apiserver-58f774c4bf-tszqg" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--tszqg-eth0" May 27 17:21:09.897140 containerd[2002]: 2025-05-27 17:21:09.812 [INFO][4940] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8239f18a545 ContainerID="8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" Namespace="calico-apiserver" Pod="calico-apiserver-58f774c4bf-tszqg" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--tszqg-eth0" May 27 17:21:09.897140 containerd[2002]: 2025-05-27 17:21:09.836 [INFO][4940] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" Namespace="calico-apiserver" Pod="calico-apiserver-58f774c4bf-tszqg" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--tszqg-eth0" May 27 17:21:09.897320 containerd[2002]: 2025-05-27 17:21:09.839 [INFO][4940] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" Namespace="calico-apiserver" Pod="calico-apiserver-58f774c4bf-tszqg" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--tszqg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--tszqg-eth0", GenerateName:"calico-apiserver-58f774c4bf-", Namespace:"calico-apiserver", SelfLink:"", UID:"46b9e02b-c554-45f6-ba93-5fe36b0b7377", ResourceVersion:"864", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 20, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"58f774c4bf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-16-30", ContainerID:"8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942", Pod:"calico-apiserver-58f774c4bf-tszqg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.44.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8239f18a545", MAC:"86:21:56:03:73:e5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:21:09.897466 containerd[2002]: 2025-05-27 17:21:09.874 [INFO][4940] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" Namespace="calico-apiserver" Pod="calico-apiserver-58f774c4bf-tszqg" WorkloadEndpoint="ip--172--31--16--30-k8s-calico--apiserver--58f774c4bf--tszqg-eth0" May 27 17:21:09.924516 systemd-networkd[1901]: cali82df9c8dd66: Gained IPv6LL May 27 17:21:09.949846 containerd[2002]: time="2025-05-27T17:21:09.949691782Z" level=info msg="connecting to shim 8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942" address="unix:///run/containerd/s/6812eeb1e9962fd4af231ff4ca800293591a06d95f2b474c2f2c8865eadb1af0" namespace=k8s.io protocol=ttrpc version=3 May 27 17:21:09.956698 containerd[2002]: time="2025-05-27T17:21:09.956622130Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5rhxb,Uid:0b9f82fa-5aa9-4828-ae1b-71591df99003,Namespace:calico-system,Attempt:0,}" May 27 17:21:10.077141 containerd[2002]: time="2025-05-27T17:21:10.077074699Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b9788fff8-dw2zq,Uid:ee528c19-273e-40e5-843b-77df0ad9a5c2,Namespace:calico-system,Attempt:0,} returns sandbox id \"bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab\"" May 27 17:21:10.090772 containerd[2002]: time="2025-05-27T17:21:10.090611239Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\"" May 27 17:21:10.138551 systemd[1]: Started cri-containerd-8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942.scope - libcontainer container 8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942. 
May 27 17:21:10.194304 containerd[2002]: time="2025-05-27T17:21:10.193714987Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58f774c4bf-692vm,Uid:ab88850e-bafb-4e09-9926-6e07964ce97b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf\"" May 27 17:21:10.343286 kubelet[3467]: E0527 17:21:10.341874 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-5lg4z" podUID="bbacbf79-f2c8-4feb-9c01-4ff16f7741d3" May 27 17:21:10.434573 systemd-networkd[1901]: calie1efc29e0ef: Link UP May 27 17:21:10.438100 systemd-networkd[1901]: calie1efc29e0ef: Gained carrier May 27 17:21:10.492681 containerd[2002]: 2025-05-27 17:21:10.171 [INFO][5122] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--16--30-k8s-csi--node--driver--5rhxb-eth0 csi-node-driver- calico-system 0b9f82fa-5aa9-4828-ae1b-71591df99003 734 0 2025-05-27 17:20:47 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:78f6f74485 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ip-172-31-16-30 csi-node-driver-5rhxb eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calie1efc29e0ef [] [] }} ContainerID="7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" Namespace="calico-system" Pod="csi-node-driver-5rhxb" WorkloadEndpoint="ip--172--31--16--30-k8s-csi--node--driver--5rhxb-" May 27 17:21:10.492681 containerd[2002]: 2025-05-27 17:21:10.173 [INFO][5122] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" Namespace="calico-system" Pod="csi-node-driver-5rhxb" WorkloadEndpoint="ip--172--31--16--30-k8s-csi--node--driver--5rhxb-eth0" May 27 17:21:10.492681 containerd[2002]: 2025-05-27 17:21:10.252 [INFO][5173] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" HandleID="k8s-pod-network.7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" Workload="ip--172--31--16--30-k8s-csi--node--driver--5rhxb-eth0" May 27 17:21:10.492967 containerd[2002]: 2025-05-27 17:21:10.252 [INFO][5173] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" HandleID="k8s-pod-network.7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" Workload="ip--172--31--16--30-k8s-csi--node--driver--5rhxb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d7a10), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-16-30", "pod":"csi-node-driver-5rhxb", "timestamp":"2025-05-27 17:21:10.251971004 +0000 UTC"}, 
Hostname:"ip-172-31-16-30", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:21:10.492967 containerd[2002]: 2025-05-27 17:21:10.252 [INFO][5173] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:21:10.492967 containerd[2002]: 2025-05-27 17:21:10.252 [INFO][5173] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:21:10.492967 containerd[2002]: 2025-05-27 17:21:10.253 [INFO][5173] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-16-30' May 27 17:21:10.492967 containerd[2002]: 2025-05-27 17:21:10.288 [INFO][5173] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" host="ip-172-31-16-30" May 27 17:21:10.492967 containerd[2002]: 2025-05-27 17:21:10.300 [INFO][5173] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-16-30" May 27 17:21:10.492967 containerd[2002]: 2025-05-27 17:21:10.311 [INFO][5173] ipam/ipam.go 511: Trying affinity for 192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:10.492967 containerd[2002]: 2025-05-27 17:21:10.329 [INFO][5173] ipam/ipam.go 158: Attempting to load block cidr=192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:10.492967 containerd[2002]: 2025-05-27 17:21:10.341 [INFO][5173] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:10.492967 containerd[2002]: 2025-05-27 17:21:10.341 [INFO][5173] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.44.64/26 handle="k8s-pod-network.7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" host="ip-172-31-16-30" May 27 17:21:10.495694 containerd[2002]: 2025-05-27 17:21:10.345 [INFO][5173] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958 May 27 17:21:10.495694 containerd[2002]: 2025-05-27 17:21:10.366 [INFO][5173] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.44.64/26 handle="k8s-pod-network.7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" host="ip-172-31-16-30" May 27 17:21:10.495694 containerd[2002]: 2025-05-27 17:21:10.416 [INFO][5173] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.44.70/26] block=192.168.44.64/26 handle="k8s-pod-network.7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" host="ip-172-31-16-30" May 27 17:21:10.495694 containerd[2002]: 2025-05-27 17:21:10.417 [INFO][5173] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.44.70/26] handle="k8s-pod-network.7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" host="ip-172-31-16-30" May 27 17:21:10.495694 containerd[2002]: 2025-05-27 17:21:10.417 [INFO][5173] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:21:10.495694 containerd[2002]: 2025-05-27 17:21:10.417 [INFO][5173] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.44.70/26] IPv6=[] ContainerID="7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" HandleID="k8s-pod-network.7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" Workload="ip--172--31--16--30-k8s-csi--node--driver--5rhxb-eth0" May 27 17:21:10.495960 containerd[2002]: 2025-05-27 17:21:10.422 [INFO][5122] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" Namespace="calico-system" Pod="csi-node-driver-5rhxb" WorkloadEndpoint="ip--172--31--16--30-k8s-csi--node--driver--5rhxb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--16--30-k8s-csi--node--driver--5rhxb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0b9f82fa-5aa9-4828-ae1b-71591df99003", ResourceVersion:"734", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 20, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-16-30", ContainerID:"", Pod:"csi-node-driver-5rhxb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.44.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie1efc29e0ef", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:21:10.496098 containerd[2002]: 2025-05-27 17:21:10.423 [INFO][5122] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.70/32] ContainerID="7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" Namespace="calico-system" Pod="csi-node-driver-5rhxb" WorkloadEndpoint="ip--172--31--16--30-k8s-csi--node--driver--5rhxb-eth0" May 27 17:21:10.496098 containerd[2002]: 2025-05-27 17:21:10.426 [INFO][5122] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie1efc29e0ef ContainerID="7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" Namespace="calico-system" Pod="csi-node-driver-5rhxb" WorkloadEndpoint="ip--172--31--16--30-k8s-csi--node--driver--5rhxb-eth0" May 27 17:21:10.496098 containerd[2002]: 2025-05-27 17:21:10.438 [INFO][5122] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" Namespace="calico-system" Pod="csi-node-driver-5rhxb" WorkloadEndpoint="ip--172--31--16--30-k8s-csi--node--driver--5rhxb-eth0" May 27 17:21:10.496826 containerd[2002]: 2025-05-27 17:21:10.440 [INFO][5122] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" 
Namespace="calico-system" Pod="csi-node-driver-5rhxb" WorkloadEndpoint="ip--172--31--16--30-k8s-csi--node--driver--5rhxb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--16--30-k8s-csi--node--driver--5rhxb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0b9f82fa-5aa9-4828-ae1b-71591df99003", ResourceVersion:"734", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 20, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-16-30", ContainerID:"7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958", Pod:"csi-node-driver-5rhxb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.44.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie1efc29e0ef", MAC:"c2:95:3e:03:74:32", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:21:10.497003 containerd[2002]: 2025-05-27 17:21:10.486 [INFO][5122] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" Namespace="calico-system" Pod="csi-node-driver-5rhxb" WorkloadEndpoint="ip--172--31--16--30-k8s-csi--node--driver--5rhxb-eth0" May 27 17:21:10.570396 containerd[2002]: time="2025-05-27T17:21:10.570203637Z" level=info msg="connecting to shim 7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958" address="unix:///run/containerd/s/6f6075ab2d01141f9f5899655e0c4e9e8da170865c80545badaa35e551e06f5b" namespace=k8s.io protocol=ttrpc version=3 May 27 17:21:10.674203 containerd[2002]: time="2025-05-27T17:21:10.674133322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58f774c4bf-tszqg,Uid:46b9e02b-c554-45f6-ba93-5fe36b0b7377,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942\"" May 27 17:21:10.693665 systemd[1]: Started cri-containerd-7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958.scope - libcontainer container 7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958. 
May 27 17:21:10.755543 systemd-networkd[1901]: caliab59884af06: Gained IPv6LL May 27 17:21:10.803166 containerd[2002]: time="2025-05-27T17:21:10.802485670Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5rhxb,Uid:0b9f82fa-5aa9-4828-ae1b-71591df99003,Namespace:calico-system,Attempt:0,} returns sandbox id \"7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958\"" May 27 17:21:10.951751 containerd[2002]: time="2025-05-27T17:21:10.951614651Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-srvks,Uid:897829e3-eb94-4c94-8bd8-d6fd7e2f0124,Namespace:kube-system,Attempt:0,}" May 27 17:21:10.952102 containerd[2002]: time="2025-05-27T17:21:10.951637991Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-8r7td,Uid:e542b2ab-32f6-486f-a8f7-2027e557168c,Namespace:kube-system,Attempt:0,}" May 27 17:21:11.011596 systemd-networkd[1901]: cali4f940b5f7d2: Gained IPv6LL May 27 17:21:11.310762 systemd-networkd[1901]: cali38692d9feb7: Link UP May 27 17:21:11.312024 systemd-networkd[1901]: cali38692d9feb7: Gained carrier May 27 17:21:11.378596 containerd[2002]: 2025-05-27 17:21:11.087 [INFO][5250] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--16--30-k8s-coredns--674b8bbfcf--8r7td-eth0 coredns-674b8bbfcf- kube-system e542b2ab-32f6-486f-a8f7-2027e557168c 859 0 2025-05-27 17:20:23 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-16-30 coredns-674b8bbfcf-8r7td eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali38692d9feb7 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" Namespace="kube-system" Pod="coredns-674b8bbfcf-8r7td" WorkloadEndpoint="ip--172--31--16--30-k8s-coredns--674b8bbfcf--8r7td-" May 27 17:21:11.378596 containerd[2002]: 2025-05-27 17:21:11.088 [INFO][5250] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" Namespace="kube-system" Pod="coredns-674b8bbfcf-8r7td" WorkloadEndpoint="ip--172--31--16--30-k8s-coredns--674b8bbfcf--8r7td-eth0" May 27 17:21:11.378596 containerd[2002]: 2025-05-27 17:21:11.186 [INFO][5271] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" HandleID="k8s-pod-network.24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" Workload="ip--172--31--16--30-k8s-coredns--674b8bbfcf--8r7td-eth0" May 27 17:21:11.378920 containerd[2002]: 2025-05-27 17:21:11.187 [INFO][5271] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" HandleID="k8s-pod-network.24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" Workload="ip--172--31--16--30-k8s-coredns--674b8bbfcf--8r7td-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002310d0), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-16-30", "pod":"coredns-674b8bbfcf-8r7td", "timestamp":"2025-05-27 17:21:11.186510992 +0000 UTC"}, Hostname:"ip-172-31-16-30", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:21:11.378920 containerd[2002]: 2025-05-27 17:21:11.187 [INFO][5271] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:21:11.378920 containerd[2002]: 2025-05-27 17:21:11.187 [INFO][5271] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:21:11.378920 containerd[2002]: 2025-05-27 17:21:11.187 [INFO][5271] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-16-30' May 27 17:21:11.378920 containerd[2002]: 2025-05-27 17:21:11.219 [INFO][5271] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" host="ip-172-31-16-30" May 27 17:21:11.378920 containerd[2002]: 2025-05-27 17:21:11.247 [INFO][5271] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-16-30" May 27 17:21:11.378920 containerd[2002]: 2025-05-27 17:21:11.259 [INFO][5271] ipam/ipam.go 511: Trying affinity for 192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:11.378920 containerd[2002]: 2025-05-27 17:21:11.264 [INFO][5271] ipam/ipam.go 158: Attempting to load block cidr=192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:11.378920 containerd[2002]: 2025-05-27 17:21:11.270 [INFO][5271] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:11.378920 containerd[2002]: 2025-05-27 17:21:11.270 [INFO][5271] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.44.64/26 handle="k8s-pod-network.24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" host="ip-172-31-16-30" May 27 17:21:11.379510 containerd[2002]: 2025-05-27 17:21:11.274 [INFO][5271] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9 May 27 17:21:11.379510 containerd[2002]: 2025-05-27 17:21:11.285 [INFO][5271] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.44.64/26 handle="k8s-pod-network.24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" host="ip-172-31-16-30" May 27 17:21:11.379510 containerd[2002]: 2025-05-27 17:21:11.299 [INFO][5271] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.44.71/26] block=192.168.44.64/26 handle="k8s-pod-network.24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" host="ip-172-31-16-30" May 27 17:21:11.379510 containerd[2002]: 2025-05-27 17:21:11.299 [INFO][5271] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.44.71/26] handle="k8s-pod-network.24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" host="ip-172-31-16-30" May 27 17:21:11.379510 containerd[2002]: 2025-05-27 17:21:11.299 [INFO][5271] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:21:11.379510 containerd[2002]: 2025-05-27 17:21:11.299 [INFO][5271] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.44.71/26] IPv6=[] ContainerID="24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" HandleID="k8s-pod-network.24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" Workload="ip--172--31--16--30-k8s-coredns--674b8bbfcf--8r7td-eth0" May 27 17:21:11.379792 containerd[2002]: 2025-05-27 17:21:11.305 [INFO][5250] cni-plugin/k8s.go 418: Populated endpoint ContainerID="24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" Namespace="kube-system" Pod="coredns-674b8bbfcf-8r7td" WorkloadEndpoint="ip--172--31--16--30-k8s-coredns--674b8bbfcf--8r7td-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--16--30-k8s-coredns--674b8bbfcf--8r7td-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"e542b2ab-32f6-486f-a8f7-2027e557168c", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 20, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-16-30", ContainerID:"", Pod:"coredns-674b8bbfcf-8r7td", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.44.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali38692d9feb7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:21:11.379792 containerd[2002]: 2025-05-27 17:21:11.306 [INFO][5250] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.71/32] ContainerID="24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" Namespace="kube-system" Pod="coredns-674b8bbfcf-8r7td" WorkloadEndpoint="ip--172--31--16--30-k8s-coredns--674b8bbfcf--8r7td-eth0" May 27 17:21:11.379792 containerd[2002]: 2025-05-27 17:21:11.306 [INFO][5250] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali38692d9feb7 ContainerID="24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" Namespace="kube-system" Pod="coredns-674b8bbfcf-8r7td" WorkloadEndpoint="ip--172--31--16--30-k8s-coredns--674b8bbfcf--8r7td-eth0" May 27 17:21:11.379792 containerd[2002]: 2025-05-27 17:21:11.312 [INFO][5250] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" Namespace="kube-system" Pod="coredns-674b8bbfcf-8r7td" 
WorkloadEndpoint="ip--172--31--16--30-k8s-coredns--674b8bbfcf--8r7td-eth0" May 27 17:21:11.379792 containerd[2002]: 2025-05-27 17:21:11.318 [INFO][5250] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" Namespace="kube-system" Pod="coredns-674b8bbfcf-8r7td" WorkloadEndpoint="ip--172--31--16--30-k8s-coredns--674b8bbfcf--8r7td-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--16--30-k8s-coredns--674b8bbfcf--8r7td-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"e542b2ab-32f6-486f-a8f7-2027e557168c", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 20, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-16-30", ContainerID:"24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9", Pod:"coredns-674b8bbfcf-8r7td", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.44.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali38692d9feb7", MAC:"92:1b:3f:ed:4d:88", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:21:11.379792 containerd[2002]: 2025-05-27 17:21:11.358 [INFO][5250] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" Namespace="kube-system" Pod="coredns-674b8bbfcf-8r7td" WorkloadEndpoint="ip--172--31--16--30-k8s-coredns--674b8bbfcf--8r7td-eth0" May 27 17:21:11.487513 containerd[2002]: time="2025-05-27T17:21:11.487431094Z" level=info msg="connecting to shim 24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9" address="unix:///run/containerd/s/415337889a9a2e04954b921f2f1432cf6ffe18e0dd163b30c70c1036dd97ac26" namespace=k8s.io protocol=ttrpc version=3 May 27 17:21:11.537072 systemd-networkd[1901]: calid2e6c19459a: Link UP May 27 17:21:11.545276 systemd-networkd[1901]: calid2e6c19459a: Gained carrier May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.082 [INFO][5244] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--16--30-k8s-coredns--674b8bbfcf--srvks-eth0 coredns-674b8bbfcf- kube-system 897829e3-eb94-4c94-8bd8-d6fd7e2f0124 861 0 2025-05-27 17:20:23 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf 
projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-16-30 coredns-674b8bbfcf-srvks eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid2e6c19459a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" Namespace="kube-system" Pod="coredns-674b8bbfcf-srvks" WorkloadEndpoint="ip--172--31--16--30-k8s-coredns--674b8bbfcf--srvks-" May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.082 [INFO][5244] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" Namespace="kube-system" Pod="coredns-674b8bbfcf-srvks" WorkloadEndpoint="ip--172--31--16--30-k8s-coredns--674b8bbfcf--srvks-eth0" May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.193 [INFO][5269] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" HandleID="k8s-pod-network.e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" Workload="ip--172--31--16--30-k8s-coredns--674b8bbfcf--srvks-eth0" May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.194 [INFO][5269] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" HandleID="k8s-pod-network.e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" Workload="ip--172--31--16--30-k8s-coredns--674b8bbfcf--srvks-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003158f0), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-16-30", "pod":"coredns-674b8bbfcf-srvks", "timestamp":"2025-05-27 17:21:11.193922276 +0000 UTC"}, Hostname:"ip-172-31-16-30", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.194 [INFO][5269] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.300 [INFO][5269] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.300 [INFO][5269] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-16-30' May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.327 [INFO][5269] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" host="ip-172-31-16-30" May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.374 [INFO][5269] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-16-30" May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.403 [INFO][5269] ipam/ipam.go 511: Trying affinity for 192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.410 [INFO][5269] ipam/ipam.go 158: Attempting to load block cidr=192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.419 [INFO][5269] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.44.64/26 host="ip-172-31-16-30" May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.421 [INFO][5269] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.44.64/26 handle="k8s-pod-network.e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" host="ip-172-31-16-30" May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.430 [INFO][5269] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3 May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.462 [INFO][5269] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.44.64/26 handle="k8s-pod-network.e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" host="ip-172-31-16-30" May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.486 [INFO][5269] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.44.72/26] block=192.168.44.64/26 handle="k8s-pod-network.e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" host="ip-172-31-16-30" May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.486 [INFO][5269] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.44.72/26] handle="k8s-pod-network.e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" host="ip-172-31-16-30" May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.486 [INFO][5269] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
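Each IPAM pass above re-confirms this node's affinity for the block 192.168.44.64/26 and then claims the next free address from it (.68 through .72 so far). A small sketch using Go's net/netip, confirming those claims all fall inside that /26 (64 addresses, 192.168.44.64 through 192.168.44.127):

package main

import (
	"fmt"
	"net/netip"
)

// Sketch only: check that the addresses claimed in the records above sit
// inside the affine block that IPAM keeps loading for ip-172-31-16-30.
func main() {
	block := netip.MustParsePrefix("192.168.44.64/26") // 192.168.44.64 - 192.168.44.127
	for _, s := range []string{"192.168.44.68", "192.168.44.69", "192.168.44.70", "192.168.44.71", "192.168.44.72"} {
		addr := netip.MustParseAddr(s)
		fmt.Printf("%s in %s: %v\n", addr, block, block.Contains(addr))
	}
}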
May 27 17:21:11.605971 containerd[2002]: 2025-05-27 17:21:11.486 [INFO][5269] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.44.72/26] IPv6=[] ContainerID="e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" HandleID="k8s-pod-network.e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" Workload="ip--172--31--16--30-k8s-coredns--674b8bbfcf--srvks-eth0" May 27 17:21:11.611995 containerd[2002]: 2025-05-27 17:21:11.505 [INFO][5244] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" Namespace="kube-system" Pod="coredns-674b8bbfcf-srvks" WorkloadEndpoint="ip--172--31--16--30-k8s-coredns--674b8bbfcf--srvks-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--16--30-k8s-coredns--674b8bbfcf--srvks-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"897829e3-eb94-4c94-8bd8-d6fd7e2f0124", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 20, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-16-30", ContainerID:"", Pod:"coredns-674b8bbfcf-srvks", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.44.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid2e6c19459a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:21:11.611995 containerd[2002]: 2025-05-27 17:21:11.508 [INFO][5244] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.72/32] ContainerID="e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" Namespace="kube-system" Pod="coredns-674b8bbfcf-srvks" WorkloadEndpoint="ip--172--31--16--30-k8s-coredns--674b8bbfcf--srvks-eth0" May 27 17:21:11.611995 containerd[2002]: 2025-05-27 17:21:11.510 [INFO][5244] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid2e6c19459a ContainerID="e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" Namespace="kube-system" Pod="coredns-674b8bbfcf-srvks" WorkloadEndpoint="ip--172--31--16--30-k8s-coredns--674b8bbfcf--srvks-eth0" May 27 17:21:11.611995 containerd[2002]: 2025-05-27 17:21:11.553 [INFO][5244] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" Namespace="kube-system" Pod="coredns-674b8bbfcf-srvks" 
WorkloadEndpoint="ip--172--31--16--30-k8s-coredns--674b8bbfcf--srvks-eth0" May 27 17:21:11.611995 containerd[2002]: 2025-05-27 17:21:11.563 [INFO][5244] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" Namespace="kube-system" Pod="coredns-674b8bbfcf-srvks" WorkloadEndpoint="ip--172--31--16--30-k8s-coredns--674b8bbfcf--srvks-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--16--30-k8s-coredns--674b8bbfcf--srvks-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"897829e3-eb94-4c94-8bd8-d6fd7e2f0124", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 20, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-16-30", ContainerID:"e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3", Pod:"coredns-674b8bbfcf-srvks", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.44.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid2e6c19459a", MAC:"36:f4:0b:c6:70:80", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:21:11.611995 containerd[2002]: 2025-05-27 17:21:11.592 [INFO][5244] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" Namespace="kube-system" Pod="coredns-674b8bbfcf-srvks" WorkloadEndpoint="ip--172--31--16--30-k8s-coredns--674b8bbfcf--srvks-eth0" May 27 17:21:11.658650 systemd[1]: Started cri-containerd-24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9.scope - libcontainer container 24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9. May 27 17:21:11.763650 containerd[2002]: time="2025-05-27T17:21:11.763374587Z" level=info msg="connecting to shim e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3" address="unix:///run/containerd/s/7399d433a58e004711bc352b72bda07cf0cbae4c5eb8acd13efc8bb97f6c0ca4" namespace=k8s.io protocol=ttrpc version=3 May 27 17:21:11.779546 systemd-networkd[1901]: cali8239f18a545: Gained IPv6LL May 27 17:21:11.850558 systemd[1]: Started cri-containerd-e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3.scope - libcontainer container e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3. 
May 27 17:21:11.894797 containerd[2002]: time="2025-05-27T17:21:11.894214584Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-8r7td,Uid:e542b2ab-32f6-486f-a8f7-2027e557168c,Namespace:kube-system,Attempt:0,} returns sandbox id \"24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9\"" May 27 17:21:11.914900 containerd[2002]: time="2025-05-27T17:21:11.914834172Z" level=info msg="CreateContainer within sandbox \"24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 17:21:11.951553 containerd[2002]: time="2025-05-27T17:21:11.951482160Z" level=info msg="Container 0f30eff2f0cd236763d753225e1ff3ccefb3e688a03cb84d73c7c6ae58054e91: CDI devices from CRI Config.CDIDevices: []" May 27 17:21:11.986260 containerd[2002]: time="2025-05-27T17:21:11.985727076Z" level=info msg="CreateContainer within sandbox \"24af0b8b306540ad97f069df0ecb18a8d28edd430d02824a7eb023d2b1b8d2e9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"0f30eff2f0cd236763d753225e1ff3ccefb3e688a03cb84d73c7c6ae58054e91\"" May 27 17:21:11.992522 containerd[2002]: time="2025-05-27T17:21:11.991455492Z" level=info msg="StartContainer for \"0f30eff2f0cd236763d753225e1ff3ccefb3e688a03cb84d73c7c6ae58054e91\"" May 27 17:21:11.998926 containerd[2002]: time="2025-05-27T17:21:11.998855352Z" level=info msg="connecting to shim 0f30eff2f0cd236763d753225e1ff3ccefb3e688a03cb84d73c7c6ae58054e91" address="unix:///run/containerd/s/415337889a9a2e04954b921f2f1432cf6ffe18e0dd163b30c70c1036dd97ac26" protocol=ttrpc version=3 May 27 17:21:12.049995 containerd[2002]: time="2025-05-27T17:21:12.049925397Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-srvks,Uid:897829e3-eb94-4c94-8bd8-d6fd7e2f0124,Namespace:kube-system,Attempt:0,} returns sandbox id \"e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3\"" May 27 17:21:12.066212 containerd[2002]: time="2025-05-27T17:21:12.066145917Z" level=info msg="CreateContainer within sandbox \"e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 17:21:12.082055 systemd[1]: Started cri-containerd-0f30eff2f0cd236763d753225e1ff3ccefb3e688a03cb84d73c7c6ae58054e91.scope - libcontainer container 0f30eff2f0cd236763d753225e1ff3ccefb3e688a03cb84d73c7c6ae58054e91. May 27 17:21:12.109947 containerd[2002]: time="2025-05-27T17:21:12.109882449Z" level=info msg="Container af6bd3b23cb1c8dee66401daf6665022e4c641535072f5beec076adc7894d978: CDI devices from CRI Config.CDIDevices: []" May 27 17:21:12.126823 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2103948892.mount: Deactivated successfully. 
May 27 17:21:12.138766 containerd[2002]: time="2025-05-27T17:21:12.138598593Z" level=info msg="CreateContainer within sandbox \"e6cf30fa6ecde6a1982c8559fa3d0e0d975041bd219e8514ee5ee705ec82e6d3\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"af6bd3b23cb1c8dee66401daf6665022e4c641535072f5beec076adc7894d978\"" May 27 17:21:12.140338 containerd[2002]: time="2025-05-27T17:21:12.140264853Z" level=info msg="StartContainer for \"af6bd3b23cb1c8dee66401daf6665022e4c641535072f5beec076adc7894d978\"" May 27 17:21:12.142587 containerd[2002]: time="2025-05-27T17:21:12.142484637Z" level=info msg="connecting to shim af6bd3b23cb1c8dee66401daf6665022e4c641535072f5beec076adc7894d978" address="unix:///run/containerd/s/7399d433a58e004711bc352b72bda07cf0cbae4c5eb8acd13efc8bb97f6c0ca4" protocol=ttrpc version=3 May 27 17:21:12.203578 systemd[1]: Started cri-containerd-af6bd3b23cb1c8dee66401daf6665022e4c641535072f5beec076adc7894d978.scope - libcontainer container af6bd3b23cb1c8dee66401daf6665022e4c641535072f5beec076adc7894d978. May 27 17:21:12.256371 containerd[2002]: time="2025-05-27T17:21:12.255827110Z" level=info msg="StartContainer for \"0f30eff2f0cd236763d753225e1ff3ccefb3e688a03cb84d73c7c6ae58054e91\" returns successfully" May 27 17:21:12.333652 containerd[2002]: time="2025-05-27T17:21:12.333531346Z" level=info msg="StartContainer for \"af6bd3b23cb1c8dee66401daf6665022e4c641535072f5beec076adc7894d978\" returns successfully" May 27 17:21:12.420017 systemd-networkd[1901]: calie1efc29e0ef: Gained IPv6LL May 27 17:21:12.478678 kubelet[3467]: I0527 17:21:12.478481 3467 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-8r7td" podStartSLOduration=49.478457207 podStartE2EDuration="49.478457207s" podCreationTimestamp="2025-05-27 17:20:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:21:12.452175959 +0000 UTC m=+54.869055838" watchObservedRunningTime="2025-05-27 17:21:12.478457207 +0000 UTC m=+54.895337026" May 27 17:21:13.059741 systemd-networkd[1901]: cali38692d9feb7: Gained IPv6LL May 27 17:21:13.187629 systemd-networkd[1901]: calid2e6c19459a: Gained IPv6LL May 27 17:21:13.430894 kubelet[3467]: I0527 17:21:13.430689 3467 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-srvks" podStartSLOduration=50.430663991 podStartE2EDuration="50.430663991s" podCreationTimestamp="2025-05-27 17:20:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:21:12.481902851 +0000 UTC m=+54.898782766" watchObservedRunningTime="2025-05-27 17:21:13.430663991 +0000 UTC m=+55.847543810" May 27 17:21:14.903116 containerd[2002]: time="2025-05-27T17:21:14.902940735Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:21:14.904323 containerd[2002]: time="2025-05-27T17:21:14.904260039Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.0: active requests=0, bytes read=48045219" May 27 17:21:14.905370 containerd[2002]: time="2025-05-27T17:21:14.905279427Z" level=info msg="ImageCreate event name:\"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:21:14.909617 containerd[2002]: 
time="2025-05-27T17:21:14.909471591Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:21:14.911277 containerd[2002]: time="2025-05-27T17:21:14.911049723Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" with image id \"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\", size \"49414428\" in 4.81939338s" May 27 17:21:14.911277 containerd[2002]: time="2025-05-27T17:21:14.911111031Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" returns image reference \"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\"" May 27 17:21:14.914165 containerd[2002]: time="2025-05-27T17:21:14.913363275Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 17:21:14.947112 containerd[2002]: time="2025-05-27T17:21:14.947002515Z" level=info msg="CreateContainer within sandbox \"bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 27 17:21:14.963659 containerd[2002]: time="2025-05-27T17:21:14.963463479Z" level=info msg="Container ffa656aa7f5a9f1ba7a3f9395a1e3593352a8319f7a481b1325981873d3080e7: CDI devices from CRI Config.CDIDevices: []" May 27 17:21:14.974695 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount222058479.mount: Deactivated successfully. May 27 17:21:15.007459 containerd[2002]: time="2025-05-27T17:21:15.007314755Z" level=info msg="CreateContainer within sandbox \"bc3f2c2a1b66700f505e8e359469665c8f389d83dea04b389362e42f61703fab\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"ffa656aa7f5a9f1ba7a3f9395a1e3593352a8319f7a481b1325981873d3080e7\"" May 27 17:21:15.008563 containerd[2002]: time="2025-05-27T17:21:15.008256971Z" level=info msg="StartContainer for \"ffa656aa7f5a9f1ba7a3f9395a1e3593352a8319f7a481b1325981873d3080e7\"" May 27 17:21:15.011820 containerd[2002]: time="2025-05-27T17:21:15.011767631Z" level=info msg="connecting to shim ffa656aa7f5a9f1ba7a3f9395a1e3593352a8319f7a481b1325981873d3080e7" address="unix:///run/containerd/s/6f1af3cd3f4ee6a028ebd9259c2fd786f621b782caf527f5810e08062f434a0c" protocol=ttrpc version=3 May 27 17:21:15.101865 systemd[1]: Started cri-containerd-ffa656aa7f5a9f1ba7a3f9395a1e3593352a8319f7a481b1325981873d3080e7.scope - libcontainer container ffa656aa7f5a9f1ba7a3f9395a1e3593352a8319f7a481b1325981873d3080e7. 
May 27 17:21:15.241800 containerd[2002]: time="2025-05-27T17:21:15.240991008Z" level=info msg="StartContainer for \"ffa656aa7f5a9f1ba7a3f9395a1e3593352a8319f7a481b1325981873d3080e7\" returns successfully" May 27 17:21:15.566118 ntpd[1969]: Listen normally on 8 vxlan.calico 192.168.44.64:123 May 27 17:21:15.566720 ntpd[1969]: 27 May 17:21:15 ntpd[1969]: Listen normally on 8 vxlan.calico 192.168.44.64:123 May 27 17:21:15.566827 ntpd[1969]: Listen normally on 9 cali2d02cb2bcff [fe80::ecee:eeff:feee:eeee%4]:123 May 27 17:21:15.567712 ntpd[1969]: 27 May 17:21:15 ntpd[1969]: Listen normally on 9 cali2d02cb2bcff [fe80::ecee:eeff:feee:eeee%4]:123 May 27 17:21:15.567712 ntpd[1969]: 27 May 17:21:15 ntpd[1969]: Listen normally on 10 vxlan.calico [fe80::6401:84ff:fefc:f50a%5]:123 May 27 17:21:15.567712 ntpd[1969]: 27 May 17:21:15 ntpd[1969]: Listen normally on 11 cali82df9c8dd66 [fe80::ecee:eeff:feee:eeee%8]:123 May 27 17:21:15.567712 ntpd[1969]: 27 May 17:21:15 ntpd[1969]: Listen normally on 12 caliab59884af06 [fe80::ecee:eeff:feee:eeee%9]:123 May 27 17:21:15.567712 ntpd[1969]: 27 May 17:21:15 ntpd[1969]: Listen normally on 13 cali4f940b5f7d2 [fe80::ecee:eeff:feee:eeee%10]:123 May 27 17:21:15.567712 ntpd[1969]: 27 May 17:21:15 ntpd[1969]: Listen normally on 14 cali8239f18a545 [fe80::ecee:eeff:feee:eeee%11]:123 May 27 17:21:15.567712 ntpd[1969]: 27 May 17:21:15 ntpd[1969]: Listen normally on 15 calie1efc29e0ef [fe80::ecee:eeff:feee:eeee%12]:123 May 27 17:21:15.567712 ntpd[1969]: 27 May 17:21:15 ntpd[1969]: Listen normally on 16 cali38692d9feb7 [fe80::ecee:eeff:feee:eeee%13]:123 May 27 17:21:15.567712 ntpd[1969]: 27 May 17:21:15 ntpd[1969]: Listen normally on 17 calid2e6c19459a [fe80::ecee:eeff:feee:eeee%14]:123 May 27 17:21:15.566941 ntpd[1969]: Listen normally on 10 vxlan.calico [fe80::6401:84ff:fefc:f50a%5]:123 May 27 17:21:15.567009 ntpd[1969]: Listen normally on 11 cali82df9c8dd66 [fe80::ecee:eeff:feee:eeee%8]:123 May 27 17:21:15.567107 ntpd[1969]: Listen normally on 12 caliab59884af06 [fe80::ecee:eeff:feee:eeee%9]:123 May 27 17:21:15.567183 ntpd[1969]: Listen normally on 13 cali4f940b5f7d2 [fe80::ecee:eeff:feee:eeee%10]:123 May 27 17:21:15.567299 ntpd[1969]: Listen normally on 14 cali8239f18a545 [fe80::ecee:eeff:feee:eeee%11]:123 May 27 17:21:15.567367 ntpd[1969]: Listen normally on 15 calie1efc29e0ef [fe80::ecee:eeff:feee:eeee%12]:123 May 27 17:21:15.567434 ntpd[1969]: Listen normally on 16 cali38692d9feb7 [fe80::ecee:eeff:feee:eeee%13]:123 May 27 17:21:15.567498 ntpd[1969]: Listen normally on 17 calid2e6c19459a [fe80::ecee:eeff:feee:eeee%14]:123 May 27 17:21:15.586606 containerd[2002]: time="2025-05-27T17:21:15.586538330Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffa656aa7f5a9f1ba7a3f9395a1e3593352a8319f7a481b1325981873d3080e7\" id:\"c6d52a43a2ee5ef438f0d048a686cbdce2b1ada63c71a27b9b9ebc388382dec8\" pid:5529 exited_at:{seconds:1748366475 nanos:585486614}" May 27 17:21:15.623878 kubelet[3467]: I0527 17:21:15.623713 3467 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7b9788fff8-dw2zq" podStartSLOduration=23.799554766 podStartE2EDuration="28.623667638s" podCreationTimestamp="2025-05-27 17:20:47 +0000 UTC" firstStartedPulling="2025-05-27 17:21:10.088848739 +0000 UTC m=+52.505728558" lastFinishedPulling="2025-05-27 17:21:14.912961623 +0000 UTC m=+57.329841430" observedRunningTime="2025-05-27 17:21:15.472534502 +0000 UTC m=+57.889414333" watchObservedRunningTime="2025-05-27 17:21:15.623667638 +0000 UTC 
m=+58.040547457" May 27 17:21:15.636660 systemd[1]: Started sshd@9-172.31.16.30:22-139.178.68.195:38782.service - OpenSSH per-connection server daemon (139.178.68.195:38782). May 27 17:21:15.857881 sshd[5546]: Accepted publickey for core from 139.178.68.195 port 38782 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:21:15.860419 sshd-session[5546]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:21:15.870368 systemd-logind[1976]: New session 10 of user core. May 27 17:21:15.878498 systemd[1]: Started session-10.scope - Session 10 of User core. May 27 17:21:16.213921 sshd[5548]: Connection closed by 139.178.68.195 port 38782 May 27 17:21:16.214758 sshd-session[5546]: pam_unix(sshd:session): session closed for user core May 27 17:21:16.228574 systemd[1]: sshd@9-172.31.16.30:22-139.178.68.195:38782.service: Deactivated successfully. May 27 17:21:16.231324 systemd-logind[1976]: Session 10 logged out. Waiting for processes to exit. May 27 17:21:16.237409 systemd[1]: session-10.scope: Deactivated successfully. May 27 17:21:16.245714 systemd-logind[1976]: Removed session 10. May 27 17:21:19.748626 containerd[2002]: time="2025-05-27T17:21:19.748535095Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:21:19.751464 containerd[2002]: time="2025-05-27T17:21:19.751400347Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=44453213" May 27 17:21:19.754209 containerd[2002]: time="2025-05-27T17:21:19.754118683Z" level=info msg="ImageCreate event name:\"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:21:19.759020 containerd[2002]: time="2025-05-27T17:21:19.758902135Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:21:19.760668 containerd[2002]: time="2025-05-27T17:21:19.760336831Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"45822470\" in 4.84691766s" May 27 17:21:19.760668 containerd[2002]: time="2025-05-27T17:21:19.760434643Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\"" May 27 17:21:19.762197 containerd[2002]: time="2025-05-27T17:21:19.762014299Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 17:21:19.769724 containerd[2002]: time="2025-05-27T17:21:19.769656727Z" level=info msg="CreateContainer within sandbox \"065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 17:21:19.789756 containerd[2002]: time="2025-05-27T17:21:19.789517663Z" level=info msg="Container de85abee51805c07990931fa60407475e3d8afcd3a8d748b497ea51435449a5c: CDI devices from CRI Config.CDIDevices: []" May 27 17:21:19.817098 containerd[2002]: time="2025-05-27T17:21:19.817017967Z" level=info msg="CreateContainer 
within sandbox \"065f3ca7c1b095cd7f940d907976e139880f40f9742bf7cfaaa773cde20afcaf\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"de85abee51805c07990931fa60407475e3d8afcd3a8d748b497ea51435449a5c\"" May 27 17:21:19.818688 containerd[2002]: time="2025-05-27T17:21:19.818596351Z" level=info msg="StartContainer for \"de85abee51805c07990931fa60407475e3d8afcd3a8d748b497ea51435449a5c\"" May 27 17:21:19.822222 containerd[2002]: time="2025-05-27T17:21:19.822101839Z" level=info msg="connecting to shim de85abee51805c07990931fa60407475e3d8afcd3a8d748b497ea51435449a5c" address="unix:///run/containerd/s/c9622e9489ad86e34b894e85910a4f08040fb9fe118165aeb2e70df086bd89ab" protocol=ttrpc version=3 May 27 17:21:19.869528 systemd[1]: Started cri-containerd-de85abee51805c07990931fa60407475e3d8afcd3a8d748b497ea51435449a5c.scope - libcontainer container de85abee51805c07990931fa60407475e3d8afcd3a8d748b497ea51435449a5c. May 27 17:21:19.960353 containerd[2002]: time="2025-05-27T17:21:19.960060644Z" level=info msg="StartContainer for \"de85abee51805c07990931fa60407475e3d8afcd3a8d748b497ea51435449a5c\" returns successfully" May 27 17:21:20.102526 containerd[2002]: time="2025-05-27T17:21:20.101332205Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:21:20.114530 containerd[2002]: time="2025-05-27T17:21:20.114467825Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=77" May 27 17:21:20.119015 containerd[2002]: time="2025-05-27T17:21:20.118955129Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"45822470\" in 356.852678ms" May 27 17:21:20.119188 containerd[2002]: time="2025-05-27T17:21:20.119160965Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\"" May 27 17:21:20.121829 containerd[2002]: time="2025-05-27T17:21:20.121770233Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\"" May 27 17:21:20.130953 containerd[2002]: time="2025-05-27T17:21:20.130905485Z" level=info msg="CreateContainer within sandbox \"8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 17:21:20.147210 containerd[2002]: time="2025-05-27T17:21:20.147158021Z" level=info msg="Container 680d2a812b5ed9b5070b999b4ccb80122291168efdef1eec3d4e09bc0f5572c2: CDI devices from CRI Config.CDIDevices: []" May 27 17:21:20.168968 containerd[2002]: time="2025-05-27T17:21:20.168880433Z" level=info msg="CreateContainer within sandbox \"8c890f1bdc9c9e9bc0201690df94157abcc9fe460c1daac640ad80cef5c09942\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"680d2a812b5ed9b5070b999b4ccb80122291168efdef1eec3d4e09bc0f5572c2\"" May 27 17:21:20.170338 containerd[2002]: time="2025-05-27T17:21:20.170015369Z" level=info msg="StartContainer for \"680d2a812b5ed9b5070b999b4ccb80122291168efdef1eec3d4e09bc0f5572c2\"" May 27 17:21:20.174343 containerd[2002]: time="2025-05-27T17:21:20.174290309Z" level=info msg="connecting to shim 
680d2a812b5ed9b5070b999b4ccb80122291168efdef1eec3d4e09bc0f5572c2" address="unix:///run/containerd/s/6812eeb1e9962fd4af231ff4ca800293591a06d95f2b474c2f2c8865eadb1af0" protocol=ttrpc version=3 May 27 17:21:20.220528 systemd[1]: Started cri-containerd-680d2a812b5ed9b5070b999b4ccb80122291168efdef1eec3d4e09bc0f5572c2.scope - libcontainer container 680d2a812b5ed9b5070b999b4ccb80122291168efdef1eec3d4e09bc0f5572c2. May 27 17:21:20.312579 containerd[2002]: time="2025-05-27T17:21:20.312421758Z" level=info msg="StartContainer for \"680d2a812b5ed9b5070b999b4ccb80122291168efdef1eec3d4e09bc0f5572c2\" returns successfully" May 27 17:21:20.479808 kubelet[3467]: I0527 17:21:20.479553 3467 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-58f774c4bf-692vm" podStartSLOduration=33.917180014 podStartE2EDuration="43.479532414s" podCreationTimestamp="2025-05-27 17:20:37 +0000 UTC" firstStartedPulling="2025-05-27 17:21:10.199431055 +0000 UTC m=+52.616310874" lastFinishedPulling="2025-05-27 17:21:19.761783455 +0000 UTC m=+62.178663274" observedRunningTime="2025-05-27 17:21:20.479309586 +0000 UTC m=+62.896189465" watchObservedRunningTime="2025-05-27 17:21:20.479532414 +0000 UTC m=+62.896412233" May 27 17:21:21.257613 systemd[1]: Started sshd@10-172.31.16.30:22-139.178.68.195:38794.service - OpenSSH per-connection server daemon (139.178.68.195:38794). May 27 17:21:21.472593 kubelet[3467]: I0527 17:21:21.472309 3467 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:21:21.473738 kubelet[3467]: I0527 17:21:21.473647 3467 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:21:21.530372 sshd[5654]: Accepted publickey for core from 139.178.68.195 port 38794 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:21:21.537640 sshd-session[5654]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:21:21.556309 systemd-logind[1976]: New session 11 of user core. May 27 17:21:21.565957 systemd[1]: Started session-11.scope - Session 11 of User core. 
May 27 17:21:21.820254 containerd[2002]: time="2025-05-27T17:21:21.820077501Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:21:21.825883 containerd[2002]: time="2025-05-27T17:21:21.825834321Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.0: active requests=0, bytes read=8226240" May 27 17:21:21.829999 containerd[2002]: time="2025-05-27T17:21:21.829945773Z" level=info msg="ImageCreate event name:\"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:21:21.838173 containerd[2002]: time="2025-05-27T17:21:21.838113573Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:21:21.842403 containerd[2002]: time="2025-05-27T17:21:21.841631565Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.0\" with image id \"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\", size \"9595481\" in 1.719328184s" May 27 17:21:21.842403 containerd[2002]: time="2025-05-27T17:21:21.841697661Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\" returns image reference \"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\"" May 27 17:21:21.850905 containerd[2002]: time="2025-05-27T17:21:21.849743817Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:21:21.868067 containerd[2002]: time="2025-05-27T17:21:21.867469185Z" level=info msg="CreateContainer within sandbox \"7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 27 17:21:21.922745 containerd[2002]: time="2025-05-27T17:21:21.922692838Z" level=info msg="Container 1114ba35cebb76acd93dc3ef1f9314d10037e01ca222a82fbe03269a9bdcfc26: CDI devices from CRI Config.CDIDevices: []" May 27 17:21:21.944163 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1767156508.mount: Deactivated successfully. May 27 17:21:21.947808 containerd[2002]: time="2025-05-27T17:21:21.947681590Z" level=info msg="CreateContainer within sandbox \"7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"1114ba35cebb76acd93dc3ef1f9314d10037e01ca222a82fbe03269a9bdcfc26\"" May 27 17:21:21.950797 containerd[2002]: time="2025-05-27T17:21:21.950738794Z" level=info msg="StartContainer for \"1114ba35cebb76acd93dc3ef1f9314d10037e01ca222a82fbe03269a9bdcfc26\"" May 27 17:21:21.959650 sshd[5660]: Connection closed by 139.178.68.195 port 38794 May 27 17:21:21.960531 sshd-session[5654]: pam_unix(sshd:session): session closed for user core May 27 17:21:21.970004 systemd[1]: sshd@10-172.31.16.30:22-139.178.68.195:38794.service: Deactivated successfully. 
May 27 17:21:21.970905 containerd[2002]: time="2025-05-27T17:21:21.969610498Z" level=info msg="connecting to shim 1114ba35cebb76acd93dc3ef1f9314d10037e01ca222a82fbe03269a9bdcfc26" address="unix:///run/containerd/s/6f6075ab2d01141f9f5899655e0c4e9e8da170865c80545badaa35e551e06f5b" protocol=ttrpc version=3 May 27 17:21:21.975604 systemd[1]: session-11.scope: Deactivated successfully. May 27 17:21:21.985345 systemd-logind[1976]: Session 11 logged out. Waiting for processes to exit. May 27 17:21:22.012122 systemd-logind[1976]: Removed session 11. May 27 17:21:22.044997 systemd[1]: Started cri-containerd-1114ba35cebb76acd93dc3ef1f9314d10037e01ca222a82fbe03269a9bdcfc26.scope - libcontainer container 1114ba35cebb76acd93dc3ef1f9314d10037e01ca222a82fbe03269a9bdcfc26. May 27 17:21:22.048359 containerd[2002]: time="2025-05-27T17:21:22.046565538Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:21:22.051755 containerd[2002]: time="2025-05-27T17:21:22.051669522Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:21:22.051919 containerd[2002]: time="2025-05-27T17:21:22.051831282Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:21:22.052594 kubelet[3467]: E0527 17:21:22.052472 3467 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:21:22.053744 kubelet[3467]: E0527 17:21:22.052566 3467 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:21:22.053744 kubelet[3467]: E0527 17:21:22.053529 3467 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:5deabb2af4d14acab0df7cb3df816526,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-678pc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79c644867b-j6zz7_calico-system(07c2d869-b946-458d-a95a-719eae16bc54): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:21:22.060438 containerd[2002]: time="2025-05-27T17:21:22.060376698Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:21:22.233309 containerd[2002]: time="2025-05-27T17:21:22.232733263Z" level=info msg="StartContainer for \"1114ba35cebb76acd93dc3ef1f9314d10037e01ca222a82fbe03269a9bdcfc26\" returns successfully" May 27 17:21:22.340264 containerd[2002]: time="2025-05-27T17:21:22.340102412Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:21:22.343257 containerd[2002]: time="2025-05-27T17:21:22.342858836Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:21:22.343257 containerd[2002]: time="2025-05-27T17:21:22.342878132Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:21:22.344737 kubelet[3467]: E0527 17:21:22.343279 3467 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown 
desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:21:22.344737 kubelet[3467]: E0527 17:21:22.343342 3467 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:21:22.344737 kubelet[3467]: E0527 17:21:22.343798 3467 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-678pc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79c644867b-j6zz7_calico-system(07c2d869-b946-458d-a95a-719eae16bc54): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:21:22.348007 kubelet[3467]: E0527 17:21:22.345604 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to 
\"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79c644867b-j6zz7" podUID="07c2d869-b946-458d-a95a-719eae16bc54" May 27 17:21:22.357260 containerd[2002]: time="2025-05-27T17:21:22.353786168Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\"" May 27 17:21:24.889630 containerd[2002]: time="2025-05-27T17:21:24.889550064Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:21:24.892401 containerd[2002]: time="2025-05-27T17:21:24.892341696Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0: active requests=0, bytes read=13749925" May 27 17:21:24.894834 containerd[2002]: time="2025-05-27T17:21:24.894770148Z" level=info msg="ImageCreate event name:\"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:21:24.899171 containerd[2002]: time="2025-05-27T17:21:24.899083752Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:21:24.900448 containerd[2002]: time="2025-05-27T17:21:24.900346332Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" with image id \"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\", size \"15119118\" in 2.542726752s" May 27 17:21:24.900448 containerd[2002]: time="2025-05-27T17:21:24.900402252Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" returns image reference \"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\"" May 27 17:21:24.902781 containerd[2002]: time="2025-05-27T17:21:24.902723232Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:21:24.910733 containerd[2002]: time="2025-05-27T17:21:24.910661748Z" level=info msg="CreateContainer within sandbox \"7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 27 17:21:24.933776 containerd[2002]: time="2025-05-27T17:21:24.932489173Z" level=info msg="Container feb4329851c2ff55f1fbae6355bc94ecec143b5337d956b4d2c42d41dcd1c2a6: CDI devices from CRI Config.CDIDevices: []" May 27 17:21:24.957215 containerd[2002]: 
time="2025-05-27T17:21:24.957137869Z" level=info msg="CreateContainer within sandbox \"7f270e26aee0e43d5dc5e8f3c539d8be6d3030b6db18d4d3bfe34b757fc72958\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"feb4329851c2ff55f1fbae6355bc94ecec143b5337d956b4d2c42d41dcd1c2a6\"" May 27 17:21:24.958890 containerd[2002]: time="2025-05-27T17:21:24.958632817Z" level=info msg="StartContainer for \"feb4329851c2ff55f1fbae6355bc94ecec143b5337d956b4d2c42d41dcd1c2a6\"" May 27 17:21:24.962127 containerd[2002]: time="2025-05-27T17:21:24.962073577Z" level=info msg="connecting to shim feb4329851c2ff55f1fbae6355bc94ecec143b5337d956b4d2c42d41dcd1c2a6" address="unix:///run/containerd/s/6f6075ab2d01141f9f5899655e0c4e9e8da170865c80545badaa35e551e06f5b" protocol=ttrpc version=3 May 27 17:21:25.007657 systemd[1]: Started cri-containerd-feb4329851c2ff55f1fbae6355bc94ecec143b5337d956b4d2c42d41dcd1c2a6.scope - libcontainer container feb4329851c2ff55f1fbae6355bc94ecec143b5337d956b4d2c42d41dcd1c2a6. May 27 17:21:25.089269 containerd[2002]: time="2025-05-27T17:21:25.088948425Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:21:25.093660 containerd[2002]: time="2025-05-27T17:21:25.093554817Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:21:25.094151 containerd[2002]: time="2025-05-27T17:21:25.093736401Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:21:25.095012 kubelet[3467]: E0527 17:21:25.094758 3467 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:21:25.095012 kubelet[3467]: E0527 17:21:25.094941 3467 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:21:25.099361 kubelet[3467]: E0527 17:21:25.099209 3467 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vkjj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-5lg4z_calico-system(bbacbf79-f2c8-4feb-9c01-4ff16f7741d3): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:21:25.101466 kubelet[3467]: E0527 17:21:25.101373 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-5lg4z" podUID="bbacbf79-f2c8-4feb-9c01-4ff16f7741d3" May 27 17:21:25.103030 containerd[2002]: time="2025-05-27T17:21:25.102950889Z" level=info msg="StartContainer for \"feb4329851c2ff55f1fbae6355bc94ecec143b5337d956b4d2c42d41dcd1c2a6\" returns successfully" May 27 17:21:25.170005 kubelet[3467]: I0527 17:21:25.169722 3467 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 27 17:21:25.170005 kubelet[3467]: I0527 17:21:25.169786 3467 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 27 17:21:25.527143 kubelet[3467]: I0527 17:21:25.526934 3467 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-58f774c4bf-tszqg" podStartSLOduration=39.084836604 podStartE2EDuration="48.526911359s" podCreationTimestamp="2025-05-27 17:20:37 +0000 UTC" firstStartedPulling="2025-05-27 17:21:10.67889383 +0000 UTC m=+53.095773649" lastFinishedPulling="2025-05-27 17:21:20.120968573 +0000 UTC m=+62.537848404" observedRunningTime="2025-05-27 17:21:20.508022515 +0000 UTC m=+62.924902358" watchObservedRunningTime="2025-05-27 17:21:25.526911359 +0000 UTC m=+67.943791166" May 27 17:21:26.998552 systemd[1]: Started sshd@11-172.31.16.30:22-139.178.68.195:45618.service - OpenSSH per-connection server daemon (139.178.68.195:45618). May 27 17:21:27.209873 sshd[5745]: Accepted publickey for core from 139.178.68.195 port 45618 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:21:27.212952 sshd-session[5745]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:21:27.223327 systemd-logind[1976]: New session 12 of user core. May 27 17:21:27.228560 systemd[1]: Started session-12.scope - Session 12 of User core. May 27 17:21:27.509289 sshd[5747]: Connection closed by 139.178.68.195 port 45618 May 27 17:21:27.509017 sshd-session[5745]: pam_unix(sshd:session): session closed for user core May 27 17:21:27.522691 systemd[1]: sshd@11-172.31.16.30:22-139.178.68.195:45618.service: Deactivated successfully. May 27 17:21:27.528037 systemd[1]: session-12.scope: Deactivated successfully. May 27 17:21:27.530193 systemd-logind[1976]: Session 12 logged out. Waiting for processes to exit. May 27 17:21:27.549756 systemd[1]: Started sshd@12-172.31.16.30:22-139.178.68.195:45632.service - OpenSSH per-connection server daemon (139.178.68.195:45632). May 27 17:21:27.552884 systemd-logind[1976]: Removed session 12. May 27 17:21:27.749681 sshd[5767]: Accepted publickey for core from 139.178.68.195 port 45632 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:21:27.752376 sshd-session[5767]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:21:27.762607 systemd-logind[1976]: New session 13 of user core. May 27 17:21:27.771481 systemd[1]: Started session-13.scope - Session 13 of User core. May 27 17:21:28.128550 sshd[5769]: Connection closed by 139.178.68.195 port 45632 May 27 17:21:28.130119 sshd-session[5767]: pam_unix(sshd:session): session closed for user core May 27 17:21:28.147956 systemd[1]: sshd@12-172.31.16.30:22-139.178.68.195:45632.service: Deactivated successfully. 
May 27 17:21:28.161111 systemd[1]: session-13.scope: Deactivated successfully. May 27 17:21:28.166888 systemd-logind[1976]: Session 13 logged out. Waiting for processes to exit. May 27 17:21:28.196669 systemd[1]: Started sshd@13-172.31.16.30:22-139.178.68.195:45638.service - OpenSSH per-connection server daemon (139.178.68.195:45638). May 27 17:21:28.202905 systemd-logind[1976]: Removed session 13. May 27 17:21:28.413483 sshd[5779]: Accepted publickey for core from 139.178.68.195 port 45638 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:21:28.416518 sshd-session[5779]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:21:28.425626 systemd-logind[1976]: New session 14 of user core. May 27 17:21:28.436505 systemd[1]: Started session-14.scope - Session 14 of User core. May 27 17:21:28.728662 sshd[5782]: Connection closed by 139.178.68.195 port 45638 May 27 17:21:28.729506 sshd-session[5779]: pam_unix(sshd:session): session closed for user core May 27 17:21:28.737217 systemd-logind[1976]: Session 14 logged out. Waiting for processes to exit. May 27 17:21:28.737547 systemd[1]: sshd@13-172.31.16.30:22-139.178.68.195:45638.service: Deactivated successfully. May 27 17:21:28.742403 systemd[1]: session-14.scope: Deactivated successfully. May 27 17:21:28.747917 systemd-logind[1976]: Removed session 14. May 27 17:21:33.766699 systemd[1]: Started sshd@14-172.31.16.30:22-139.178.68.195:55380.service - OpenSSH per-connection server daemon (139.178.68.195:55380). May 27 17:21:33.961251 sshd[5801]: Accepted publickey for core from 139.178.68.195 port 55380 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:21:33.964013 sshd-session[5801]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:21:33.972368 systemd-logind[1976]: New session 15 of user core. May 27 17:21:33.981508 systemd[1]: Started session-15.scope - Session 15 of User core. May 27 17:21:34.239342 sshd[5803]: Connection closed by 139.178.68.195 port 55380 May 27 17:21:34.240296 sshd-session[5801]: pam_unix(sshd:session): session closed for user core May 27 17:21:34.247770 systemd[1]: sshd@14-172.31.16.30:22-139.178.68.195:55380.service: Deactivated successfully. May 27 17:21:34.251099 systemd[1]: session-15.scope: Deactivated successfully. May 27 17:21:34.254523 systemd-logind[1976]: Session 15 logged out. Waiting for processes to exit. May 27 17:21:34.257936 systemd-logind[1976]: Removed session 15. 
May 27 17:21:34.954768 kubelet[3467]: E0527 17:21:34.954155 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79c644867b-j6zz7" podUID="07c2d869-b946-458d-a95a-719eae16bc54" May 27 17:21:34.976259 kubelet[3467]: I0527 17:21:34.976014 3467 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-5rhxb" podStartSLOduration=33.88236728 podStartE2EDuration="47.975993238s" podCreationTimestamp="2025-05-27 17:20:47 +0000 UTC" firstStartedPulling="2025-05-27 17:21:10.808523158 +0000 UTC m=+53.225402977" lastFinishedPulling="2025-05-27 17:21:24.902149116 +0000 UTC m=+67.319028935" observedRunningTime="2025-05-27 17:21:25.528601271 +0000 UTC m=+67.945481174" watchObservedRunningTime="2025-05-27 17:21:34.975993238 +0000 UTC m=+77.392873057" May 27 17:21:35.445568 containerd[2002]: time="2025-05-27T17:21:35.445481433Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bc57aca4ee67e30223eeeb61cb885001e8bda4a9be75ea156911daab274a2278\" id:\"6d8d16cca7e2a0594197885ca72d624d41879071720d1224f3f0fe5176c50a7e\" pid:5827 exit_status:1 exited_at:{seconds:1748366495 nanos:444888621}" May 27 17:21:36.951944 kubelet[3467]: E0527 17:21:36.951857 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-5lg4z" podUID="bbacbf79-f2c8-4feb-9c01-4ff16f7741d3" May 27 17:21:37.243093 containerd[2002]: time="2025-05-27T17:21:37.242782126Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffa656aa7f5a9f1ba7a3f9395a1e3593352a8319f7a481b1325981873d3080e7\" id:\"0edb1e31d056da0a15e0533193f4490c9285f48f22478177f44e1f2e259b5dca\" pid:5852 exited_at:{seconds:1748366497 nanos:242362090}" May 27 17:21:39.278340 systemd[1]: Started sshd@15-172.31.16.30:22-139.178.68.195:55388.service - OpenSSH per-connection server daemon (139.178.68.195:55388). 
May 27 17:21:39.488953 sshd[5863]: Accepted publickey for core from 139.178.68.195 port 55388 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:21:39.492806 sshd-session[5863]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:21:39.502516 systemd-logind[1976]: New session 16 of user core. May 27 17:21:39.510518 systemd[1]: Started session-16.scope - Session 16 of User core. May 27 17:21:39.826977 sshd[5865]: Connection closed by 139.178.68.195 port 55388 May 27 17:21:39.826072 sshd-session[5863]: pam_unix(sshd:session): session closed for user core May 27 17:21:39.832608 systemd[1]: sshd@15-172.31.16.30:22-139.178.68.195:55388.service: Deactivated successfully. May 27 17:21:39.835886 systemd[1]: session-16.scope: Deactivated successfully. May 27 17:21:39.840486 systemd-logind[1976]: Session 16 logged out. Waiting for processes to exit. May 27 17:21:39.844003 systemd-logind[1976]: Removed session 16. May 27 17:21:44.871140 systemd[1]: Started sshd@16-172.31.16.30:22-139.178.68.195:52934.service - OpenSSH per-connection server daemon (139.178.68.195:52934). May 27 17:21:45.074351 sshd[5879]: Accepted publickey for core from 139.178.68.195 port 52934 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:21:45.077723 sshd-session[5879]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:21:45.090510 systemd-logind[1976]: New session 17 of user core. May 27 17:21:45.101532 systemd[1]: Started session-17.scope - Session 17 of User core. May 27 17:21:45.202071 kubelet[3467]: I0527 17:21:45.201154 3467 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:21:45.442633 sshd[5881]: Connection closed by 139.178.68.195 port 52934 May 27 17:21:45.445875 sshd-session[5879]: pam_unix(sshd:session): session closed for user core May 27 17:21:45.456073 systemd[1]: sshd@16-172.31.16.30:22-139.178.68.195:52934.service: Deactivated successfully. May 27 17:21:45.470931 systemd[1]: session-17.scope: Deactivated successfully. May 27 17:21:45.475674 systemd-logind[1976]: Session 17 logged out. Waiting for processes to exit. May 27 17:21:45.481609 systemd-logind[1976]: Removed session 17. 
May 27 17:21:45.541880 containerd[2002]: time="2025-05-27T17:21:45.541507303Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffa656aa7f5a9f1ba7a3f9395a1e3593352a8319f7a481b1325981873d3080e7\" id:\"8fd6c0f4811236bc8cdd6f68b4da83e0b324ba8a1c4181f82241689c178fb48d\" pid:5905 exited_at:{seconds:1748366505 nanos:540535615}" May 27 17:21:48.521575 kubelet[3467]: I0527 17:21:48.521304 3467 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:21:48.953469 containerd[2002]: time="2025-05-27T17:21:48.953396892Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:21:49.147605 containerd[2002]: time="2025-05-27T17:21:49.147525189Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:21:49.149956 containerd[2002]: time="2025-05-27T17:21:49.149825817Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:21:49.150189 containerd[2002]: time="2025-05-27T17:21:49.149877105Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:21:49.150953 kubelet[3467]: E0527 17:21:49.150429 3467 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:21:49.150953 kubelet[3467]: E0527 17:21:49.150496 3467 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:21:49.150953 kubelet[3467]: E0527 17:21:49.150727 3467 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vkjj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-5lg4z_calico-system(bbacbf79-f2c8-4feb-9c01-4ff16f7741d3): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:21:49.152549 kubelet[3467]: E0527 17:21:49.152360 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-5lg4z" podUID="bbacbf79-f2c8-4feb-9c01-4ff16f7741d3" May 27 17:21:49.954918 containerd[2002]: time="2025-05-27T17:21:49.954804313Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:21:50.153241 containerd[2002]: time="2025-05-27T17:21:50.153150886Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:21:50.155379 containerd[2002]: time="2025-05-27T17:21:50.155273578Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:21:50.155379 containerd[2002]: time="2025-05-27T17:21:50.155339302Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:21:50.155676 kubelet[3467]: E0527 17:21:50.155634 3467 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:21:50.156208 kubelet[3467]: E0527 17:21:50.155695 3467 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:21:50.159385 kubelet[3467]: E0527 17:21:50.159195 3467 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:5deabb2af4d14acab0df7cb3df816526,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-678pc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79c644867b-j6zz7_calico-system(07c2d869-b946-458d-a95a-719eae16bc54): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:21:50.165869 containerd[2002]: time="2025-05-27T17:21:50.164837962Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:21:50.435409 containerd[2002]: time="2025-05-27T17:21:50.435345659Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:21:50.437503 containerd[2002]: time="2025-05-27T17:21:50.437437631Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:21:50.437730 containerd[2002]: time="2025-05-27T17:21:50.437572343Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:21:50.437846 kubelet[3467]: E0527 17:21:50.437765 3467 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:21:50.437846 kubelet[3467]: E0527 17:21:50.437826 3467 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:21:50.438616 kubelet[3467]: E0527 17:21:50.438385 3467 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-678pc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79c644867b-j6zz7_calico-system(07c2d869-b946-458d-a95a-719eae16bc54): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:21:50.439892 kubelet[3467]: E0527 17:21:50.439787 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79c644867b-j6zz7" podUID="07c2d869-b946-458d-a95a-719eae16bc54" May 27 17:21:50.479950 systemd[1]: Started sshd@17-172.31.16.30:22-139.178.68.195:52946.service - OpenSSH per-connection server daemon (139.178.68.195:52946). May 27 17:21:50.682974 sshd[5926]: Accepted publickey for core from 139.178.68.195 port 52946 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:21:50.686438 sshd-session[5926]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:21:50.696479 systemd-logind[1976]: New session 18 of user core. May 27 17:21:50.706470 systemd[1]: Started session-18.scope - Session 18 of User core. May 27 17:21:50.975267 sshd[5928]: Connection closed by 139.178.68.195 port 52946 May 27 17:21:50.976347 sshd-session[5926]: pam_unix(sshd:session): session closed for user core May 27 17:21:50.983537 systemd[1]: sshd@17-172.31.16.30:22-139.178.68.195:52946.service: Deactivated successfully. May 27 17:21:50.989147 systemd[1]: session-18.scope: Deactivated successfully. May 27 17:21:50.993211 systemd-logind[1976]: Session 18 logged out. Waiting for processes to exit. May 27 17:21:50.996680 systemd-logind[1976]: Removed session 18. May 27 17:21:51.020324 systemd[1]: Started sshd@18-172.31.16.30:22-139.178.68.195:52962.service - OpenSSH per-connection server daemon (139.178.68.195:52962). May 27 17:21:51.220286 sshd[5940]: Accepted publickey for core from 139.178.68.195 port 52962 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:21:51.222983 sshd-session[5940]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:21:51.231382 systemd-logind[1976]: New session 19 of user core. May 27 17:21:51.238501 systemd[1]: Started session-19.scope - Session 19 of User core. May 27 17:21:51.789092 sshd[5942]: Connection closed by 139.178.68.195 port 52962 May 27 17:21:51.790221 sshd-session[5940]: pam_unix(sshd:session): session closed for user core May 27 17:21:51.797332 systemd[1]: sshd@18-172.31.16.30:22-139.178.68.195:52962.service: Deactivated successfully. May 27 17:21:51.801832 systemd[1]: session-19.scope: Deactivated successfully. May 27 17:21:51.803620 systemd-logind[1976]: Session 19 logged out. Waiting for processes to exit. May 27 17:21:51.807716 systemd-logind[1976]: Removed session 19. May 27 17:21:51.826178 systemd[1]: Started sshd@19-172.31.16.30:22-139.178.68.195:52964.service - OpenSSH per-connection server daemon (139.178.68.195:52964). May 27 17:21:52.037772 sshd[5952]: Accepted publickey for core from 139.178.68.195 port 52964 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:21:52.040459 sshd-session[5952]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:21:52.050664 systemd-logind[1976]: New session 20 of user core. 
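The goldmane, whisker, and whisker-backend pulls above all fail at the same step: containerd's anonymous token request to ghcr.io returns 403 Forbidden before any image data is transferred (hence "bytes read=86"). A minimal way to reproduce that token request outside the kubelet/containerd stack is sketched below; it assumes plain anonymous access and only reuses the token URL that already appears in the log.

```python
import urllib.error
import urllib.request

# Reproduce the anonymous token request containerd makes before pulling
# ghcr.io/flatcar/calico/goldmane:v3.30.0 (URL copied from the log above).
TOKEN_URL = ("https://ghcr.io/token"
             "?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull"
             "&service=ghcr.io")

try:
    with urllib.request.urlopen(TOKEN_URL, timeout=10) as resp:
        print("token endpoint:", resp.status)       # 200 would mean anonymous pulls can authorize
except urllib.error.HTTPError as err:
    print("token endpoint:", err.code, err.reason)  # 403 matches the failures in the log
```

A 403 on an anonymous token request, with no credentials involved, usually points at the registry side (package visibility or rate limiting) rather than at kubelet or containerd configuration.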
May 27 17:21:52.061520 systemd[1]: Started session-20.scope - Session 20 of User core. May 27 17:21:53.503898 sshd[5956]: Connection closed by 139.178.68.195 port 52964 May 27 17:21:53.504888 sshd-session[5952]: pam_unix(sshd:session): session closed for user core May 27 17:21:53.519509 systemd[1]: sshd@19-172.31.16.30:22-139.178.68.195:52964.service: Deactivated successfully. May 27 17:21:53.528023 systemd[1]: session-20.scope: Deactivated successfully. May 27 17:21:53.530575 systemd-logind[1976]: Session 20 logged out. Waiting for processes to exit. May 27 17:21:53.555734 systemd[1]: Started sshd@20-172.31.16.30:22-139.178.68.195:35360.service - OpenSSH per-connection server daemon (139.178.68.195:35360). May 27 17:21:53.560822 systemd-logind[1976]: Removed session 20. May 27 17:21:53.769782 sshd[5970]: Accepted publickey for core from 139.178.68.195 port 35360 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:21:53.772403 sshd-session[5970]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:21:53.786547 systemd-logind[1976]: New session 21 of user core. May 27 17:21:53.793593 systemd[1]: Started session-21.scope - Session 21 of User core. May 27 17:21:54.379742 sshd[5975]: Connection closed by 139.178.68.195 port 35360 May 27 17:21:54.381582 sshd-session[5970]: pam_unix(sshd:session): session closed for user core May 27 17:21:54.391480 systemd[1]: sshd@20-172.31.16.30:22-139.178.68.195:35360.service: Deactivated successfully. May 27 17:21:54.397081 systemd[1]: session-21.scope: Deactivated successfully. May 27 17:21:54.404911 systemd-logind[1976]: Session 21 logged out. Waiting for processes to exit. May 27 17:21:54.426680 systemd[1]: Started sshd@21-172.31.16.30:22-139.178.68.195:35362.service - OpenSSH per-connection server daemon (139.178.68.195:35362). May 27 17:21:54.435305 systemd-logind[1976]: Removed session 21. May 27 17:21:54.644019 sshd[5987]: Accepted publickey for core from 139.178.68.195 port 35362 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:21:54.647322 sshd-session[5987]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:21:54.660846 systemd-logind[1976]: New session 22 of user core. May 27 17:21:54.668049 systemd[1]: Started session-22.scope - Session 22 of User core. May 27 17:21:54.950015 sshd[5989]: Connection closed by 139.178.68.195 port 35362 May 27 17:21:54.951573 sshd-session[5987]: pam_unix(sshd:session): session closed for user core May 27 17:21:54.961403 systemd[1]: sshd@21-172.31.16.30:22-139.178.68.195:35362.service: Deactivated successfully. May 27 17:21:54.970989 systemd[1]: session-22.scope: Deactivated successfully. May 27 17:21:54.974035 systemd-logind[1976]: Session 22 logged out. Waiting for processes to exit. May 27 17:21:54.979870 systemd-logind[1976]: Removed session 22. May 27 17:21:59.988590 systemd[1]: Started sshd@22-172.31.16.30:22-139.178.68.195:35368.service - OpenSSH per-connection server daemon (139.178.68.195:35368). May 27 17:22:00.183758 sshd[6000]: Accepted publickey for core from 139.178.68.195 port 35368 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:22:00.186143 sshd-session[6000]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:22:00.194473 systemd-logind[1976]: New session 23 of user core. May 27 17:22:00.202495 systemd[1]: Started session-23.scope - Session 23 of User core. 
May 27 17:22:00.449290 sshd[6002]: Connection closed by 139.178.68.195 port 35368 May 27 17:22:00.450107 sshd-session[6000]: pam_unix(sshd:session): session closed for user core May 27 17:22:00.458977 systemd-logind[1976]: Session 23 logged out. Waiting for processes to exit. May 27 17:22:00.459620 systemd[1]: sshd@22-172.31.16.30:22-139.178.68.195:35368.service: Deactivated successfully. May 27 17:22:00.464768 systemd[1]: session-23.scope: Deactivated successfully. May 27 17:22:00.469179 systemd-logind[1976]: Removed session 23. May 27 17:22:03.954603 kubelet[3467]: E0527 17:22:03.954000 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-5lg4z" podUID="bbacbf79-f2c8-4feb-9c01-4ff16f7741d3" May 27 17:22:05.424275 containerd[2002]: time="2025-05-27T17:22:05.424161866Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bc57aca4ee67e30223eeeb61cb885001e8bda4a9be75ea156911daab274a2278\" id:\"a709d970b2676ffed13176171f6b7601e9a32bceab02b54a9e7d2c666c617829\" pid:6028 exited_at:{seconds:1748366525 nanos:423029018}" May 27 17:22:05.487764 systemd[1]: Started sshd@23-172.31.16.30:22-139.178.68.195:52594.service - OpenSSH per-connection server daemon (139.178.68.195:52594). May 27 17:22:05.561560 update_engine[1978]: I20250527 17:22:05.561475 1978 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs May 27 17:22:05.561560 update_engine[1978]: I20250527 17:22:05.561554 1978 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs May 27 17:22:05.562133 update_engine[1978]: I20250527 17:22:05.561888 1978 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs May 27 17:22:05.564729 update_engine[1978]: I20250527 17:22:05.562722 1978 omaha_request_params.cc:62] Current group set to alpha May 27 17:22:05.564729 update_engine[1978]: I20250527 17:22:05.562889 1978 update_attempter.cc:499] Already updated boot flags. Skipping. May 27 17:22:05.564729 update_engine[1978]: I20250527 17:22:05.562909 1978 update_attempter.cc:643] Scheduling an action processor start. 
May 27 17:22:05.564729 update_engine[1978]: I20250527 17:22:05.562939 1978 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 27 17:22:05.564729 update_engine[1978]: I20250527 17:22:05.562996 1978 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs May 27 17:22:05.564729 update_engine[1978]: I20250527 17:22:05.563095 1978 omaha_request_action.cc:271] Posting an Omaha request to disabled May 27 17:22:05.564729 update_engine[1978]: I20250527 17:22:05.563113 1978 omaha_request_action.cc:272] Request: May 27 17:22:05.564729 update_engine[1978]: May 27 17:22:05.564729 update_engine[1978]: May 27 17:22:05.564729 update_engine[1978]: May 27 17:22:05.564729 update_engine[1978]: May 27 17:22:05.564729 update_engine[1978]: May 27 17:22:05.564729 update_engine[1978]: May 27 17:22:05.564729 update_engine[1978]: May 27 17:22:05.564729 update_engine[1978]: May 27 17:22:05.564729 update_engine[1978]: I20250527 17:22:05.563128 1978 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 17:22:05.570705 locksmithd[2020]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 May 27 17:22:05.571218 update_engine[1978]: I20250527 17:22:05.570894 1978 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 17:22:05.571595 update_engine[1978]: I20250527 17:22:05.571498 1978 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 17:22:05.584528 update_engine[1978]: E20250527 17:22:05.584444 1978 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 17:22:05.584646 update_engine[1978]: I20250527 17:22:05.584585 1978 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 May 27 17:22:05.680417 sshd[6040]: Accepted publickey for core from 139.178.68.195 port 52594 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:22:05.683405 sshd-session[6040]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:22:05.692742 systemd-logind[1976]: New session 24 of user core. May 27 17:22:05.697528 systemd[1]: Started session-24.scope - Session 24 of User core. 
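In the update_engine entries above, the Omaha request is posted to the literal host "disabled", so libcurl fails with "Could not resolve host: disabled" before any HTTP exchange happens; this is consistent with an update server deliberately set to a non-resolvable placeholder to switch off automatic update checks. A small sketch of the same resolution failure, using only the hostname from the log:

```python
import socket

# The update_engine errors above come from trying to reach the host "disabled";
# on most hosts the name does not resolve, so every Omaha transfer fails
# before any HTTP request is sent.
try:
    socket.getaddrinfo("disabled", 443)
    print("unexpectedly resolved")  # only if a local search domain happens to match
except socket.gaierror as err:
    print("resolution failed:", err)  # mirrors "Could not resolve host: disabled"
```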
May 27 17:22:05.959284 sshd[6042]: Connection closed by 139.178.68.195 port 52594 May 27 17:22:05.958545 sshd-session[6040]: pam_unix(sshd:session): session closed for user core May 27 17:22:05.962993 kubelet[3467]: E0527 17:22:05.960857 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79c644867b-j6zz7" podUID="07c2d869-b946-458d-a95a-719eae16bc54" May 27 17:22:05.972682 systemd[1]: sshd@23-172.31.16.30:22-139.178.68.195:52594.service: Deactivated successfully. May 27 17:22:05.982264 systemd[1]: session-24.scope: Deactivated successfully. May 27 17:22:05.989269 systemd-logind[1976]: Session 24 logged out. Waiting for processes to exit. May 27 17:22:05.994066 systemd-logind[1976]: Removed session 24. May 27 17:22:10.998433 systemd[1]: Started sshd@24-172.31.16.30:22-139.178.68.195:52608.service - OpenSSH per-connection server daemon (139.178.68.195:52608). May 27 17:22:11.208108 sshd[6055]: Accepted publickey for core from 139.178.68.195 port 52608 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:22:11.212509 sshd-session[6055]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:22:11.225963 systemd-logind[1976]: New session 25 of user core. May 27 17:22:11.232911 systemd[1]: Started session-25.scope - Session 25 of User core. May 27 17:22:11.515571 sshd[6057]: Connection closed by 139.178.68.195 port 52608 May 27 17:22:11.516557 sshd-session[6055]: pam_unix(sshd:session): session closed for user core May 27 17:22:11.525459 systemd-logind[1976]: Session 25 logged out. Waiting for processes to exit. May 27 17:22:11.525682 systemd[1]: sshd@24-172.31.16.30:22-139.178.68.195:52608.service: Deactivated successfully. May 27 17:22:11.532205 systemd[1]: session-25.scope: Deactivated successfully. May 27 17:22:11.541326 systemd-logind[1976]: Removed session 25. 
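The "Back-off pulling image" entries at 17:22:03 and 17:22:05 (and again at 17:22:18 below) are kubelet's image-pull back-off: after each ErrImagePull the retry delay grows before the next pull attempt. A rough sketch of that growth is below, assuming the upstream kubelet defaults of a 10-second base delay doubling to a 5-minute cap; the exact spacing in this log also depends on when each pod is next re-synced.

```python
# Hypothetical sketch of an exponential image-pull back-off with a 10s base
# and a 300s cap (assumed upstream kubelet defaults); each failed pull doubles
# the delay until the cap is reached.
BASE_SECONDS = 10
CAP_SECONDS = 300

delay = BASE_SECONDS
for attempt in range(1, 7):
    print(f"attempt {attempt}: wait ~{delay}s before the next pull")
    delay = min(delay * 2, CAP_SECONDS)
```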
May 27 17:22:15.537883 containerd[2002]: time="2025-05-27T17:22:15.537687996Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffa656aa7f5a9f1ba7a3f9395a1e3593352a8319f7a481b1325981873d3080e7\" id:\"07bd82ca681d37d392172217a7261d909a4d3afa3ee0d01f13a375209b035005\" pid:6080 exited_at:{seconds:1748366535 nanos:536926080}" May 27 17:22:15.561069 update_engine[1978]: I20250527 17:22:15.560281 1978 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 17:22:15.561069 update_engine[1978]: I20250527 17:22:15.560636 1978 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 17:22:15.561069 update_engine[1978]: I20250527 17:22:15.560997 1978 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 17:22:15.562801 update_engine[1978]: E20250527 17:22:15.562633 1978 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 17:22:15.563154 update_engine[1978]: I20250527 17:22:15.562983 1978 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 May 27 17:22:16.555728 systemd[1]: Started sshd@25-172.31.16.30:22-139.178.68.195:32926.service - OpenSSH per-connection server daemon (139.178.68.195:32926). May 27 17:22:16.754545 sshd[6090]: Accepted publickey for core from 139.178.68.195 port 32926 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:22:16.759303 sshd-session[6090]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:22:16.770398 systemd-logind[1976]: New session 26 of user core. May 27 17:22:16.782545 systemd[1]: Started session-26.scope - Session 26 of User core. May 27 17:22:17.095127 sshd[6092]: Connection closed by 139.178.68.195 port 32926 May 27 17:22:17.094402 sshd-session[6090]: pam_unix(sshd:session): session closed for user core May 27 17:22:17.104840 systemd[1]: sshd@25-172.31.16.30:22-139.178.68.195:32926.service: Deactivated successfully. May 27 17:22:17.112787 systemd[1]: session-26.scope: Deactivated successfully. May 27 17:22:17.116622 systemd-logind[1976]: Session 26 logged out. Waiting for processes to exit. May 27 17:22:17.121398 systemd-logind[1976]: Removed session 26. 
May 27 17:22:18.954947 kubelet[3467]: E0527 17:22:18.954797 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-5lg4z" podUID="bbacbf79-f2c8-4feb-9c01-4ff16f7741d3" May 27 17:22:18.957607 kubelet[3467]: E0527 17:22:18.957524 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79c644867b-j6zz7" podUID="07c2d869-b946-458d-a95a-719eae16bc54" May 27 17:22:22.131783 systemd[1]: Started sshd@26-172.31.16.30:22-139.178.68.195:32934.service - OpenSSH per-connection server daemon (139.178.68.195:32934). May 27 17:22:22.338829 sshd[6106]: Accepted publickey for core from 139.178.68.195 port 32934 ssh2: RSA SHA256:OTFFtY+IbXW7WNM8sH13bkFJhnpeEXX+zL3Wy28od4E May 27 17:22:22.342030 sshd-session[6106]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:22:22.351913 systemd-logind[1976]: New session 27 of user core. May 27 17:22:22.364671 systemd[1]: Started session-27.scope - Session 27 of User core. May 27 17:22:22.652931 sshd[6108]: Connection closed by 139.178.68.195 port 32934 May 27 17:22:22.653789 sshd-session[6106]: pam_unix(sshd:session): session closed for user core May 27 17:22:22.664435 systemd[1]: sshd@26-172.31.16.30:22-139.178.68.195:32934.service: Deactivated successfully. May 27 17:22:22.669220 systemd[1]: session-27.scope: Deactivated successfully. May 27 17:22:22.672791 systemd-logind[1976]: Session 27 logged out. Waiting for processes to exit. May 27 17:22:22.677563 systemd-logind[1976]: Removed session 27. May 27 17:22:25.558401 update_engine[1978]: I20250527 17:22:25.558298 1978 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 17:22:25.559072 update_engine[1978]: I20250527 17:22:25.558673 1978 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 17:22:25.559197 update_engine[1978]: I20250527 17:22:25.559109 1978 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 27 17:22:25.560803 update_engine[1978]: E20250527 17:22:25.560729 1978 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 17:22:25.560935 update_engine[1978]: I20250527 17:22:25.560833 1978 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 May 27 17:22:30.953099 containerd[2002]: time="2025-05-27T17:22:30.952887076Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:22:31.122850 containerd[2002]: time="2025-05-27T17:22:31.122761225Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:22:31.125114 containerd[2002]: time="2025-05-27T17:22:31.125043217Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:22:31.125357 containerd[2002]: time="2025-05-27T17:22:31.125077057Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:22:31.125422 kubelet[3467]: E0527 17:22:31.125379 3467 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:22:31.126135 kubelet[3467]: E0527 17:22:31.125450 3467 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:22:31.126135 kubelet[3467]: E0527 17:22:31.125651 3467 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vkjj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-5lg4z_calico-system(bbacbf79-f2c8-4feb-9c01-4ff16f7741d3): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:22:31.127006 kubelet[3467]: E0527 17:22:31.126891 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-5lg4z" podUID="bbacbf79-f2c8-4feb-9c01-4ff16f7741d3" May 27 17:22:33.954933 containerd[2002]: time="2025-05-27T17:22:33.954821599Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:22:34.181978 containerd[2002]: time="2025-05-27T17:22:34.181760800Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:22:34.184162 containerd[2002]: time="2025-05-27T17:22:34.184000721Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:22:34.184162 containerd[2002]: time="2025-05-27T17:22:34.184124693Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:22:34.184548 kubelet[3467]: E0527 17:22:34.184486 3467 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:22:34.185089 kubelet[3467]: E0527 17:22:34.184560 3467 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:22:34.185089 kubelet[3467]: E0527 17:22:34.184725 3467 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:5deabb2af4d14acab0df7cb3df816526,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-678pc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79c644867b-j6zz7_calico-system(07c2d869-b946-458d-a95a-719eae16bc54): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:22:34.187931 containerd[2002]: time="2025-05-27T17:22:34.187802141Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:22:34.437354 containerd[2002]: time="2025-05-27T17:22:34.437288406Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:22:34.439606 containerd[2002]: time="2025-05-27T17:22:34.439541022Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:22:34.439785 containerd[2002]: time="2025-05-27T17:22:34.439692162Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:22:34.439944 kubelet[3467]: E0527 17:22:34.439872 3467 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:22:34.440088 kubelet[3467]: E0527 17:22:34.439944 3467 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:22:34.440189 kubelet[3467]: E0527 17:22:34.440100 3467 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-678pc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79c644867b-j6zz7_calico-system(07c2d869-b946-458d-a95a-719eae16bc54): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:22:34.441459 kubelet[3467]: E0527 17:22:34.441357 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79c644867b-j6zz7" podUID="07c2d869-b946-458d-a95a-719eae16bc54" May 27 17:22:35.425525 containerd[2002]: time="2025-05-27T17:22:35.425420203Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bc57aca4ee67e30223eeeb61cb885001e8bda4a9be75ea156911daab274a2278\" id:\"5da6ff508523fc3be2e09992203ffbfc0c329af5688f5686b0b981db1f511e0f\" pid:6143 exited_at:{seconds:1748366555 nanos:424565935}" May 27 17:22:35.565882 update_engine[1978]: I20250527 17:22:35.565788 1978 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 17:22:35.566424 update_engine[1978]: I20250527 17:22:35.566145 1978 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 17:22:35.566656 update_engine[1978]: I20250527 17:22:35.566566 1978 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 17:22:35.567709 update_engine[1978]: E20250527 17:22:35.567655 1978 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 17:22:35.567806 update_engine[1978]: I20250527 17:22:35.567731 1978 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 27 17:22:35.567806 update_engine[1978]: I20250527 17:22:35.567754 1978 omaha_request_action.cc:617] Omaha request response: May 27 17:22:35.567912 update_engine[1978]: E20250527 17:22:35.567864 1978 omaha_request_action.cc:636] Omaha request network transfer failed. May 27 17:22:35.567912 update_engine[1978]: I20250527 17:22:35.567900 1978 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. May 27 17:22:35.568023 update_engine[1978]: I20250527 17:22:35.567915 1978 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 27 17:22:35.568023 update_engine[1978]: I20250527 17:22:35.567929 1978 update_attempter.cc:306] Processing Done. May 27 17:22:35.568023 update_engine[1978]: E20250527 17:22:35.567955 1978 update_attempter.cc:619] Update failed. May 27 17:22:35.568023 update_engine[1978]: I20250527 17:22:35.567972 1978 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse May 27 17:22:35.568023 update_engine[1978]: I20250527 17:22:35.567986 1978 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) May 27 17:22:35.568023 update_engine[1978]: I20250527 17:22:35.568001 1978 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
May 27 17:22:35.568552 update_engine[1978]: I20250527 17:22:35.568353 1978 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 27 17:22:35.568552 update_engine[1978]: I20250527 17:22:35.568421 1978 omaha_request_action.cc:271] Posting an Omaha request to disabled May 27 17:22:35.568552 update_engine[1978]: I20250527 17:22:35.568441 1978 omaha_request_action.cc:272] Request: May 27 17:22:35.568552 update_engine[1978]: May 27 17:22:35.568552 update_engine[1978]: May 27 17:22:35.568552 update_engine[1978]: May 27 17:22:35.568552 update_engine[1978]: May 27 17:22:35.568552 update_engine[1978]: May 27 17:22:35.568552 update_engine[1978]: May 27 17:22:35.568552 update_engine[1978]: I20250527 17:22:35.568457 1978 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 17:22:35.569029 locksmithd[2020]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 May 27 17:22:35.569544 update_engine[1978]: I20250527 17:22:35.568732 1978 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 17:22:35.569544 update_engine[1978]: I20250527 17:22:35.569334 1978 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 17:22:35.570304 update_engine[1978]: E20250527 17:22:35.570256 1978 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 17:22:35.570386 update_engine[1978]: I20250527 17:22:35.570339 1978 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 27 17:22:35.570386 update_engine[1978]: I20250527 17:22:35.570359 1978 omaha_request_action.cc:617] Omaha request response: May 27 17:22:35.570386 update_engine[1978]: I20250527 17:22:35.570375 1978 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 27 17:22:35.570593 update_engine[1978]: I20250527 17:22:35.570389 1978 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 27 17:22:35.570593 update_engine[1978]: I20250527 17:22:35.570403 1978 update_attempter.cc:306] Processing Done. May 27 17:22:35.570593 update_engine[1978]: I20250527 17:22:35.570418 1978 update_attempter.cc:310] Error event sent. May 27 17:22:35.570593 update_engine[1978]: I20250527 17:22:35.570438 1978 update_check_scheduler.cc:74] Next update check in 44m36s May 27 17:22:35.571026 locksmithd[2020]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 May 27 17:22:36.342929 systemd[1]: cri-containerd-358f33eb7236eb13b0be134b69f73c3b735a6ea8ec9375cb258b454b7b463ac6.scope: Deactivated successfully. May 27 17:22:36.343575 systemd[1]: cri-containerd-358f33eb7236eb13b0be134b69f73c3b735a6ea8ec9375cb258b454b7b463ac6.scope: Consumed 6.197s CPU time, 61.4M memory peak, 256K read from disk. 
May 27 17:22:36.351122 containerd[2002]: time="2025-05-27T17:22:36.351034231Z" level=info msg="TaskExit event in podsandbox handler container_id:\"358f33eb7236eb13b0be134b69f73c3b735a6ea8ec9375cb258b454b7b463ac6\" id:\"358f33eb7236eb13b0be134b69f73c3b735a6ea8ec9375cb258b454b7b463ac6\" pid:3299 exit_status:1 exited_at:{seconds:1748366556 nanos:347684647}" May 27 17:22:36.352404 containerd[2002]: time="2025-05-27T17:22:36.352261915Z" level=info msg="received exit event container_id:\"358f33eb7236eb13b0be134b69f73c3b735a6ea8ec9375cb258b454b7b463ac6\" id:\"358f33eb7236eb13b0be134b69f73c3b735a6ea8ec9375cb258b454b7b463ac6\" pid:3299 exit_status:1 exited_at:{seconds:1748366556 nanos:347684647}" May 27 17:22:36.409479 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-358f33eb7236eb13b0be134b69f73c3b735a6ea8ec9375cb258b454b7b463ac6-rootfs.mount: Deactivated successfully. May 27 17:22:36.426818 systemd[1]: cri-containerd-168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556.scope: Deactivated successfully. May 27 17:22:36.427447 systemd[1]: cri-containerd-168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556.scope: Consumed 21.468s CPU time, 110.2M memory peak. May 27 17:22:36.437571 containerd[2002]: time="2025-05-27T17:22:36.437326796Z" level=info msg="received exit event container_id:\"168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556\" id:\"168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556\" pid:3797 exit_status:1 exited_at:{seconds:1748366556 nanos:436512344}" May 27 17:22:36.438538 containerd[2002]: time="2025-05-27T17:22:36.437732180Z" level=info msg="TaskExit event in podsandbox handler container_id:\"168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556\" id:\"168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556\" pid:3797 exit_status:1 exited_at:{seconds:1748366556 nanos:436512344}" May 27 17:22:36.481808 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556-rootfs.mount: Deactivated successfully. May 27 17:22:36.763100 kubelet[3467]: I0527 17:22:36.762855 3467 scope.go:117] "RemoveContainer" containerID="168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556" May 27 17:22:36.768948 kubelet[3467]: I0527 17:22:36.768900 3467 scope.go:117] "RemoveContainer" containerID="358f33eb7236eb13b0be134b69f73c3b735a6ea8ec9375cb258b454b7b463ac6" May 27 17:22:36.775563 containerd[2002]: time="2025-05-27T17:22:36.775205325Z" level=info msg="CreateContainer within sandbox \"83dc96036e9dc5457340d2afc1212cc6b1f10d6e8741e90ecaec62348ea50197\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" May 27 17:22:36.776202 containerd[2002]: time="2025-05-27T17:22:36.776106297Z" level=info msg="CreateContainer within sandbox \"3163af1c05f0369bde36bdfc0082f7d5b81bfb4122d6c94d3abda12ab1bd0157\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" May 27 17:22:36.799260 containerd[2002]: time="2025-05-27T17:22:36.798530517Z" level=info msg="Container d46d5282b0c293e6e07346016773c4c553927f66b7381a50333999ca2183f175: CDI devices from CRI Config.CDIDevices: []" May 27 17:22:36.811758 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount911368624.mount: Deactivated successfully. 
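The TaskExit events above report exited_at as a Unix timestamp split into seconds and nanoseconds. Converting it back to UTC is a quick way to line these protobuf-style timestamps up with the journal's own clock; a minimal sketch, using the value copied from the first TaskExit event above:

```python
from datetime import datetime, timezone

# exited_at value copied from the first TaskExit event above.
exited_at = {"seconds": 1748366556, "nanos": 347684647}

ts = datetime.fromtimestamp(
    exited_at["seconds"] + exited_at["nanos"] / 1e9, tz=timezone.utc)
print(ts.isoformat())  # 2025-05-27T17:22:36.347... UTC, matching the 17:22:36 log entry
```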
May 27 17:22:36.814544 containerd[2002]: time="2025-05-27T17:22:36.814476814Z" level=info msg="Container ff3acd236fbcba41167787533f066e11504837a69061bdad6e7f8151af804eb5: CDI devices from CRI Config.CDIDevices: []" May 27 17:22:36.836411 containerd[2002]: time="2025-05-27T17:22:36.836339986Z" level=info msg="CreateContainer within sandbox \"3163af1c05f0369bde36bdfc0082f7d5b81bfb4122d6c94d3abda12ab1bd0157\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"d46d5282b0c293e6e07346016773c4c553927f66b7381a50333999ca2183f175\"" May 27 17:22:36.837592 containerd[2002]: time="2025-05-27T17:22:36.837515746Z" level=info msg="StartContainer for \"d46d5282b0c293e6e07346016773c4c553927f66b7381a50333999ca2183f175\"" May 27 17:22:36.843145 containerd[2002]: time="2025-05-27T17:22:36.843064966Z" level=info msg="connecting to shim d46d5282b0c293e6e07346016773c4c553927f66b7381a50333999ca2183f175" address="unix:///run/containerd/s/884857095a10245ee19bae6880350278d2ee7a1b8d6a4d48e32cb8e03c7b9311" protocol=ttrpc version=3 May 27 17:22:36.845597 containerd[2002]: time="2025-05-27T17:22:36.845490346Z" level=info msg="CreateContainer within sandbox \"83dc96036e9dc5457340d2afc1212cc6b1f10d6e8741e90ecaec62348ea50197\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"ff3acd236fbcba41167787533f066e11504837a69061bdad6e7f8151af804eb5\"" May 27 17:22:36.847265 containerd[2002]: time="2025-05-27T17:22:36.847188178Z" level=info msg="StartContainer for \"ff3acd236fbcba41167787533f066e11504837a69061bdad6e7f8151af804eb5\"" May 27 17:22:36.850907 containerd[2002]: time="2025-05-27T17:22:36.850744138Z" level=info msg="connecting to shim ff3acd236fbcba41167787533f066e11504837a69061bdad6e7f8151af804eb5" address="unix:///run/containerd/s/59beb8f324b51684c6071245244943b74c246f9f0e5df3bd4267649537e2acd6" protocol=ttrpc version=3 May 27 17:22:36.882890 systemd[1]: Started cri-containerd-d46d5282b0c293e6e07346016773c4c553927f66b7381a50333999ca2183f175.scope - libcontainer container d46d5282b0c293e6e07346016773c4c553927f66b7381a50333999ca2183f175. May 27 17:22:36.906520 systemd[1]: Started cri-containerd-ff3acd236fbcba41167787533f066e11504837a69061bdad6e7f8151af804eb5.scope - libcontainer container ff3acd236fbcba41167787533f066e11504837a69061bdad6e7f8151af804eb5. May 27 17:22:37.006841 containerd[2002]: time="2025-05-27T17:22:37.005560567Z" level=info msg="StartContainer for \"ff3acd236fbcba41167787533f066e11504837a69061bdad6e7f8151af804eb5\" returns successfully" May 27 17:22:37.008492 containerd[2002]: time="2025-05-27T17:22:37.008418499Z" level=info msg="StartContainer for \"d46d5282b0c293e6e07346016773c4c553927f66b7381a50333999ca2183f175\" returns successfully" May 27 17:22:37.251160 containerd[2002]: time="2025-05-27T17:22:37.251030552Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffa656aa7f5a9f1ba7a3f9395a1e3593352a8319f7a481b1325981873d3080e7\" id:\"20e24ad8c140ce230512a1c37337ad9d79c5f4b9287fc05d59ddb0f486ecbc89\" pid:6254 exited_at:{seconds:1748366557 nanos:250280312}" May 27 17:22:40.395191 kubelet[3467]: E0527 17:22:40.394682 3467 controller.go:195] "Failed to update lease" err="Put \"https://172.31.16.30:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-16-30?timeout=10s\": context deadline exceeded" May 27 17:22:41.826364 systemd[1]: cri-containerd-d3bf5fb135e0c35cfebd366c59ee5340390003ef10054061df5be39cc26f4747.scope: Deactivated successfully. 
May 27 17:22:41.827678 systemd[1]: cri-containerd-d3bf5fb135e0c35cfebd366c59ee5340390003ef10054061df5be39cc26f4747.scope: Consumed 5.284s CPU time, 19.9M memory peak, 64K read from disk.
May 27 17:22:41.832047 containerd[2002]: time="2025-05-27T17:22:41.831975710Z" level=info msg="received exit event container_id:\"d3bf5fb135e0c35cfebd366c59ee5340390003ef10054061df5be39cc26f4747\" id:\"d3bf5fb135e0c35cfebd366c59ee5340390003ef10054061df5be39cc26f4747\" pid:3307 exit_status:1 exited_at:{seconds:1748366561 nanos:831441290}"
May 27 17:22:41.832954 containerd[2002]: time="2025-05-27T17:22:41.832397918Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d3bf5fb135e0c35cfebd366c59ee5340390003ef10054061df5be39cc26f4747\" id:\"d3bf5fb135e0c35cfebd366c59ee5340390003ef10054061df5be39cc26f4747\" pid:3307 exit_status:1 exited_at:{seconds:1748366561 nanos:831441290}"
May 27 17:22:41.875752 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d3bf5fb135e0c35cfebd366c59ee5340390003ef10054061df5be39cc26f4747-rootfs.mount: Deactivated successfully.
May 27 17:22:42.803829 kubelet[3467]: I0527 17:22:42.803553 3467 scope.go:117] "RemoveContainer" containerID="d3bf5fb135e0c35cfebd366c59ee5340390003ef10054061df5be39cc26f4747"
May 27 17:22:42.808037 containerd[2002]: time="2025-05-27T17:22:42.807984639Z" level=info msg="CreateContainer within sandbox \"87ee1565ad0095e8c9a0d16e0c0f8c6850921064d0b5d66bf768fa1adef10c84\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
May 27 17:22:42.828260 containerd[2002]: time="2025-05-27T17:22:42.826532427Z" level=info msg="Container dad1baf027014371f8d7b6a45754ddb8f70c732e012b16a7421d25ae0b5d30ea: CDI devices from CRI Config.CDIDevices: []"
May 27 17:22:42.847705 containerd[2002]: time="2025-05-27T17:22:42.847567792Z" level=info msg="CreateContainer within sandbox \"87ee1565ad0095e8c9a0d16e0c0f8c6850921064d0b5d66bf768fa1adef10c84\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"dad1baf027014371f8d7b6a45754ddb8f70c732e012b16a7421d25ae0b5d30ea\""
May 27 17:22:42.848806 containerd[2002]: time="2025-05-27T17:22:42.848340304Z" level=info msg="StartContainer for \"dad1baf027014371f8d7b6a45754ddb8f70c732e012b16a7421d25ae0b5d30ea\""
May 27 17:22:42.850460 containerd[2002]: time="2025-05-27T17:22:42.850406116Z" level=info msg="connecting to shim dad1baf027014371f8d7b6a45754ddb8f70c732e012b16a7421d25ae0b5d30ea" address="unix:///run/containerd/s/ab5f99fc7ca4794a8b528b2235b7eb2ce93431916c775dcb31a23085b8c98b52" protocol=ttrpc version=3
May 27 17:22:42.894545 systemd[1]: Started cri-containerd-dad1baf027014371f8d7b6a45754ddb8f70c732e012b16a7421d25ae0b5d30ea.scope - libcontainer container dad1baf027014371f8d7b6a45754ddb8f70c732e012b16a7421d25ae0b5d30ea.
May 27 17:22:42.974856 containerd[2002]: time="2025-05-27T17:22:42.974799976Z" level=info msg="StartContainer for \"dad1baf027014371f8d7b6a45754ddb8f70c732e012b16a7421d25ae0b5d30ea\" returns successfully"
May 27 17:22:45.489569 containerd[2002]: time="2025-05-27T17:22:45.489443213Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ffa656aa7f5a9f1ba7a3f9395a1e3593352a8319f7a481b1325981873d3080e7\" id:\"34f1968b08361f8aff6293ddd1497b5cb16bd966ccd69da48f56f39954572b86\" pid:6346 exit_status:1 exited_at:{seconds:1748366565 nanos:488890013}"
May 27 17:22:45.952678 kubelet[3467]: E0527 17:22:45.952465 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-5lg4z" podUID="bbacbf79-f2c8-4feb-9c01-4ff16f7741d3"
May 27 17:22:48.482766 systemd[1]: cri-containerd-ff3acd236fbcba41167787533f066e11504837a69061bdad6e7f8151af804eb5.scope: Deactivated successfully.
May 27 17:22:48.486725 containerd[2002]: time="2025-05-27T17:22:48.484010420Z" level=info msg="received exit event container_id:\"ff3acd236fbcba41167787533f066e11504837a69061bdad6e7f8151af804eb5\" id:\"ff3acd236fbcba41167787533f066e11504837a69061bdad6e7f8151af804eb5\" pid:6211 exit_status:1 exited_at:{seconds:1748366568 nanos:483591164}"
May 27 17:22:48.486725 containerd[2002]: time="2025-05-27T17:22:48.484121624Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff3acd236fbcba41167787533f066e11504837a69061bdad6e7f8151af804eb5\" id:\"ff3acd236fbcba41167787533f066e11504837a69061bdad6e7f8151af804eb5\" pid:6211 exit_status:1 exited_at:{seconds:1748366568 nanos:483591164}"
May 27 17:22:48.524737 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ff3acd236fbcba41167787533f066e11504837a69061bdad6e7f8151af804eb5-rootfs.mount: Deactivated successfully.
May 27 17:22:48.831353 kubelet[3467]: I0527 17:22:48.831310 3467 scope.go:117] "RemoveContainer" containerID="168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556"
May 27 17:22:48.832012 kubelet[3467]: I0527 17:22:48.831842 3467 scope.go:117] "RemoveContainer" containerID="ff3acd236fbcba41167787533f066e11504837a69061bdad6e7f8151af804eb5"
May 27 17:22:48.832125 kubelet[3467]: E0527 17:22:48.832062 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-844669ff44-ztkbd_tigera-operator(5c044ac0-3299-4de0-bd19-31727d286198)\"" pod="tigera-operator/tigera-operator-844669ff44-ztkbd" podUID="5c044ac0-3299-4de0-bd19-31727d286198"
May 27 17:22:48.836263 containerd[2002]: time="2025-05-27T17:22:48.836178981Z" level=info msg="RemoveContainer for \"168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556\""
May 27 17:22:48.845779 containerd[2002]: time="2025-05-27T17:22:48.845698917Z" level=info msg="RemoveContainer for \"168079267f9ebb1b2a5316a0d443e58036fdae73425068d49b402d48c439d556\" returns successfully"
May 27 17:22:48.952885 kubelet[3467]: E0527 17:22:48.952752 3467 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79c644867b-j6zz7" podUID="07c2d869-b946-458d-a95a-719eae16bc54"
May 27 17:22:50.396159 kubelet[3467]: E0527 17:22:50.395940 3467 controller.go:195] "Failed to update lease" err="Put \"https://172.31.16.30:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-16-30?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"