May 27 02:47:45.099168 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd083] May 27 02:47:45.099213 kernel: Linux version 6.12.30-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Tue May 27 01:20:04 -00 2025 May 27 02:47:45.099237 kernel: KASLR disabled due to lack of seed May 27 02:47:45.099254 kernel: efi: EFI v2.7 by EDK II May 27 02:47:45.099270 kernel: efi: SMBIOS=0x7bed0000 SMBIOS 3.0=0x7beb0000 ACPI=0x786e0000 ACPI 2.0=0x786e0014 MEMATTR=0x7a733a98 MEMRESERVE=0x78551598 May 27 02:47:45.099285 kernel: secureboot: Secure boot disabled May 27 02:47:45.099303 kernel: ACPI: Early table checksum verification disabled May 27 02:47:45.099319 kernel: ACPI: RSDP 0x00000000786E0014 000024 (v02 AMAZON) May 27 02:47:45.099335 kernel: ACPI: XSDT 0x00000000786D00E8 000064 (v01 AMAZON AMZNFACP 00000001 01000013) May 27 02:47:45.099350 kernel: ACPI: FACP 0x00000000786B0000 000114 (v06 AMAZON AMZNFACP 00000001 AMZN 00000001) May 27 02:47:45.099371 kernel: ACPI: DSDT 0x0000000078640000 00159D (v02 AMAZON AMZNDSDT 00000001 INTL 20160527) May 27 02:47:45.099387 kernel: ACPI: FACS 0x0000000078630000 000040 May 27 02:47:45.099402 kernel: ACPI: APIC 0x00000000786C0000 000108 (v04 AMAZON AMZNAPIC 00000001 AMZN 00000001) May 27 02:47:45.099418 kernel: ACPI: SPCR 0x00000000786A0000 000050 (v02 AMAZON AMZNSPCR 00000001 AMZN 00000001) May 27 02:47:45.099436 kernel: ACPI: GTDT 0x0000000078690000 000060 (v02 AMAZON AMZNGTDT 00000001 AMZN 00000001) May 27 02:47:45.099453 kernel: ACPI: MCFG 0x0000000078680000 00003C (v02 AMAZON AMZNMCFG 00000001 AMZN 00000001) May 27 02:47:45.099474 kernel: ACPI: SLIT 0x0000000078670000 00002D (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001) May 27 02:47:45.099491 kernel: ACPI: IORT 0x0000000078660000 000078 (v01 AMAZON AMZNIORT 00000001 AMZN 00000001) May 27 02:47:45.099507 kernel: ACPI: PPTT 0x0000000078650000 0000EC (v01 AMAZON AMZNPPTT 00000001 AMZN 00000001) May 27 02:47:45.099524 kernel: ACPI: SPCR: console: uart,mmio,0x90a0000,115200 May 27 02:47:45.099540 kernel: earlycon: uart0 at MMIO 0x00000000090a0000 (options '115200') May 27 02:47:45.099558 kernel: printk: legacy bootconsole [uart0] enabled May 27 02:47:45.099575 kernel: ACPI: Use ACPI SPCR as default console: Yes May 27 02:47:45.099591 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000004b5ffffff] May 27 02:47:45.099608 kernel: NODE_DATA(0) allocated [mem 0x4b584cdc0-0x4b5853fff] May 27 02:47:45.099624 kernel: Zone ranges: May 27 02:47:45.099645 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] May 27 02:47:45.099661 kernel: DMA32 empty May 27 02:47:45.099677 kernel: Normal [mem 0x0000000100000000-0x00000004b5ffffff] May 27 02:47:45.099693 kernel: Device empty May 27 02:47:45.099709 kernel: Movable zone start for each node May 27 02:47:45.099725 kernel: Early memory node ranges May 27 02:47:45.099742 kernel: node 0: [mem 0x0000000040000000-0x000000007862ffff] May 27 02:47:45.099758 kernel: node 0: [mem 0x0000000078630000-0x000000007863ffff] May 27 02:47:45.099774 kernel: node 0: [mem 0x0000000078640000-0x00000000786effff] May 27 02:47:45.099790 kernel: node 0: [mem 0x00000000786f0000-0x000000007872ffff] May 27 02:47:45.099807 kernel: node 0: [mem 0x0000000078730000-0x000000007bbfffff] May 27 02:47:45.099823 kernel: node 0: [mem 0x000000007bc00000-0x000000007bfdffff] May 27 02:47:45.099845 kernel: node 0: [mem 0x000000007bfe0000-0x000000007fffffff] May 27 02:47:45.099862 
kernel: node 0: [mem 0x0000000400000000-0x00000004b5ffffff] May 27 02:47:45.099885 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000004b5ffffff] May 27 02:47:45.099902 kernel: On node 0, zone Normal: 8192 pages in unavailable ranges May 27 02:47:45.099920 kernel: psci: probing for conduit method from ACPI. May 27 02:47:45.099941 kernel: psci: PSCIv1.0 detected in firmware. May 27 02:47:45.099958 kernel: psci: Using standard PSCI v0.2 function IDs May 27 02:47:45.099975 kernel: psci: Trusted OS migration not required May 27 02:47:45.099994 kernel: psci: SMC Calling Convention v1.1 May 27 02:47:45.102110 kernel: percpu: Embedded 33 pages/cpu s98136 r8192 d28840 u135168 May 27 02:47:45.102142 kernel: pcpu-alloc: s98136 r8192 d28840 u135168 alloc=33*4096 May 27 02:47:45.102160 kernel: pcpu-alloc: [0] 0 [0] 1 May 27 02:47:45.102178 kernel: Detected PIPT I-cache on CPU0 May 27 02:47:45.102195 kernel: CPU features: detected: GIC system register CPU interface May 27 02:47:45.102212 kernel: CPU features: detected: Spectre-v2 May 27 02:47:45.102230 kernel: CPU features: detected: Spectre-v3a May 27 02:47:45.102247 kernel: CPU features: detected: Spectre-BHB May 27 02:47:45.102271 kernel: CPU features: detected: ARM erratum 1742098 May 27 02:47:45.102289 kernel: CPU features: detected: ARM errata 1165522, 1319367, or 1530923 May 27 02:47:45.102306 kernel: alternatives: applying boot alternatives May 27 02:47:45.102327 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=4c3f98aae7a61b3dcbab6391ba922461adab29dbcb79fd6e18169f93c5a4ab5a May 27 02:47:45.102346 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 27 02:47:45.102364 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) May 27 02:47:45.102382 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 27 02:47:45.102399 kernel: Fallback order for Node 0: 0 May 27 02:47:45.102417 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1007616 May 27 02:47:45.102434 kernel: Policy zone: Normal May 27 02:47:45.102455 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 27 02:47:45.102472 kernel: software IO TLB: area num 2. May 27 02:47:45.102505 kernel: software IO TLB: mapped [mem 0x000000007c000000-0x0000000080000000] (64MB) May 27 02:47:45.102531 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 May 27 02:47:45.102551 kernel: rcu: Preemptible hierarchical RCU implementation. May 27 02:47:45.102569 kernel: rcu: RCU event tracing is enabled. May 27 02:47:45.102588 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. May 27 02:47:45.102607 kernel: Trampoline variant of Tasks RCU enabled. May 27 02:47:45.102625 kernel: Tracing variant of Tasks RCU enabled. May 27 02:47:45.102643 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 27 02:47:45.102661 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 May 27 02:47:45.102679 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
May 27 02:47:45.102706 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 27 02:47:45.102725 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 May 27 02:47:45.102743 kernel: GICv3: 96 SPIs implemented May 27 02:47:45.102761 kernel: GICv3: 0 Extended SPIs implemented May 27 02:47:45.102778 kernel: Root IRQ handler: gic_handle_irq May 27 02:47:45.102797 kernel: GICv3: GICv3 features: 16 PPIs May 27 02:47:45.102814 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 May 27 02:47:45.102832 kernel: GICv3: CPU0: found redistributor 0 region 0:0x0000000010200000 May 27 02:47:45.102849 kernel: ITS [mem 0x10080000-0x1009ffff] May 27 02:47:45.102868 kernel: ITS@0x0000000010080000: allocated 8192 Devices @4000c0000 (indirect, esz 8, psz 64K, shr 1) May 27 02:47:45.102887 kernel: ITS@0x0000000010080000: allocated 8192 Interrupt Collections @4000d0000 (flat, esz 8, psz 64K, shr 1) May 27 02:47:45.102911 kernel: GICv3: using LPI property table @0x00000004000e0000 May 27 02:47:45.102928 kernel: ITS: Using hypervisor restricted LPI range [128] May 27 02:47:45.102946 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000004000f0000 May 27 02:47:45.102964 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. May 27 02:47:45.102982 kernel: arch_timer: cp15 timer(s) running at 83.33MHz (virt). May 27 02:47:45.103000 kernel: clocksource: arch_sys_counter: mask: 0x1ffffffffffffff max_cycles: 0x13381ebeec, max_idle_ns: 440795203145 ns May 27 02:47:45.106058 kernel: sched_clock: 57 bits at 83MHz, resolution 12ns, wraps every 4398046511100ns May 27 02:47:45.106100 kernel: Console: colour dummy device 80x25 May 27 02:47:45.106119 kernel: printk: legacy console [tty1] enabled May 27 02:47:45.106137 kernel: ACPI: Core revision 20240827 May 27 02:47:45.106155 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 166.66 BogoMIPS (lpj=83333) May 27 02:47:45.106184 kernel: pid_max: default: 32768 minimum: 301 May 27 02:47:45.106202 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima May 27 02:47:45.106219 kernel: landlock: Up and running. May 27 02:47:45.106237 kernel: SELinux: Initializing. May 27 02:47:45.106255 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 27 02:47:45.106273 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 27 02:47:45.106647 kernel: rcu: Hierarchical SRCU implementation. May 27 02:47:45.106683 kernel: rcu: Max phase no-delay instances is 400. May 27 02:47:45.106702 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level May 27 02:47:45.106728 kernel: Remapping and enabling EFI services. May 27 02:47:45.106745 kernel: smp: Bringing up secondary CPUs ... May 27 02:47:45.106763 kernel: Detected PIPT I-cache on CPU1 May 27 02:47:45.106781 kernel: GICv3: CPU1: found redistributor 1 region 0:0x0000000010220000 May 27 02:47:45.106798 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000400100000 May 27 02:47:45.106816 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd083] May 27 02:47:45.106834 kernel: smp: Brought up 1 node, 2 CPUs May 27 02:47:45.106852 kernel: SMP: Total of 2 processors activated. 
May 27 02:47:45.106870 kernel: CPU: All CPU(s) started at EL1 May 27 02:47:45.106911 kernel: CPU features: detected: 32-bit EL0 Support May 27 02:47:45.106944 kernel: CPU features: detected: 32-bit EL1 Support May 27 02:47:45.106963 kernel: CPU features: detected: CRC32 instructions May 27 02:47:45.106986 kernel: alternatives: applying system-wide alternatives May 27 02:47:45.107005 kernel: Memory: 3813536K/4030464K available (11072K kernel code, 2276K rwdata, 8936K rodata, 39424K init, 1034K bss, 212156K reserved, 0K cma-reserved) May 27 02:47:45.107199 kernel: devtmpfs: initialized May 27 02:47:45.107221 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 27 02:47:45.107240 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) May 27 02:47:45.107267 kernel: 17024 pages in range for non-PLT usage May 27 02:47:45.107288 kernel: 508544 pages in range for PLT usage May 27 02:47:45.107307 kernel: pinctrl core: initialized pinctrl subsystem May 27 02:47:45.107327 kernel: SMBIOS 3.0.0 present. May 27 02:47:45.107346 kernel: DMI: Amazon EC2 a1.large/, BIOS 1.0 11/1/2018 May 27 02:47:45.107365 kernel: DMI: Memory slots populated: 0/0 May 27 02:47:45.107384 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 27 02:47:45.107402 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations May 27 02:47:45.107421 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations May 27 02:47:45.107444 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations May 27 02:47:45.107463 kernel: audit: initializing netlink subsys (disabled) May 27 02:47:45.107482 kernel: audit: type=2000 audit(0.227:1): state=initialized audit_enabled=0 res=1 May 27 02:47:45.107501 kernel: thermal_sys: Registered thermal governor 'step_wise' May 27 02:47:45.107519 kernel: cpuidle: using governor menu May 27 02:47:45.107537 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
May 27 02:47:45.107556 kernel: ASID allocator initialised with 65536 entries May 27 02:47:45.107575 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 27 02:47:45.107597 kernel: Serial: AMBA PL011 UART driver May 27 02:47:45.107617 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages May 27 02:47:45.107635 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page May 27 02:47:45.107653 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages May 27 02:47:45.107672 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page May 27 02:47:45.107690 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 27 02:47:45.107709 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page May 27 02:47:45.107728 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages May 27 02:47:45.107746 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page May 27 02:47:45.107768 kernel: ACPI: Added _OSI(Module Device) May 27 02:47:45.107787 kernel: ACPI: Added _OSI(Processor Device) May 27 02:47:45.107806 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 27 02:47:45.107824 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 27 02:47:45.107842 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 27 02:47:45.107860 kernel: ACPI: Interpreter enabled May 27 02:47:45.107879 kernel: ACPI: Using GIC for interrupt routing May 27 02:47:45.107898 kernel: ACPI: MCFG table detected, 1 entries May 27 02:47:45.107916 kernel: ACPI: CPU0 has been hot-added May 27 02:47:45.107934 kernel: ACPI: CPU1 has been hot-added May 27 02:47:45.107957 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-0f]) May 27 02:47:45.110403 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 27 02:47:45.110623 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] May 27 02:47:45.110822 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] May 27 02:47:45.112059 kernel: acpi PNP0A08:00: ECAM area [mem 0x20000000-0x20ffffff] reserved by PNP0C02:00 May 27 02:47:45.114433 kernel: acpi PNP0A08:00: ECAM at [mem 0x20000000-0x20ffffff] for [bus 00-0f] May 27 02:47:45.114469 kernel: ACPI: Remapped I/O 0x000000001fff0000 to [io 0x0000-0xffff window] May 27 02:47:45.114499 kernel: acpiphp: Slot [1] registered May 27 02:47:45.114519 kernel: acpiphp: Slot [2] registered May 27 02:47:45.114537 kernel: acpiphp: Slot [3] registered May 27 02:47:45.114556 kernel: acpiphp: Slot [4] registered May 27 02:47:45.114575 kernel: acpiphp: Slot [5] registered May 27 02:47:45.114593 kernel: acpiphp: Slot [6] registered May 27 02:47:45.114612 kernel: acpiphp: Slot [7] registered May 27 02:47:45.114631 kernel: acpiphp: Slot [8] registered May 27 02:47:45.114649 kernel: acpiphp: Slot [9] registered May 27 02:47:45.114671 kernel: acpiphp: Slot [10] registered May 27 02:47:45.114690 kernel: acpiphp: Slot [11] registered May 27 02:47:45.114708 kernel: acpiphp: Slot [12] registered May 27 02:47:45.114726 kernel: acpiphp: Slot [13] registered May 27 02:47:45.114744 kernel: acpiphp: Slot [14] registered May 27 02:47:45.114762 kernel: acpiphp: Slot [15] registered May 27 02:47:45.114781 kernel: acpiphp: Slot [16] registered May 27 02:47:45.114799 kernel: acpiphp: Slot [17] registered May 27 02:47:45.114817 kernel: acpiphp: Slot [18] registered May 27 02:47:45.114835 kernel: acpiphp: Slot [19] registered May 27 02:47:45.114857 kernel: acpiphp: Slot [20] registered May 27 
02:47:45.114875 kernel: acpiphp: Slot [21] registered May 27 02:47:45.114893 kernel: acpiphp: Slot [22] registered May 27 02:47:45.114911 kernel: acpiphp: Slot [23] registered May 27 02:47:45.114929 kernel: acpiphp: Slot [24] registered May 27 02:47:45.114947 kernel: acpiphp: Slot [25] registered May 27 02:47:45.114966 kernel: acpiphp: Slot [26] registered May 27 02:47:45.114984 kernel: acpiphp: Slot [27] registered May 27 02:47:45.115002 kernel: acpiphp: Slot [28] registered May 27 02:47:45.115056 kernel: acpiphp: Slot [29] registered May 27 02:47:45.115076 kernel: acpiphp: Slot [30] registered May 27 02:47:45.115095 kernel: acpiphp: Slot [31] registered May 27 02:47:45.115113 kernel: PCI host bridge to bus 0000:00 May 27 02:47:45.115320 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xffffffff window] May 27 02:47:45.115509 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] May 27 02:47:45.115685 kernel: pci_bus 0000:00: root bus resource [mem 0x400000000000-0x407fffffffff window] May 27 02:47:45.115859 kernel: pci_bus 0000:00: root bus resource [bus 00-0f] May 27 02:47:45.119275 kernel: pci 0000:00:00.0: [1d0f:0200] type 00 class 0x060000 conventional PCI endpoint May 27 02:47:45.119549 kernel: pci 0000:00:01.0: [1d0f:8250] type 00 class 0x070003 conventional PCI endpoint May 27 02:47:45.119767 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80118000-0x80118fff] May 27 02:47:45.120002 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802 PCIe Root Complex Integrated Endpoint May 27 02:47:45.120248 kernel: pci 0000:00:04.0: BAR 0 [mem 0x80114000-0x80117fff] May 27 02:47:45.120452 kernel: pci 0000:00:04.0: PME# supported from D0 D1 D2 D3hot D3cold May 27 02:47:45.120720 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000 PCIe Root Complex Integrated Endpoint May 27 02:47:45.120928 kernel: pci 0000:00:05.0: BAR 0 [mem 0x80110000-0x80113fff] May 27 02:47:45.125281 kernel: pci 0000:00:05.0: BAR 2 [mem 0x80000000-0x800fffff pref] May 27 02:47:45.125516 kernel: pci 0000:00:05.0: BAR 4 [mem 0x80100000-0x8010ffff] May 27 02:47:45.125724 kernel: pci 0000:00:05.0: PME# supported from D0 D1 D2 D3hot D3cold May 27 02:47:45.125928 kernel: pci 0000:00:05.0: BAR 2 [mem 0x80000000-0x800fffff pref]: assigned May 27 02:47:45.126176 kernel: pci 0000:00:05.0: BAR 4 [mem 0x80100000-0x8010ffff]: assigned May 27 02:47:45.126397 kernel: pci 0000:00:04.0: BAR 0 [mem 0x80110000-0x80113fff]: assigned May 27 02:47:45.126597 kernel: pci 0000:00:05.0: BAR 0 [mem 0x80114000-0x80117fff]: assigned May 27 02:47:45.126802 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80118000-0x80118fff]: assigned May 27 02:47:45.127416 kernel: pci_bus 0000:00: resource 4 [mem 0x80000000-0xffffffff window] May 27 02:47:45.127615 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] May 27 02:47:45.127792 kernel: pci_bus 0000:00: resource 6 [mem 0x400000000000-0x407fffffffff window] May 27 02:47:45.127818 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 May 27 02:47:45.128618 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 May 27 02:47:45.128640 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 May 27 02:47:45.128682 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 May 27 02:47:45.128702 kernel: iommu: Default domain type: Translated May 27 02:47:45.128722 kernel: iommu: DMA domain TLB invalidation policy: strict mode May 27 02:47:45.128740 kernel: efivars: Registered efivars operations May 27 02:47:45.128759 kernel: vgaarb: loaded May 27 02:47:45.128777 
kernel: clocksource: Switched to clocksource arch_sys_counter May 27 02:47:45.128795 kernel: VFS: Disk quotas dquot_6.6.0 May 27 02:47:45.128822 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 27 02:47:45.128841 kernel: pnp: PnP ACPI init May 27 02:47:45.129157 kernel: system 00:00: [mem 0x20000000-0x2fffffff] could not be reserved May 27 02:47:45.129189 kernel: pnp: PnP ACPI: found 1 devices May 27 02:47:45.129208 kernel: NET: Registered PF_INET protocol family May 27 02:47:45.129227 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) May 27 02:47:45.129246 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) May 27 02:47:45.129264 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 27 02:47:45.129289 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) May 27 02:47:45.129308 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) May 27 02:47:45.129327 kernel: TCP: Hash tables configured (established 32768 bind 32768) May 27 02:47:45.129345 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) May 27 02:47:45.129364 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) May 27 02:47:45.129382 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 27 02:47:45.129400 kernel: PCI: CLS 0 bytes, default 64 May 27 02:47:45.129419 kernel: kvm [1]: HYP mode not available May 27 02:47:45.129437 kernel: Initialise system trusted keyrings May 27 02:47:45.129460 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 May 27 02:47:45.129478 kernel: Key type asymmetric registered May 27 02:47:45.129496 kernel: Asymmetric key parser 'x509' registered May 27 02:47:45.129514 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) May 27 02:47:45.129533 kernel: io scheduler mq-deadline registered May 27 02:47:45.129552 kernel: io scheduler kyber registered May 27 02:47:45.129570 kernel: io scheduler bfq registered May 27 02:47:45.129778 kernel: pl061_gpio ARMH0061:00: PL061 GPIO chip registered May 27 02:47:45.129807 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 May 27 02:47:45.129831 kernel: ACPI: button: Power Button [PWRB] May 27 02:47:45.129850 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input1 May 27 02:47:45.129868 kernel: ACPI: button: Sleep Button [SLPB] May 27 02:47:45.129887 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 27 02:47:45.129906 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 May 27 02:47:45.130175 kernel: serial 0000:00:01.0: enabling device (0010 -> 0012) May 27 02:47:45.130205 kernel: printk: legacy console [ttyS0] disabled May 27 02:47:45.130225 kernel: 0000:00:01.0: ttyS0 at MMIO 0x80118000 (irq = 14, base_baud = 115200) is a 16550A May 27 02:47:45.130250 kernel: printk: legacy console [ttyS0] enabled May 27 02:47:45.130270 kernel: printk: legacy bootconsole [uart0] disabled May 27 02:47:45.130288 kernel: thunder_xcv, ver 1.0 May 27 02:47:45.130306 kernel: thunder_bgx, ver 1.0 May 27 02:47:45.130325 kernel: nicpf, ver 1.0 May 27 02:47:45.130343 kernel: nicvf, ver 1.0 May 27 02:47:45.130569 kernel: rtc-efi rtc-efi.0: registered as rtc0 May 27 02:47:45.130762 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-05-27T02:47:44 UTC (1748314064) May 27 02:47:45.130787 kernel: hid: raw HID events driver (C) Jiri Kosina May 27 02:47:45.130812 
kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 3 (0,80000003) counters available May 27 02:47:45.130831 kernel: NET: Registered PF_INET6 protocol family May 27 02:47:45.130849 kernel: watchdog: NMI not fully supported May 27 02:47:45.130868 kernel: watchdog: Hard watchdog permanently disabled May 27 02:47:45.130886 kernel: Segment Routing with IPv6 May 27 02:47:45.130904 kernel: In-situ OAM (IOAM) with IPv6 May 27 02:47:45.130922 kernel: NET: Registered PF_PACKET protocol family May 27 02:47:45.130941 kernel: Key type dns_resolver registered May 27 02:47:45.130959 kernel: registered taskstats version 1 May 27 02:47:45.130983 kernel: Loading compiled-in X.509 certificates May 27 02:47:45.131002 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.30-flatcar: 6bbf5412ef1f8a32378a640b6d048f74e6d74df0' May 27 02:47:45.131052 kernel: Demotion targets for Node 0: null May 27 02:47:45.131073 kernel: Key type .fscrypt registered May 27 02:47:45.131091 kernel: Key type fscrypt-provisioning registered May 27 02:47:45.131109 kernel: ima: No TPM chip found, activating TPM-bypass! May 27 02:47:45.131128 kernel: ima: Allocated hash algorithm: sha1 May 27 02:47:45.131147 kernel: ima: No architecture policies found May 27 02:47:45.131165 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) May 27 02:47:45.131192 kernel: clk: Disabling unused clocks May 27 02:47:45.134498 kernel: PM: genpd: Disabling unused power domains May 27 02:47:45.134535 kernel: Warning: unable to open an initial console. May 27 02:47:45.134555 kernel: Freeing unused kernel memory: 39424K May 27 02:47:45.134574 kernel: Run /init as init process May 27 02:47:45.134594 kernel: with arguments: May 27 02:47:45.134613 kernel: /init May 27 02:47:45.134632 kernel: with environment: May 27 02:47:45.134652 kernel: HOME=/ May 27 02:47:45.134682 kernel: TERM=linux May 27 02:47:45.134702 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 27 02:47:45.134723 systemd[1]: Successfully made /usr/ read-only. May 27 02:47:45.134750 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 27 02:47:45.134772 systemd[1]: Detected virtualization amazon. May 27 02:47:45.134792 systemd[1]: Detected architecture arm64. May 27 02:47:45.134812 systemd[1]: Running in initrd. May 27 02:47:45.134836 systemd[1]: No hostname configured, using default hostname. May 27 02:47:45.134858 systemd[1]: Hostname set to . May 27 02:47:45.134878 systemd[1]: Initializing machine ID from VM UUID. May 27 02:47:45.134898 systemd[1]: Queued start job for default target initrd.target. May 27 02:47:45.134918 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 02:47:45.134939 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 02:47:45.134960 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 27 02:47:45.134981 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 27 02:47:45.135007 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... 
May 27 02:47:45.135128 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 27 02:47:45.135153 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 27 02:47:45.135174 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 27 02:47:45.135195 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 02:47:45.135216 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 27 02:47:45.135236 systemd[1]: Reached target paths.target - Path Units. May 27 02:47:45.135263 systemd[1]: Reached target slices.target - Slice Units. May 27 02:47:45.135286 systemd[1]: Reached target swap.target - Swaps. May 27 02:47:45.135306 systemd[1]: Reached target timers.target - Timer Units. May 27 02:47:45.135327 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 27 02:47:45.135348 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 27 02:47:45.135369 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 27 02:47:45.135389 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 27 02:47:45.135409 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 27 02:47:45.135430 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 27 02:47:45.135455 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 27 02:47:45.135475 systemd[1]: Reached target sockets.target - Socket Units. May 27 02:47:45.135495 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 27 02:47:45.135515 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 27 02:47:45.135536 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 27 02:47:45.135557 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). May 27 02:47:45.135577 systemd[1]: Starting systemd-fsck-usr.service... May 27 02:47:45.135597 systemd[1]: Starting systemd-journald.service - Journal Service... May 27 02:47:45.135622 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 27 02:47:45.135643 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 02:47:45.135663 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 27 02:47:45.135685 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 27 02:47:45.135705 systemd[1]: Finished systemd-fsck-usr.service. May 27 02:47:45.135730 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 27 02:47:45.135805 systemd-journald[258]: Collecting audit messages is disabled. May 27 02:47:45.135851 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 02:47:45.135879 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 02:47:45.135900 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. 
May 27 02:47:45.135934 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 27 02:47:45.135959 kernel: Bridge firewalling registered May 27 02:47:45.135980 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 27 02:47:45.136001 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 27 02:47:45.136053 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 27 02:47:45.136093 systemd-journald[258]: Journal started May 27 02:47:45.136133 systemd-journald[258]: Runtime Journal (/run/log/journal/ec258f6f6144f5b6ab245ebe614e9d55) is 8M, max 75.3M, 67.3M free. May 27 02:47:45.063094 systemd-modules-load[259]: Inserted module 'overlay' May 27 02:47:45.106391 systemd-modules-load[259]: Inserted module 'br_netfilter' May 27 02:47:45.143880 systemd[1]: Started systemd-journald.service - Journal Service. May 27 02:47:45.159137 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 27 02:47:45.166194 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 02:47:45.188722 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 27 02:47:45.192388 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 27 02:47:45.202759 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 27 02:47:45.208136 systemd-tmpfiles[290]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. May 27 02:47:45.224397 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 02:47:45.236797 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 27 02:47:45.263527 dracut-cmdline[297]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=4c3f98aae7a61b3dcbab6391ba922461adab29dbcb79fd6e18169f93c5a4ab5a May 27 02:47:45.335032 systemd-resolved[299]: Positive Trust Anchors: May 27 02:47:45.335066 systemd-resolved[299]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 27 02:47:45.335130 systemd-resolved[299]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 27 02:47:45.421052 kernel: SCSI subsystem initialized May 27 02:47:45.429050 kernel: Loading iSCSI transport class v2.0-870. 
May 27 02:47:45.441300 kernel: iscsi: registered transport (tcp) May 27 02:47:45.462452 kernel: iscsi: registered transport (qla4xxx) May 27 02:47:45.462526 kernel: QLogic iSCSI HBA Driver May 27 02:47:45.495178 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 27 02:47:45.527241 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 27 02:47:45.536024 systemd[1]: Reached target network-pre.target - Preparation for Network. May 27 02:47:45.598049 kernel: random: crng init done May 27 02:47:45.598332 systemd-resolved[299]: Defaulting to hostname 'linux'. May 27 02:47:45.601775 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 27 02:47:45.604098 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 27 02:47:45.627945 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 27 02:47:45.633943 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 27 02:47:45.720061 kernel: raid6: neonx8 gen() 6559 MB/s May 27 02:47:45.737043 kernel: raid6: neonx4 gen() 6597 MB/s May 27 02:47:45.754040 kernel: raid6: neonx2 gen() 5447 MB/s May 27 02:47:45.771039 kernel: raid6: neonx1 gen() 3924 MB/s May 27 02:47:45.788042 kernel: raid6: int64x8 gen() 3641 MB/s May 27 02:47:45.805043 kernel: raid6: int64x4 gen() 3700 MB/s May 27 02:47:45.822039 kernel: raid6: int64x2 gen() 3594 MB/s May 27 02:47:45.839856 kernel: raid6: int64x1 gen() 2745 MB/s May 27 02:47:45.839888 kernel: raid6: using algorithm neonx4 gen() 6597 MB/s May 27 02:47:45.857840 kernel: raid6: .... xor() 4571 MB/s, rmw enabled May 27 02:47:45.857882 kernel: raid6: using neon recovery algorithm May 27 02:47:45.865046 kernel: xor: measuring software checksum speed May 27 02:47:45.866043 kernel: 8regs : 11889 MB/sec May 27 02:47:45.868266 kernel: 32regs : 12007 MB/sec May 27 02:47:45.868298 kernel: arm64_neon : 9362 MB/sec May 27 02:47:45.868323 kernel: xor: using function: 32regs (12007 MB/sec) May 27 02:47:45.961056 kernel: Btrfs loaded, zoned=no, fsverity=no May 27 02:47:45.972053 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 27 02:47:45.978689 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 02:47:46.027449 systemd-udevd[507]: Using default interface naming scheme 'v255'. May 27 02:47:46.039263 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 02:47:46.046775 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 27 02:47:46.092535 dracut-pre-trigger[513]: rd.md=0: removing MD RAID activation May 27 02:47:46.136429 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 27 02:47:46.141990 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 27 02:47:46.280059 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 27 02:47:46.288396 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
May 27 02:47:46.432676 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 May 27 02:47:46.432765 kernel: ena 0000:00:05.0: enabling device (0010 -> 0012) May 27 02:47:46.440257 kernel: ena 0000:00:05.0: ENA device version: 0.10 May 27 02:47:46.440595 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1 May 27 02:47:46.456047 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem 80114000, mac addr 06:0f:74:6e:42:81 May 27 02:47:46.460420 (udev-worker)[552]: Network interface NamePolicy= disabled on kernel command line. May 27 02:47:46.467060 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 May 27 02:47:46.467126 kernel: nvme nvme0: pci function 0000:00:04.0 May 27 02:47:46.479050 kernel: nvme nvme0: 2/0/0 default/read/poll queues May 27 02:47:46.482265 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 02:47:46.484325 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 02:47:46.489165 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 27 02:47:46.498387 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 27 02:47:46.498439 kernel: GPT:9289727 != 16777215 May 27 02:47:46.498466 kernel: GPT:Alternate GPT header not at the end of the disk. May 27 02:47:46.498491 kernel: GPT:9289727 != 16777215 May 27 02:47:46.498514 kernel: GPT: Use GNU Parted to correct GPT errors. May 27 02:47:46.498538 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 02:47:46.500501 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 02:47:46.505055 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 27 02:47:46.544247 kernel: nvme nvme0: using unchecked data buffer May 27 02:47:46.564725 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 02:47:46.663295 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Amazon Elastic Block Store EFI-SYSTEM. May 27 02:47:46.736152 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Amazon Elastic Block Store ROOT. May 27 02:47:46.741230 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 27 02:47:46.764336 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Amazon Elastic Block Store USR-A. May 27 02:47:46.766983 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Amazon Elastic Block Store USR-A. May 27 02:47:46.807244 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. May 27 02:47:46.812566 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 27 02:47:46.815734 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 02:47:46.822255 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 27 02:47:46.827248 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 27 02:47:46.832966 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 27 02:47:46.860510 disk-uuid[687]: Primary Header is updated. May 27 02:47:46.860510 disk-uuid[687]: Secondary Entries is updated. May 27 02:47:46.860510 disk-uuid[687]: Secondary Header is updated. May 27 02:47:46.873070 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 02:47:46.878294 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
May 27 02:47:46.887058 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 02:47:47.890259 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 02:47:47.891572 disk-uuid[688]: The operation has completed successfully. May 27 02:47:48.071123 systemd[1]: disk-uuid.service: Deactivated successfully. May 27 02:47:48.071300 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 27 02:47:48.619309 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 27 02:47:48.656035 sh[954]: Success May 27 02:47:48.684211 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 27 02:47:48.684286 kernel: device-mapper: uevent: version 1.0.3 May 27 02:47:48.686131 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 27 02:47:48.700054 kernel: device-mapper: verity: sha256 using shash "sha256-ce" May 27 02:47:48.808389 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 27 02:47:48.815658 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 27 02:47:48.838450 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. May 27 02:47:48.858080 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 27 02:47:48.858145 kernel: BTRFS: device fsid 5c6341ea-4eb5-44b6-ac57-c4d29847e384 devid 1 transid 41 /dev/mapper/usr (254:0) scanned by mount (978) May 27 02:47:48.864331 kernel: BTRFS info (device dm-0): first mount of filesystem 5c6341ea-4eb5-44b6-ac57-c4d29847e384 May 27 02:47:48.864396 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm May 27 02:47:48.864422 kernel: BTRFS info (device dm-0): using free-space-tree May 27 02:47:48.891784 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 27 02:47:48.895431 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 27 02:47:48.899672 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 27 02:47:48.903227 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 27 02:47:48.913253 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 27 02:47:48.963074 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 (259:5) scanned by mount (1011) May 27 02:47:48.966900 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem eabe2c18-04ac-4289-8962-26387aada3f9 May 27 02:47:48.966962 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm May 27 02:47:48.968206 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 27 02:47:48.992083 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem eabe2c18-04ac-4289-8962-26387aada3f9 May 27 02:47:48.994499 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 27 02:47:49.000945 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 27 02:47:49.089906 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 27 02:47:49.096049 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
May 27 02:47:49.164211 systemd-networkd[1147]: lo: Link UP May 27 02:47:49.164232 systemd-networkd[1147]: lo: Gained carrier May 27 02:47:49.169492 systemd-networkd[1147]: Enumeration completed May 27 02:47:49.171192 systemd[1]: Started systemd-networkd.service - Network Configuration. May 27 02:47:49.171608 systemd-networkd[1147]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 02:47:49.171615 systemd-networkd[1147]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 02:47:49.183470 systemd[1]: Reached target network.target - Network. May 27 02:47:49.187857 systemd-networkd[1147]: eth0: Link UP May 27 02:47:49.189069 systemd-networkd[1147]: eth0: Gained carrier May 27 02:47:49.189226 systemd-networkd[1147]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 02:47:49.216118 systemd-networkd[1147]: eth0: DHCPv4 address 172.31.28.205/20, gateway 172.31.16.1 acquired from 172.31.16.1 May 27 02:47:49.372219 ignition[1074]: Ignition 2.21.0 May 27 02:47:49.372779 ignition[1074]: Stage: fetch-offline May 27 02:47:49.373224 ignition[1074]: no configs at "/usr/lib/ignition/base.d" May 27 02:47:49.373246 ignition[1074]: no config dir at "/usr/lib/ignition/base.platform.d/aws" May 27 02:47:49.373726 ignition[1074]: Ignition finished successfully May 27 02:47:49.383371 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 27 02:47:49.389217 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... May 27 02:47:49.430134 ignition[1158]: Ignition 2.21.0 May 27 02:47:49.430605 ignition[1158]: Stage: fetch May 27 02:47:49.431130 ignition[1158]: no configs at "/usr/lib/ignition/base.d" May 27 02:47:49.431153 ignition[1158]: no config dir at "/usr/lib/ignition/base.platform.d/aws" May 27 02:47:49.431314 ignition[1158]: PUT http://169.254.169.254/latest/api/token: attempt #1 May 27 02:47:49.448905 ignition[1158]: PUT result: OK May 27 02:47:49.452225 ignition[1158]: parsed url from cmdline: "" May 27 02:47:49.452365 ignition[1158]: no config URL provided May 27 02:47:49.453754 ignition[1158]: reading system config file "/usr/lib/ignition/user.ign" May 27 02:47:49.453784 ignition[1158]: no config at "/usr/lib/ignition/user.ign" May 27 02:47:49.453824 ignition[1158]: PUT http://169.254.169.254/latest/api/token: attempt #1 May 27 02:47:49.459688 ignition[1158]: PUT result: OK May 27 02:47:49.459771 ignition[1158]: GET http://169.254.169.254/2019-10-01/user-data: attempt #1 May 27 02:47:49.463835 ignition[1158]: GET result: OK May 27 02:47:49.463999 ignition[1158]: parsing config with SHA512: 2b1dee3b3ea56047017bf8269c034e43c5c0aa257bfe0d0cef581ad8cf9ec4eb406080b54370636d1735137ceeb005f450a02e23e2e63d85919d2519eba4b115 May 27 02:47:49.476563 unknown[1158]: fetched base config from "system" May 27 02:47:49.476591 unknown[1158]: fetched base config from "system" May 27 02:47:49.477571 ignition[1158]: fetch: fetch complete May 27 02:47:49.476605 unknown[1158]: fetched user config from "aws" May 27 02:47:49.477584 ignition[1158]: fetch: fetch passed May 27 02:47:49.483303 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 27 02:47:49.477672 ignition[1158]: Ignition finished successfully May 27 02:47:49.494222 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
May 27 02:47:49.549732 ignition[1164]: Ignition 2.21.0 May 27 02:47:49.549771 ignition[1164]: Stage: kargs May 27 02:47:49.551487 ignition[1164]: no configs at "/usr/lib/ignition/base.d" May 27 02:47:49.551586 ignition[1164]: no config dir at "/usr/lib/ignition/base.platform.d/aws" May 27 02:47:49.552695 ignition[1164]: PUT http://169.254.169.254/latest/api/token: attempt #1 May 27 02:47:49.559172 ignition[1164]: PUT result: OK May 27 02:47:49.563776 ignition[1164]: kargs: kargs passed May 27 02:47:49.563923 ignition[1164]: Ignition finished successfully May 27 02:47:49.571069 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 27 02:47:49.576693 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 27 02:47:49.627599 ignition[1170]: Ignition 2.21.0 May 27 02:47:49.628144 ignition[1170]: Stage: disks May 27 02:47:49.628665 ignition[1170]: no configs at "/usr/lib/ignition/base.d" May 27 02:47:49.628689 ignition[1170]: no config dir at "/usr/lib/ignition/base.platform.d/aws" May 27 02:47:49.628836 ignition[1170]: PUT http://169.254.169.254/latest/api/token: attempt #1 May 27 02:47:49.633662 ignition[1170]: PUT result: OK May 27 02:47:49.644303 ignition[1170]: disks: disks passed May 27 02:47:49.644597 ignition[1170]: Ignition finished successfully May 27 02:47:49.651089 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 27 02:47:49.654974 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 27 02:47:49.659047 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 27 02:47:49.661426 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 02:47:49.663389 systemd[1]: Reached target sysinit.target - System Initialization. May 27 02:47:49.665378 systemd[1]: Reached target basic.target - Basic System. May 27 02:47:49.671450 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 27 02:47:49.730446 systemd-fsck[1179]: ROOT: clean, 15/553520 files, 52789/553472 blocks May 27 02:47:49.738058 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 27 02:47:49.745092 systemd[1]: Mounting sysroot.mount - /sysroot... May 27 02:47:49.902056 kernel: EXT4-fs (nvme0n1p9): mounted filesystem 5656cec4-efbd-4a2d-be98-2263e6ae16bd r/w with ordered data mode. Quota mode: none. May 27 02:47:49.903164 systemd[1]: Mounted sysroot.mount - /sysroot. May 27 02:47:49.906507 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 27 02:47:49.911783 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 02:47:49.923383 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 27 02:47:49.925571 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. May 27 02:47:49.925674 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 27 02:47:49.925725 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 27 02:47:49.965092 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 (259:5) scanned by mount (1198) May 27 02:47:49.971432 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. 
May 27 02:47:49.978729 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem eabe2c18-04ac-4289-8962-26387aada3f9 May 27 02:47:49.978766 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm May 27 02:47:49.978792 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 27 02:47:49.980155 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 27 02:47:49.999051 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 27 02:47:50.145595 initrd-setup-root[1222]: cut: /sysroot/etc/passwd: No such file or directory May 27 02:47:50.155088 initrd-setup-root[1229]: cut: /sysroot/etc/group: No such file or directory May 27 02:47:50.163842 initrd-setup-root[1236]: cut: /sysroot/etc/shadow: No such file or directory May 27 02:47:50.172191 initrd-setup-root[1243]: cut: /sysroot/etc/gshadow: No such file or directory May 27 02:47:50.262148 systemd-networkd[1147]: eth0: Gained IPv6LL May 27 02:47:50.358329 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 27 02:47:50.364237 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 27 02:47:50.370274 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 27 02:47:50.394396 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 27 02:47:50.396872 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem eabe2c18-04ac-4289-8962-26387aada3f9 May 27 02:47:50.428446 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 27 02:47:50.442285 ignition[1311]: INFO : Ignition 2.21.0 May 27 02:47:50.442285 ignition[1311]: INFO : Stage: mount May 27 02:47:50.445473 ignition[1311]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 02:47:50.445473 ignition[1311]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" May 27 02:47:50.445473 ignition[1311]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 May 27 02:47:50.452730 ignition[1311]: INFO : PUT result: OK May 27 02:47:50.456786 ignition[1311]: INFO : mount: mount passed May 27 02:47:50.459709 ignition[1311]: INFO : Ignition finished successfully May 27 02:47:50.462087 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 27 02:47:50.466593 systemd[1]: Starting ignition-files.service - Ignition (files)... May 27 02:47:50.906485 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 02:47:50.952054 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 (259:5) scanned by mount (1322) May 27 02:47:50.956081 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem eabe2c18-04ac-4289-8962-26387aada3f9 May 27 02:47:50.956130 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm May 27 02:47:50.956157 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 27 02:47:50.979543 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 27 02:47:51.024388 ignition[1340]: INFO : Ignition 2.21.0 May 27 02:47:51.024388 ignition[1340]: INFO : Stage: files May 27 02:47:51.027639 ignition[1340]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 02:47:51.027639 ignition[1340]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" May 27 02:47:51.027639 ignition[1340]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 May 27 02:47:51.034956 ignition[1340]: INFO : PUT result: OK May 27 02:47:51.039312 ignition[1340]: DEBUG : files: compiled without relabeling support, skipping May 27 02:47:51.043175 ignition[1340]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 27 02:47:51.043175 ignition[1340]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 27 02:47:51.051969 ignition[1340]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 27 02:47:51.055243 ignition[1340]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 27 02:47:51.059039 unknown[1340]: wrote ssh authorized keys file for user: core May 27 02:47:51.061561 ignition[1340]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 27 02:47:51.065560 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" May 27 02:47:51.073828 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1 May 27 02:47:51.170708 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 27 02:47:51.346998 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" May 27 02:47:51.351064 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 27 02:47:51.351064 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 27 02:47:51.351064 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 27 02:47:51.361191 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 27 02:47:51.361191 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 02:47:51.361191 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 02:47:51.371067 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 02:47:51.371067 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 02:47:51.382248 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 27 02:47:51.386082 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 27 02:47:51.386082 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" May 27 02:47:51.396038 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" May 27 02:47:51.401223 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" May 27 02:47:51.401223 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1 May 27 02:47:52.103277 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 27 02:47:52.468788 ignition[1340]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" May 27 02:47:52.468788 ignition[1340]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 27 02:47:52.475548 ignition[1340]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 02:47:52.483985 ignition[1340]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 02:47:52.483985 ignition[1340]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 27 02:47:52.483985 ignition[1340]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 27 02:47:52.492777 ignition[1340]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 27 02:47:52.492777 ignition[1340]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 27 02:47:52.492777 ignition[1340]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 27 02:47:52.492777 ignition[1340]: INFO : files: files passed May 27 02:47:52.492777 ignition[1340]: INFO : Ignition finished successfully May 27 02:47:52.506895 systemd[1]: Finished ignition-files.service - Ignition (files). May 27 02:47:52.513266 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 27 02:47:52.520280 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 27 02:47:52.538738 systemd[1]: ignition-quench.service: Deactivated successfully. May 27 02:47:52.541178 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 27 02:47:52.555790 initrd-setup-root-after-ignition[1369]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 27 02:47:52.555790 initrd-setup-root-after-ignition[1369]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 27 02:47:52.564554 initrd-setup-root-after-ignition[1373]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 27 02:47:52.572115 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 27 02:47:52.577148 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 27 02:47:52.584413 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... 
May 27 02:47:52.680168 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 27 02:47:52.681352 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 27 02:47:52.687466 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. May 27 02:47:52.691795 systemd[1]: Reached target initrd.target - Initrd Default Target. May 27 02:47:52.695842 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 27 02:47:52.698818 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 27 02:47:52.739773 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 27 02:47:52.746866 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 27 02:47:52.784627 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 27 02:47:52.789574 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 02:47:52.791424 systemd[1]: Stopped target timers.target - Timer Units. May 27 02:47:52.791762 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 27 02:47:52.791990 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 27 02:47:52.793033 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 27 02:47:52.793336 systemd[1]: Stopped target basic.target - Basic System. May 27 02:47:52.793635 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 27 02:47:52.793943 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 27 02:47:52.794561 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 27 02:47:52.794882 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. May 27 02:47:52.795499 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 27 02:47:52.795805 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 27 02:47:52.796180 systemd[1]: Stopped target sysinit.target - System Initialization. May 27 02:47:52.796443 systemd[1]: Stopped target local-fs.target - Local File Systems. May 27 02:47:52.796767 systemd[1]: Stopped target swap.target - Swaps. May 27 02:47:52.797023 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 27 02:47:52.797237 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 27 02:47:52.797981 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 27 02:47:52.798637 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 02:47:52.798870 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 27 02:47:52.821107 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 02:47:52.823632 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 27 02:47:52.823853 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 27 02:47:52.824874 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 27 02:47:52.825531 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 27 02:47:52.839226 systemd[1]: ignition-files.service: Deactivated successfully. May 27 02:47:52.839514 systemd[1]: Stopped ignition-files.service - Ignition (files). 
May 27 02:47:52.847255 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 27 02:47:52.861708 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 27 02:47:52.863511 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 27 02:47:52.863761 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 27 02:47:52.866287 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 27 02:47:52.866503 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 27 02:47:52.893219 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 27 02:47:52.895407 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 27 02:47:52.947606 ignition[1393]: INFO : Ignition 2.21.0 May 27 02:47:52.950327 ignition[1393]: INFO : Stage: umount May 27 02:47:52.950327 ignition[1393]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 02:47:52.950327 ignition[1393]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" May 27 02:47:52.950327 ignition[1393]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 May 27 02:47:52.959540 ignition[1393]: INFO : PUT result: OK May 27 02:47:52.969834 ignition[1393]: INFO : umount: umount passed May 27 02:47:52.971934 ignition[1393]: INFO : Ignition finished successfully May 27 02:47:52.977318 systemd[1]: ignition-mount.service: Deactivated successfully. May 27 02:47:52.978007 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 27 02:47:52.986532 systemd[1]: ignition-disks.service: Deactivated successfully. May 27 02:47:52.986988 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 27 02:47:52.993620 systemd[1]: ignition-kargs.service: Deactivated successfully. May 27 02:47:52.993731 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 27 02:47:52.998984 systemd[1]: ignition-fetch.service: Deactivated successfully. May 27 02:47:53.002006 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 27 02:47:53.008581 systemd[1]: Stopped target network.target - Network. May 27 02:47:53.012026 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 27 02:47:53.012152 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 27 02:47:53.016321 systemd[1]: Stopped target paths.target - Path Units. May 27 02:47:53.018391 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 27 02:47:53.022271 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 02:47:53.025598 systemd[1]: Stopped target slices.target - Slice Units. May 27 02:47:53.027563 systemd[1]: Stopped target sockets.target - Socket Units. May 27 02:47:53.034479 systemd[1]: iscsid.socket: Deactivated successfully. May 27 02:47:53.034556 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 27 02:47:53.039486 systemd[1]: iscsiuio.socket: Deactivated successfully. May 27 02:47:53.039553 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 27 02:47:53.043251 systemd[1]: ignition-setup.service: Deactivated successfully. May 27 02:47:53.043350 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 27 02:47:53.047891 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 27 02:47:53.047974 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. 
May 27 02:47:53.050744 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 27 02:47:53.054146 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 27 02:47:53.063001 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 27 02:47:53.066618 systemd[1]: systemd-resolved.service: Deactivated successfully. May 27 02:47:53.066838 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 27 02:47:53.078866 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 27 02:47:53.080391 systemd[1]: systemd-networkd.service: Deactivated successfully. May 27 02:47:53.083658 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 27 02:47:53.094772 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 27 02:47:53.095412 systemd[1]: sysroot-boot.service: Deactivated successfully. May 27 02:47:53.095645 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 27 02:47:53.106553 systemd[1]: Stopped target network-pre.target - Preparation for Network. May 27 02:47:53.109601 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 27 02:47:53.109678 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 27 02:47:53.114804 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 27 02:47:53.114918 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 27 02:47:53.121027 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 27 02:47:53.127548 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 27 02:47:53.127679 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 27 02:47:53.137961 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 27 02:47:53.138087 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 27 02:47:53.143868 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 27 02:47:53.143961 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 27 02:47:53.150279 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 27 02:47:53.150365 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 02:47:53.159342 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 02:47:53.168736 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 27 02:47:53.168877 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 27 02:47:53.194483 systemd[1]: systemd-udevd.service: Deactivated successfully. May 27 02:47:53.194764 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 02:47:53.201363 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 27 02:47:53.201468 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 27 02:47:53.207871 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 27 02:47:53.207941 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 27 02:47:53.213632 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 27 02:47:53.213743 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 27 02:47:53.221452 systemd[1]: dracut-cmdline.service: Deactivated successfully. 
May 27 02:47:53.221553 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 27 02:47:53.225504 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 27 02:47:53.225921 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 27 02:47:53.233509 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 27 02:47:53.245825 systemd[1]: systemd-network-generator.service: Deactivated successfully. May 27 02:47:53.246151 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. May 27 02:47:53.255363 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 27 02:47:53.255467 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 02:47:53.268787 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. May 27 02:47:53.268878 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 02:47:53.276839 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 27 02:47:53.276944 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 27 02:47:53.286212 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 02:47:53.286329 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 02:47:53.299107 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. May 27 02:47:53.299214 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. May 27 02:47:53.299295 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. May 27 02:47:53.299379 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 27 02:47:53.300236 systemd[1]: network-cleanup.service: Deactivated successfully. May 27 02:47:53.305664 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 27 02:47:53.308287 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 27 02:47:53.308468 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 27 02:47:53.316900 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 27 02:47:53.328950 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 27 02:47:53.373351 systemd[1]: Switching root. May 27 02:47:53.413083 systemd-journald[258]: Journal stopped May 27 02:47:55.324753 systemd-journald[258]: Received SIGTERM from PID 1 (systemd). 
May 27 02:47:55.324876 kernel: SELinux: policy capability network_peer_controls=1 May 27 02:47:55.324925 kernel: SELinux: policy capability open_perms=1 May 27 02:47:55.324955 kernel: SELinux: policy capability extended_socket_class=1 May 27 02:47:55.324990 kernel: SELinux: policy capability always_check_network=0 May 27 02:47:55.325044 kernel: SELinux: policy capability cgroup_seclabel=1 May 27 02:47:55.326079 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 27 02:47:55.326109 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 27 02:47:55.326138 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 27 02:47:55.326168 kernel: SELinux: policy capability userspace_initial_context=0 May 27 02:47:55.326195 kernel: audit: type=1403 audit(1748314073.662:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 27 02:47:55.326233 systemd[1]: Successfully loaded SELinux policy in 49.641ms. May 27 02:47:55.326308 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 24.123ms. May 27 02:47:55.326346 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 27 02:47:55.326376 systemd[1]: Detected virtualization amazon. May 27 02:47:55.326406 systemd[1]: Detected architecture arm64. May 27 02:47:55.326436 systemd[1]: Detected first boot. May 27 02:47:55.326468 systemd[1]: Initializing machine ID from VM UUID. May 27 02:47:55.326497 kernel: NET: Registered PF_VSOCK protocol family May 27 02:47:55.326525 zram_generator::config[1440]: No configuration found. May 27 02:47:55.326565 systemd[1]: Populated /etc with preset unit settings. May 27 02:47:55.326601 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 27 02:47:55.326634 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 27 02:47:55.326665 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 27 02:47:55.326696 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 27 02:47:55.326725 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 27 02:47:55.326757 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 27 02:47:55.326807 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 27 02:47:55.326842 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 27 02:47:55.326873 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 27 02:47:55.326910 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 27 02:47:55.326955 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 27 02:47:55.326994 systemd[1]: Created slice user.slice - User and Session Slice. May 27 02:47:55.327053 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 02:47:55.327086 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 02:47:55.327118 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
May 27 02:47:55.327148 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 27 02:47:55.327180 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 27 02:47:55.327218 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 27 02:47:55.327251 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... May 27 02:47:55.327280 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 02:47:55.327311 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 27 02:47:55.327341 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 27 02:47:55.327374 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 27 02:47:55.327416 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 27 02:47:55.327448 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 27 02:47:55.327483 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 02:47:55.327516 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 27 02:47:55.327546 systemd[1]: Reached target slices.target - Slice Units. May 27 02:47:55.327576 systemd[1]: Reached target swap.target - Swaps. May 27 02:47:55.327606 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 27 02:47:55.327639 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 27 02:47:55.327668 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 27 02:47:55.327698 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 27 02:47:55.327731 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 27 02:47:55.327776 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 27 02:47:55.327808 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 27 02:47:55.327837 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 27 02:47:55.327866 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 27 02:47:55.327894 systemd[1]: Mounting media.mount - External Media Directory... May 27 02:47:55.327925 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 27 02:47:55.327956 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 27 02:47:55.327985 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 27 02:47:55.328050 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 27 02:47:55.328093 systemd[1]: Reached target machines.target - Containers. May 27 02:47:55.328131 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... May 27 02:47:55.328161 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 02:47:55.328215 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 27 02:47:55.328245 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 27 02:47:55.328274 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
May 27 02:47:55.328302 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 27 02:47:55.328331 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 27 02:47:55.328366 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 27 02:47:55.328395 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 27 02:47:55.328424 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 27 02:47:55.328453 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 27 02:47:55.328494 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 27 02:47:55.328526 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 27 02:47:55.328558 systemd[1]: Stopped systemd-fsck-usr.service. May 27 02:47:55.328605 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 02:47:55.328639 systemd[1]: Starting systemd-journald.service - Journal Service... May 27 02:47:55.328674 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 27 02:47:55.328704 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 27 02:47:55.328734 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 27 02:47:55.328763 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 27 02:47:55.328795 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 27 02:47:55.328836 systemd[1]: verity-setup.service: Deactivated successfully. May 27 02:47:55.328867 systemd[1]: Stopped verity-setup.service. May 27 02:47:55.328896 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 27 02:47:55.328925 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 27 02:47:55.328953 systemd[1]: Mounted media.mount - External Media Directory. May 27 02:47:55.328985 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 27 02:47:55.332119 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 27 02:47:55.332185 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 27 02:47:55.332222 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 27 02:47:55.332252 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 27 02:47:55.332285 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 27 02:47:55.332316 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 27 02:47:55.332345 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 27 02:47:55.332376 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 27 02:47:55.332415 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 27 02:47:55.332470 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 27 02:47:55.332505 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 27 02:47:55.332535 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
May 27 02:47:55.332568 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 27 02:47:55.332615 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 27 02:47:55.332651 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 27 02:47:55.332683 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 27 02:47:55.332715 systemd[1]: Reached target network-pre.target - Preparation for Network. May 27 02:47:55.332753 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 27 02:47:55.332784 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 02:47:55.332813 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 27 02:47:55.332843 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 27 02:47:55.332875 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 02:47:55.332960 systemd-journald[1523]: Collecting audit messages is disabled. May 27 02:47:55.333037 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 27 02:47:55.333535 systemd-journald[1523]: Journal started May 27 02:47:55.333587 systemd-journald[1523]: Runtime Journal (/run/log/journal/ec258f6f6144f5b6ab245ebe614e9d55) is 8M, max 75.3M, 67.3M free. May 27 02:47:54.712056 systemd[1]: Queued start job for default target multi-user.target. May 27 02:47:55.349890 kernel: fuse: init (API version 7.41) May 27 02:47:55.349935 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 27 02:47:54.727289 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. May 27 02:47:54.728114 systemd[1]: systemd-journald.service: Deactivated successfully. May 27 02:47:55.366151 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 27 02:47:55.377445 kernel: loop: module loaded May 27 02:47:55.377527 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 27 02:47:55.383066 systemd[1]: Started systemd-journald.service - Journal Service. May 27 02:47:55.389626 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 27 02:47:55.392138 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 27 02:47:55.397098 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 27 02:47:55.410346 systemd[1]: modprobe@loop.service: Deactivated successfully. May 27 02:47:55.410777 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 27 02:47:55.452573 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 27 02:47:55.490320 kernel: ACPI: bus type drm_connector registered May 27 02:47:55.493552 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 27 02:47:55.495963 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 27 02:47:55.498816 systemd[1]: modprobe@drm.service: Deactivated successfully. May 27 02:47:55.499271 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. 
May 27 02:47:55.504928 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 27 02:47:55.511034 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 27 02:47:55.525593 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 27 02:47:55.535407 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 27 02:47:55.558105 kernel: loop0: detected capacity change from 0 to 61240 May 27 02:47:55.557196 systemd-tmpfiles[1544]: ACLs are not supported, ignoring. May 27 02:47:55.557220 systemd-tmpfiles[1544]: ACLs are not supported, ignoring. May 27 02:47:55.579742 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 27 02:47:55.601273 systemd-journald[1523]: Time spent on flushing to /var/log/journal/ec258f6f6144f5b6ab245ebe614e9d55 is 59.783ms for 936 entries. May 27 02:47:55.601273 systemd-journald[1523]: System Journal (/var/log/journal/ec258f6f6144f5b6ab245ebe614e9d55) is 8M, max 195.6M, 187.6M free. May 27 02:47:55.678254 systemd-journald[1523]: Received client request to flush runtime journal. May 27 02:47:55.596118 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 02:47:55.604418 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 27 02:47:55.685197 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 27 02:47:55.691145 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 27 02:47:55.720296 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 27 02:47:55.725050 kernel: loop1: detected capacity change from 0 to 207008 May 27 02:47:55.730700 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 27 02:47:55.736988 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 27 02:47:55.750004 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 27 02:47:55.772270 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 27 02:47:55.780676 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 27 02:47:55.821250 systemd-tmpfiles[1593]: ACLs are not supported, ignoring. May 27 02:47:55.821707 systemd-tmpfiles[1593]: ACLs are not supported, ignoring. May 27 02:47:55.833096 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 02:47:55.927083 kernel: loop2: detected capacity change from 0 to 107312 May 27 02:47:56.009055 kernel: loop3: detected capacity change from 0 to 138376 May 27 02:47:56.079564 kernel: loop4: detected capacity change from 0 to 61240 May 27 02:47:56.094057 kernel: loop5: detected capacity change from 0 to 207008 May 27 02:47:56.139244 kernel: loop6: detected capacity change from 0 to 107312 May 27 02:47:56.168060 kernel: loop7: detected capacity change from 0 to 138376 May 27 02:47:56.198320 (sd-merge)[1600]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'. May 27 02:47:56.200518 (sd-merge)[1600]: Merged extensions into '/usr'. May 27 02:47:56.215510 systemd[1]: Reload requested from client PID 1555 ('systemd-sysext') (unit systemd-sysext.service)... May 27 02:47:56.215668 systemd[1]: Reloading... May 27 02:47:56.360472 ldconfig[1549]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. 
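The (sd-merge) lines above show systemd-sysext merging the 'containerd-flatcar', 'docker-flatcar', 'kubernetes', and 'oem-ami' extension images into /usr; the kubernetes image is the one Ignition linked at /etc/extensions/kubernetes.raw earlier in this log. Below is a small sketch, assuming those same Flatcar paths, of how the inputs to that merge could be inspected on the running host (systemd-sysext status would show the merge result itself); it is illustrative only and not part of the boot flow.

    # Sketch only: list the extension images systemd-sysext would pick up.
    # Assumes the /etc/extensions layout shown earlier in this log.
    from pathlib import Path

    ext_dir = Path("/etc/extensions")
    if ext_dir.is_dir():
        for entry in sorted(ext_dir.iterdir()):
            target = entry.resolve() if entry.is_symlink() else entry
            # e.g. kubernetes.raw -> /opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw
            print(f"{entry.name} -> {target}")
    else:
        print("no /etc/extensions directory on this host")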
May 27 02:47:56.415173 zram_generator::config[1630]: No configuration found. May 27 02:47:56.634437 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 02:47:56.826161 systemd[1]: Reloading finished in 608 ms. May 27 02:47:56.851821 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 27 02:47:56.854758 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 27 02:47:56.857674 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 27 02:47:56.874209 systemd[1]: Starting ensure-sysext.service... May 27 02:47:56.879337 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 27 02:47:56.885374 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 02:47:56.920156 systemd[1]: Reload requested from client PID 1680 ('systemctl') (unit ensure-sysext.service)... May 27 02:47:56.920186 systemd[1]: Reloading... May 27 02:47:56.958370 systemd-tmpfiles[1681]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. May 27 02:47:56.959091 systemd-tmpfiles[1681]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. May 27 02:47:56.960005 systemd-tmpfiles[1681]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 27 02:47:56.960773 systemd-tmpfiles[1681]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 27 02:47:56.963380 systemd-tmpfiles[1681]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 27 02:47:56.965080 systemd-tmpfiles[1681]: ACLs are not supported, ignoring. May 27 02:47:56.965352 systemd-tmpfiles[1681]: ACLs are not supported, ignoring. May 27 02:47:56.984054 systemd-tmpfiles[1681]: Detected autofs mount point /boot during canonicalization of boot. May 27 02:47:56.984077 systemd-tmpfiles[1681]: Skipping /boot May 27 02:47:57.005027 systemd-udevd[1682]: Using default interface naming scheme 'v255'. May 27 02:47:57.030966 systemd-tmpfiles[1681]: Detected autofs mount point /boot during canonicalization of boot. May 27 02:47:57.030996 systemd-tmpfiles[1681]: Skipping /boot May 27 02:47:57.125808 zram_generator::config[1712]: No configuration found. May 27 02:47:57.428227 (udev-worker)[1738]: Network interface NamePolicy= disabled on kernel command line. May 27 02:47:57.476632 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 02:47:57.717827 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 27 02:47:57.719210 systemd[1]: Reloading finished in 798 ms. May 27 02:47:57.742701 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 02:47:57.766909 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 02:47:57.810835 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 27 02:47:57.816639 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... 
May 27 02:47:57.822837 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 27 02:47:57.833422 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 27 02:47:57.841046 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 27 02:47:57.888264 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 27 02:47:57.903562 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 27 02:47:57.910938 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 02:47:57.917046 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 27 02:47:57.924186 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 27 02:47:57.930873 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 27 02:47:57.934273 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 02:47:57.934545 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 02:47:57.942215 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 02:47:57.942560 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 02:47:57.942749 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 02:47:57.953615 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 02:47:57.960546 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 27 02:47:57.963208 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 02:47:57.963454 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 02:47:57.963771 systemd[1]: Reached target time-set.target - System Time Set. May 27 02:47:57.974823 systemd[1]: Finished ensure-sysext.service. May 27 02:47:57.978138 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 27 02:47:57.991217 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 27 02:47:58.022439 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 27 02:47:58.039147 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 27 02:47:58.058933 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 27 02:47:58.061821 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). 
May 27 02:47:58.105648 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 27 02:47:58.106116 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 27 02:47:58.118679 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 27 02:47:58.120405 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 27 02:47:58.123586 systemd[1]: modprobe@loop.service: Deactivated successfully. May 27 02:47:58.124969 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 27 02:47:58.136289 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 27 02:47:58.136594 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 27 02:47:58.169890 systemd[1]: modprobe@drm.service: Deactivated successfully. May 27 02:47:58.170435 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 27 02:47:58.199765 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 02:47:58.220103 augenrules[1905]: No rules May 27 02:47:58.224816 systemd[1]: audit-rules.service: Deactivated successfully. May 27 02:47:58.234000 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 27 02:47:58.436219 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. May 27 02:47:58.450095 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 02:47:58.456367 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 27 02:47:58.475517 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 27 02:47:58.500190 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 27 02:47:58.602649 systemd-networkd[1823]: lo: Link UP May 27 02:47:58.603127 systemd-networkd[1823]: lo: Gained carrier May 27 02:47:58.606086 systemd-networkd[1823]: Enumeration completed May 27 02:47:58.606413 systemd[1]: Started systemd-networkd.service - Network Configuration. May 27 02:47:58.608943 systemd-networkd[1823]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 02:47:58.609110 systemd-networkd[1823]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 02:47:58.611196 systemd-networkd[1823]: eth0: Link UP May 27 02:47:58.611744 systemd-networkd[1823]: eth0: Gained carrier May 27 02:47:58.611904 systemd-networkd[1823]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 02:47:58.612710 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 27 02:47:58.620351 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 27 02:47:58.625204 systemd-networkd[1823]: eth0: DHCPv4 address 172.31.28.205/20, gateway 172.31.16.1 acquired from 172.31.16.1 May 27 02:47:58.632454 systemd-resolved[1826]: Positive Trust Anchors: May 27 02:47:58.632493 systemd-resolved[1826]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 27 02:47:58.632577 systemd-resolved[1826]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 27 02:47:58.646456 systemd-resolved[1826]: Defaulting to hostname 'linux'. May 27 02:47:58.652079 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 27 02:47:58.655814 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 27 02:47:58.659421 systemd[1]: Reached target network.target - Network. May 27 02:47:58.661489 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 27 02:47:58.663978 systemd[1]: Reached target sysinit.target - System Initialization. May 27 02:47:58.666422 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 27 02:47:58.668981 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 27 02:47:58.671769 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 27 02:47:58.674077 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 27 02:47:58.676449 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 27 02:47:58.678907 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 27 02:47:58.679099 systemd[1]: Reached target paths.target - Path Units. May 27 02:47:58.680924 systemd[1]: Reached target timers.target - Timer Units. May 27 02:47:58.684459 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 27 02:47:58.689982 systemd[1]: Starting docker.socket - Docker Socket for the API... May 27 02:47:58.696721 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 27 02:47:58.699769 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 27 02:47:58.702450 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 27 02:47:58.716183 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 27 02:47:58.719077 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 27 02:47:58.722646 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 27 02:47:58.725116 systemd[1]: Reached target sockets.target - Socket Units. May 27 02:47:58.727189 systemd[1]: Reached target basic.target - Basic System. May 27 02:47:58.729433 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 27 02:47:58.729482 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 27 02:47:58.731598 systemd[1]: Starting containerd.service - containerd container runtime... 
May 27 02:47:58.738287 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 27 02:47:58.749645 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 27 02:47:58.755428 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 27 02:47:58.761189 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 27 02:47:58.770514 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 27 02:47:58.772578 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 27 02:47:58.778481 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 27 02:47:58.787937 systemd[1]: Started ntpd.service - Network Time Service. May 27 02:47:58.805661 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 27 02:47:58.813258 systemd[1]: Starting setup-oem.service - Setup OEM... May 27 02:47:58.823495 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 27 02:47:58.829751 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 27 02:47:58.848044 jq[1968]: false May 27 02:47:58.844768 systemd[1]: Starting systemd-logind.service - User Login Management... May 27 02:47:58.849133 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 27 02:47:58.850036 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 27 02:47:58.855285 systemd[1]: Starting update-engine.service - Update Engine... May 27 02:47:58.866333 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 27 02:47:58.902108 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 27 02:47:58.905231 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 27 02:47:58.905679 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 27 02:47:58.921413 ntpd[1971]: ntpd 4.2.8p17@1.4004-o Tue May 27 00:38:41 UTC 2025 (1): Starting May 27 02:47:58.924821 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: ntpd 4.2.8p17@1.4004-o Tue May 27 00:38:41 UTC 2025 (1): Starting May 27 02:47:58.924821 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp May 27 02:47:58.924821 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: ---------------------------------------------------- May 27 02:47:58.924821 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: ntp-4 is maintained by Network Time Foundation, May 27 02:47:58.924821 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: Inc. (NTF), a non-profit 501(c)(3) public-benefit May 27 02:47:58.924821 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: corporation. 
Support and training for ntp-4 are May 27 02:47:58.924821 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: available at https://www.nwtime.org/support May 27 02:47:58.924821 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: ---------------------------------------------------- May 27 02:47:58.921475 ntpd[1971]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp May 27 02:47:58.938525 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: proto: precision = 0.096 usec (-23) May 27 02:47:58.938578 extend-filesystems[1969]: Found loop4 May 27 02:47:58.938578 extend-filesystems[1969]: Found loop5 May 27 02:47:58.938578 extend-filesystems[1969]: Found loop6 May 27 02:47:58.938578 extend-filesystems[1969]: Found loop7 May 27 02:47:58.938578 extend-filesystems[1969]: Found nvme0n1 May 27 02:47:58.938578 extend-filesystems[1969]: Found nvme0n1p1 May 27 02:47:58.938578 extend-filesystems[1969]: Found nvme0n1p2 May 27 02:47:58.938578 extend-filesystems[1969]: Found nvme0n1p3 May 27 02:47:58.938578 extend-filesystems[1969]: Found usr May 27 02:47:58.938578 extend-filesystems[1969]: Found nvme0n1p4 May 27 02:47:58.938578 extend-filesystems[1969]: Found nvme0n1p6 May 27 02:47:58.938578 extend-filesystems[1969]: Found nvme0n1p7 May 27 02:47:58.938578 extend-filesystems[1969]: Found nvme0n1p9 May 27 02:47:58.938578 extend-filesystems[1969]: Checking size of /dev/nvme0n1p9 May 27 02:47:58.921494 ntpd[1971]: ---------------------------------------------------- May 27 02:47:58.973305 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 27 02:47:59.009076 extend-filesystems[1969]: Resized partition /dev/nvme0n1p9 May 27 02:47:59.010925 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: basedate set to 2025-05-15 May 27 02:47:59.010925 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: gps base set to 2025-05-18 (week 2367) May 27 02:47:59.010925 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: Listen and drop on 0 v6wildcard [::]:123 May 27 02:47:59.010925 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: Listen and drop on 1 v4wildcard 0.0.0.0:123 May 27 02:47:59.010925 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: Listen normally on 2 lo 127.0.0.1:123 May 27 02:47:59.010925 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: Listen normally on 3 eth0 172.31.28.205:123 May 27 02:47:59.010925 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: Listen normally on 4 lo [::1]:123 May 27 02:47:59.010925 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: bind(21) AF_INET6 fe80::40f:74ff:fe6e:4281%2#123 flags 0x11 failed: Cannot assign requested address May 27 02:47:59.010925 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: unable to create socket on eth0 (5) for fe80::40f:74ff:fe6e:4281%2#123 May 27 02:47:59.010925 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: failed to init interface for address fe80::40f:74ff:fe6e:4281%2 May 27 02:47:59.010925 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: Listening on routing socket on fd #21 for interface updates May 27 02:47:59.010925 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized May 27 02:47:59.010925 ntpd[1971]: 27 May 02:47:58 ntpd[1971]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized May 27 02:47:58.921512 ntpd[1971]: ntp-4 is maintained by Network Time Foundation, May 27 02:47:58.976157 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 27 02:47:59.011990 extend-filesystems[2005]: resize2fs 1.47.2 (1-Jan-2025) May 27 02:47:59.021349 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 1489915 blocks May 27 02:47:58.921529 ntpd[1971]: Inc. 
(NTF), a non-profit 501(c)(3) public-benefit May 27 02:47:58.921545 ntpd[1971]: corporation. Support and training for ntp-4 are May 27 02:47:58.921563 ntpd[1971]: available at https://www.nwtime.org/support May 27 02:47:58.921579 ntpd[1971]: ---------------------------------------------------- May 27 02:47:58.936746 ntpd[1971]: proto: precision = 0.096 usec (-23) May 27 02:47:58.941325 ntpd[1971]: basedate set to 2025-05-15 May 27 02:47:58.941360 ntpd[1971]: gps base set to 2025-05-18 (week 2367) May 27 02:47:58.949544 ntpd[1971]: Listen and drop on 0 v6wildcard [::]:123 May 27 02:47:58.949621 ntpd[1971]: Listen and drop on 1 v4wildcard 0.0.0.0:123 May 27 02:47:58.949901 ntpd[1971]: Listen normally on 2 lo 127.0.0.1:123 May 27 02:47:58.949968 ntpd[1971]: Listen normally on 3 eth0 172.31.28.205:123 May 27 02:47:58.950061 ntpd[1971]: Listen normally on 4 lo [::1]:123 May 27 02:47:58.950138 ntpd[1971]: bind(21) AF_INET6 fe80::40f:74ff:fe6e:4281%2#123 flags 0x11 failed: Cannot assign requested address May 27 02:47:58.950173 ntpd[1971]: unable to create socket on eth0 (5) for fe80::40f:74ff:fe6e:4281%2#123 May 27 02:47:58.950198 ntpd[1971]: failed to init interface for address fe80::40f:74ff:fe6e:4281%2 May 27 02:47:58.950250 ntpd[1971]: Listening on routing socket on fd #21 for interface updates May 27 02:47:58.971085 ntpd[1971]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized May 27 02:47:58.971137 ntpd[1971]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized May 27 02:47:59.024947 tar[1986]: linux-arm64/LICENSE May 27 02:47:59.027703 tar[1986]: linux-arm64/helm May 27 02:47:59.062081 jq[1980]: true May 27 02:47:59.137546 update_engine[1979]: I20250527 02:47:59.107146 1979 main.cc:92] Flatcar Update Engine starting May 27 02:47:59.137546 update_engine[1979]: I20250527 02:47:59.122097 1979 update_check_scheduler.cc:74] Next update check in 7m20s May 27 02:47:59.086262 dbus-daemon[1966]: [system] SELinux support is enabled May 27 02:47:59.079765 systemd[1]: motdgen.service: Deactivated successfully. May 27 02:47:59.147745 jq[2015]: true May 27 02:47:59.157155 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 1489915 May 27 02:47:59.108471 dbus-daemon[1966]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.2' (uid=244 pid=1823 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") May 27 02:47:59.080629 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 27 02:47:59.086588 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 27 02:47:59.093473 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 27 02:47:59.157973 extend-filesystems[2005]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required May 27 02:47:59.157973 extend-filesystems[2005]: old_desc_blocks = 1, new_desc_blocks = 1 May 27 02:47:59.157973 extend-filesystems[2005]: The filesystem on /dev/nvme0n1p9 is now 1489915 (4k) blocks long. May 27 02:47:59.093517 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
May 27 02:47:59.171454 extend-filesystems[1969]: Resized filesystem in /dev/nvme0n1p9 May 27 02:47:59.096076 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 27 02:47:59.096110 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 27 02:47:59.120740 (ntainerd)[2012]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 27 02:47:59.138689 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... May 27 02:47:59.141070 systemd[1]: Started update-engine.service - Update Engine. May 27 02:47:59.223199 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 27 02:47:59.226235 systemd[1]: extend-filesystems.service: Deactivated successfully. May 27 02:47:59.241939 coreos-metadata[1965]: May 27 02:47:59.241 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 May 27 02:47:59.241939 coreos-metadata[1965]: May 27 02:47:59.241 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-id: Attempt #1 May 27 02:47:59.241939 coreos-metadata[1965]: May 27 02:47:59.241 INFO Fetch successful May 27 02:47:59.241939 coreos-metadata[1965]: May 27 02:47:59.241 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-type: Attempt #1 May 27 02:47:59.241939 coreos-metadata[1965]: May 27 02:47:59.241 INFO Fetch successful May 27 02:47:59.241939 coreos-metadata[1965]: May 27 02:47:59.241 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/local-ipv4: Attempt #1 May 27 02:47:59.254136 coreos-metadata[1965]: May 27 02:47:59.249 INFO Fetch successful May 27 02:47:59.254136 coreos-metadata[1965]: May 27 02:47:59.249 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-ipv4: Attempt #1 May 27 02:47:59.254136 coreos-metadata[1965]: May 27 02:47:59.249 INFO Fetch successful May 27 02:47:59.254136 coreos-metadata[1965]: May 27 02:47:59.249 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/ipv6: Attempt #1 May 27 02:47:59.254136 coreos-metadata[1965]: May 27 02:47:59.249 INFO Fetch failed with 404: resource not found May 27 02:47:59.254136 coreos-metadata[1965]: May 27 02:47:59.249 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone: Attempt #1 May 27 02:47:59.245266 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
May 27 02:47:59.257451 coreos-metadata[1965]: May 27 02:47:59.257 INFO Fetch successful May 27 02:47:59.257451 coreos-metadata[1965]: May 27 02:47:59.257 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone-id: Attempt #1 May 27 02:47:59.257451 coreos-metadata[1965]: May 27 02:47:59.257 INFO Fetch successful May 27 02:47:59.257451 coreos-metadata[1965]: May 27 02:47:59.257 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/hostname: Attempt #1 May 27 02:47:59.257451 coreos-metadata[1965]: May 27 02:47:59.257 INFO Fetch successful May 27 02:47:59.257451 coreos-metadata[1965]: May 27 02:47:59.257 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-hostname: Attempt #1 May 27 02:47:59.257451 coreos-metadata[1965]: May 27 02:47:59.257 INFO Fetch successful May 27 02:47:59.257451 coreos-metadata[1965]: May 27 02:47:59.257 INFO Fetching http://169.254.169.254/2021-01-03/dynamic/instance-identity/document: Attempt #1 May 27 02:47:59.257451 coreos-metadata[1965]: May 27 02:47:59.257 INFO Fetch successful May 27 02:47:59.444812 systemd[1]: Finished setup-oem.service - Setup OEM. May 27 02:47:59.486255 bash[2074]: Updated "/home/core/.ssh/authorized_keys" May 27 02:47:59.497763 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 27 02:47:59.510165 systemd[1]: Starting sshkeys.service... May 27 02:47:59.533619 systemd-logind[1978]: Watching system buttons on /dev/input/event0 (Power Button) May 27 02:47:59.533672 systemd-logind[1978]: Watching system buttons on /dev/input/event1 (Sleep Button) May 27 02:47:59.545130 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 27 02:47:59.547963 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 27 02:47:59.549142 systemd-logind[1978]: New seat seat0. May 27 02:47:59.556464 systemd[1]: Started systemd-logind.service - User Login Management. May 27 02:47:59.635911 locksmithd[2024]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 27 02:47:59.674826 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. May 27 02:47:59.684159 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... May 27 02:47:59.797958 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 27 02:47:59.862269 systemd-networkd[1823]: eth0: Gained IPv6LL May 27 02:47:59.874526 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 27 02:47:59.878189 systemd[1]: Reached target network-online.target - Network is Online. May 27 02:47:59.888918 systemd[1]: Started amazon-ssm-agent.service - amazon-ssm-agent. May 27 02:47:59.904551 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 02:47:59.917824 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... 
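The coreos-metadata fetches above follow the IMDSv2 pattern: a PUT to the token endpoint first, then a GET per meta-data path with the token attached as a header. A minimal sketch of the same flow using only the Python standard library (the token TTL and the use of urllib are illustrative; the agent's own HTTP client differs):

    import urllib.request

    IMDS = "http://169.254.169.254"

    def imds_get(path, ttl=21600):
        # Step 1: obtain an IMDSv2 session token.
        req = urllib.request.Request(
            f"{IMDS}/latest/api/token",
            method="PUT",
            headers={"X-aws-ec2-metadata-token-ttl-seconds": str(ttl)},
        )
        token = urllib.request.urlopen(req, timeout=2).read().decode()
        # Step 2: fetch the requested meta-data path, presenting the token.
        req = urllib.request.Request(
            f"{IMDS}/2021-01-03/meta-data/{path}",
            headers={"X-aws-ec2-metadata-token": token},
        )
        return urllib.request.urlopen(req, timeout=2).read().decode()

    # Only works from inside an EC2 instance, e.g.: print(imds_get("instance-id"))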
May 27 02:47:59.960080 containerd[2012]: time="2025-05-27T02:47:59Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 27 02:47:59.969836 containerd[2012]: time="2025-05-27T02:47:59.969752160Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 May 27 02:48:00.069106 containerd[2012]: time="2025-05-27T02:48:00.067078437Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="16.584µs" May 27 02:48:00.075047 containerd[2012]: time="2025-05-27T02:48:00.073438593Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 27 02:48:00.075047 containerd[2012]: time="2025-05-27T02:48:00.073519761Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 27 02:48:00.075047 containerd[2012]: time="2025-05-27T02:48:00.073807785Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 27 02:48:00.075047 containerd[2012]: time="2025-05-27T02:48:00.073845525Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 27 02:48:00.075047 containerd[2012]: time="2025-05-27T02:48:00.073897653Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 27 02:48:00.075047 containerd[2012]: time="2025-05-27T02:48:00.074025837Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 27 02:48:00.075047 containerd[2012]: time="2025-05-27T02:48:00.074054961Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 27 02:48:00.075047 containerd[2012]: time="2025-05-27T02:48:00.074448525Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 27 02:48:00.075047 containerd[2012]: time="2025-05-27T02:48:00.074477061Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 27 02:48:00.075047 containerd[2012]: time="2025-05-27T02:48:00.074513397Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 27 02:48:00.075047 containerd[2012]: time="2025-05-27T02:48:00.074537349Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 27 02:48:00.075047 containerd[2012]: time="2025-05-27T02:48:00.074694069Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 27 02:48:00.087744 containerd[2012]: time="2025-05-27T02:48:00.087663597Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 27 02:48:00.087873 containerd[2012]: time="2025-05-27T02:48:00.087792477Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" 
id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 27 02:48:00.087873 containerd[2012]: time="2025-05-27T02:48:00.087820941Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 27 02:48:00.087962 containerd[2012]: time="2025-05-27T02:48:00.087901533Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 27 02:48:00.089692 containerd[2012]: time="2025-05-27T02:48:00.089628729Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 27 02:48:00.089862 containerd[2012]: time="2025-05-27T02:48:00.089832537Z" level=info msg="metadata content store policy set" policy=shared May 27 02:48:00.121107 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 27 02:48:00.139560 containerd[2012]: time="2025-05-27T02:48:00.139480029Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 27 02:48:00.139690 containerd[2012]: time="2025-05-27T02:48:00.139596429Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 27 02:48:00.139690 containerd[2012]: time="2025-05-27T02:48:00.139650777Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 27 02:48:00.139690 containerd[2012]: time="2025-05-27T02:48:00.139680897Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 27 02:48:00.139862 containerd[2012]: time="2025-05-27T02:48:00.139714065Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 27 02:48:00.139862 containerd[2012]: time="2025-05-27T02:48:00.139741449Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 27 02:48:00.139862 containerd[2012]: time="2025-05-27T02:48:00.139769949Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 27 02:48:00.139862 containerd[2012]: time="2025-05-27T02:48:00.139799157Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 27 02:48:00.139862 containerd[2012]: time="2025-05-27T02:48:00.139828209Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 27 02:48:00.139862 containerd[2012]: time="2025-05-27T02:48:00.139856421Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 27 02:48:00.140133 containerd[2012]: time="2025-05-27T02:48:00.139880337Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 27 02:48:00.140133 containerd[2012]: time="2025-05-27T02:48:00.139910769Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 27 02:48:00.140218 containerd[2012]: time="2025-05-27T02:48:00.140162169Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 27 02:48:00.140218 containerd[2012]: time="2025-05-27T02:48:00.140205141Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 27 02:48:00.140299 containerd[2012]: time="2025-05-27T02:48:00.140237493Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 27 
02:48:00.140299 containerd[2012]: time="2025-05-27T02:48:00.140264121Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 27 02:48:00.140379 containerd[2012]: time="2025-05-27T02:48:00.140292333Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 27 02:48:00.140379 containerd[2012]: time="2025-05-27T02:48:00.140327625Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 27 02:48:00.140379 containerd[2012]: time="2025-05-27T02:48:00.140355525Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 27 02:48:00.140515 containerd[2012]: time="2025-05-27T02:48:00.140381313Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 27 02:48:00.140515 containerd[2012]: time="2025-05-27T02:48:00.140410149Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 27 02:48:00.140515 containerd[2012]: time="2025-05-27T02:48:00.140442861Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 27 02:48:00.140515 containerd[2012]: time="2025-05-27T02:48:00.140471145Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 27 02:48:00.140726 containerd[2012]: time="2025-05-27T02:48:00.140630853Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 27 02:48:00.140726 containerd[2012]: time="2025-05-27T02:48:00.140663241Z" level=info msg="Start snapshots syncer" May 27 02:48:00.140726 containerd[2012]: time="2025-05-27T02:48:00.140708925Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 27 02:48:00.148054 containerd[2012]: time="2025-05-27T02:48:00.144311217Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 27 02:48:00.148054 containerd[2012]: time="2025-05-27T02:48:00.144432129Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 27 02:48:00.148376 containerd[2012]: time="2025-05-27T02:48:00.144602505Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 27 02:48:00.148376 containerd[2012]: time="2025-05-27T02:48:00.144834069Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 27 02:48:00.148376 containerd[2012]: time="2025-05-27T02:48:00.144877233Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 27 02:48:00.148376 containerd[2012]: time="2025-05-27T02:48:00.144905025Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 27 02:48:00.148376 containerd[2012]: time="2025-05-27T02:48:00.144933333Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 27 02:48:00.148376 containerd[2012]: time="2025-05-27T02:48:00.144962505Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 27 02:48:00.148376 containerd[2012]: time="2025-05-27T02:48:00.144989253Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 27 02:48:00.148376 containerd[2012]: time="2025-05-27T02:48:00.148121769Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 27 02:48:00.148376 containerd[2012]: time="2025-05-27T02:48:00.148220757Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 27 02:48:00.148376 containerd[2012]: 
time="2025-05-27T02:48:00.148251441Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 27 02:48:00.148376 containerd[2012]: time="2025-05-27T02:48:00.148280877Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 27 02:48:00.148376 containerd[2012]: time="2025-05-27T02:48:00.148375161Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 02:48:00.148973 containerd[2012]: time="2025-05-27T02:48:00.148412949Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 02:48:00.148973 containerd[2012]: time="2025-05-27T02:48:00.148435677Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 02:48:00.148973 containerd[2012]: time="2025-05-27T02:48:00.148460841Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 02:48:00.148973 containerd[2012]: time="2025-05-27T02:48:00.148484085Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 27 02:48:00.148973 containerd[2012]: time="2025-05-27T02:48:00.148508817Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 27 02:48:00.148973 containerd[2012]: time="2025-05-27T02:48:00.148536153Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 27 02:48:00.148973 containerd[2012]: time="2025-05-27T02:48:00.148606593Z" level=info msg="runtime interface created" May 27 02:48:00.148973 containerd[2012]: time="2025-05-27T02:48:00.148623969Z" level=info msg="created NRI interface" May 27 02:48:00.148973 containerd[2012]: time="2025-05-27T02:48:00.148646061Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 27 02:48:00.148973 containerd[2012]: time="2025-05-27T02:48:00.148676001Z" level=info msg="Connect containerd service" May 27 02:48:00.148973 containerd[2012]: time="2025-05-27T02:48:00.148741089Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 27 02:48:00.154290 containerd[2012]: time="2025-05-27T02:48:00.150119793Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 27 02:48:00.209050 coreos-metadata[2122]: May 27 02:48:00.204 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 May 27 02:48:00.205554 systemd[1]: Started systemd-hostnamed.service - Hostname Service. May 27 02:48:00.212104 amazon-ssm-agent[2138]: Initializing new seelog logger May 27 02:48:00.212104 amazon-ssm-agent[2138]: New Seelog Logger Creation Complete May 27 02:48:00.212104 amazon-ssm-agent[2138]: 2025/05/27 02:48:00 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. May 27 02:48:00.212104 amazon-ssm-agent[2138]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. 
May 27 02:48:00.212104 amazon-ssm-agent[2138]: 2025/05/27 02:48:00 processing appconfig overrides May 27 02:48:00.216708 dbus-daemon[1966]: [system] Successfully activated service 'org.freedesktop.hostname1' May 27 02:48:00.218307 coreos-metadata[2122]: May 27 02:48:00.217 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys: Attempt #1 May 27 02:48:00.221888 coreos-metadata[2122]: May 27 02:48:00.220 INFO Fetch successful May 27 02:48:00.221888 coreos-metadata[2122]: May 27 02:48:00.220 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key: Attempt #1 May 27 02:48:00.221296 dbus-daemon[1966]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.7' (uid=0 pid=2023 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") May 27 02:48:00.222192 amazon-ssm-agent[2138]: 2025/05/27 02:48:00 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. May 27 02:48:00.222192 amazon-ssm-agent[2138]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. May 27 02:48:00.222192 amazon-ssm-agent[2138]: 2025/05/27 02:48:00 processing appconfig overrides May 27 02:48:00.222192 amazon-ssm-agent[2138]: 2025/05/27 02:48:00 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. May 27 02:48:00.222192 amazon-ssm-agent[2138]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. May 27 02:48:00.222192 amazon-ssm-agent[2138]: 2025/05/27 02:48:00 processing appconfig overrides May 27 02:48:00.222192 amazon-ssm-agent[2138]: 2025-05-27 02:48:00.2203 INFO Proxy environment variables: May 27 02:48:00.229440 coreos-metadata[2122]: May 27 02:48:00.227 INFO Fetch successful May 27 02:48:00.234111 unknown[2122]: wrote ssh authorized keys file for user: core May 27 02:48:00.234362 systemd[1]: Starting polkit.service - Authorization Manager... May 27 02:48:00.250945 amazon-ssm-agent[2138]: 2025/05/27 02:48:00 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. May 27 02:48:00.250945 amazon-ssm-agent[2138]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. May 27 02:48:00.250945 amazon-ssm-agent[2138]: 2025/05/27 02:48:00 processing appconfig overrides May 27 02:48:00.311412 update-ssh-keys[2179]: Updated "/home/core/.ssh/authorized_keys" May 27 02:48:00.315050 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). May 27 02:48:00.321699 systemd[1]: Finished sshkeys.service. 
May 27 02:48:00.329954 amazon-ssm-agent[2138]: 2025-05-27 02:48:00.2204 INFO https_proxy: May 27 02:48:00.450088 amazon-ssm-agent[2138]: 2025-05-27 02:48:00.2204 INFO http_proxy: May 27 02:48:00.550050 amazon-ssm-agent[2138]: 2025-05-27 02:48:00.2204 INFO no_proxy: May 27 02:48:00.614613 containerd[2012]: time="2025-05-27T02:48:00.614435448Z" level=info msg="Start subscribing containerd event" May 27 02:48:00.614838 containerd[2012]: time="2025-05-27T02:48:00.614795508Z" level=info msg="Start recovering state" May 27 02:48:00.616238 containerd[2012]: time="2025-05-27T02:48:00.616202412Z" level=info msg="Start event monitor" May 27 02:48:00.616381 containerd[2012]: time="2025-05-27T02:48:00.616356468Z" level=info msg="Start cni network conf syncer for default" May 27 02:48:00.616514 containerd[2012]: time="2025-05-27T02:48:00.616488444Z" level=info msg="Start streaming server" May 27 02:48:00.616646 containerd[2012]: time="2025-05-27T02:48:00.616623276Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 27 02:48:00.616758 containerd[2012]: time="2025-05-27T02:48:00.616734564Z" level=info msg="runtime interface starting up..." May 27 02:48:00.616866 containerd[2012]: time="2025-05-27T02:48:00.616843404Z" level=info msg="starting plugins..." May 27 02:48:00.617365 containerd[2012]: time="2025-05-27T02:48:00.616996020Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 27 02:48:00.617872 containerd[2012]: time="2025-05-27T02:48:00.616400556Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 27 02:48:00.618922 containerd[2012]: time="2025-05-27T02:48:00.618591468Z" level=info msg=serving... address=/run/containerd/containerd.sock May 27 02:48:00.618922 containerd[2012]: time="2025-05-27T02:48:00.618765024Z" level=info msg="containerd successfully booted in 0.662051s" May 27 02:48:00.618861 systemd[1]: Started containerd.service - containerd container runtime. May 27 02:48:00.656142 amazon-ssm-agent[2138]: 2025-05-27 02:48:00.2206 INFO Checking if agent identity type OnPrem can be assumed May 27 02:48:00.755688 amazon-ssm-agent[2138]: 2025-05-27 02:48:00.2207 INFO Checking if agent identity type EC2 can be assumed May 27 02:48:00.816614 polkitd[2175]: Started polkitd version 126 May 27 02:48:00.842243 polkitd[2175]: Loading rules from directory /etc/polkit-1/rules.d May 27 02:48:00.846005 polkitd[2175]: Loading rules from directory /run/polkit-1/rules.d May 27 02:48:00.848414 polkitd[2175]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) May 27 02:48:00.851479 polkitd[2175]: Loading rules from directory /usr/local/share/polkit-1/rules.d May 27 02:48:00.853456 polkitd[2175]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) May 27 02:48:00.853551 polkitd[2175]: Loading rules from directory /usr/share/polkit-1/rules.d May 27 02:48:00.854995 amazon-ssm-agent[2138]: 2025-05-27 02:48:00.4360 INFO Agent will take identity from EC2 May 27 02:48:00.859102 polkitd[2175]: Finished loading, compiling and executing 2 rules May 27 02:48:00.860154 systemd[1]: Started polkit.service - Authorization Manager. 
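polkitd above walks several rules.d directories (two of which do not exist on this image) and ends up compiling 2 rules. Rule files are small JavaScript snippets; a hedged sketch of what one looks like, kept as a Python string so the example stays in one language (the action ID and group are chosen for illustration, not taken from this host):

    rule_js = """\
    // Illustrative /etc/polkit-1/rules.d/49-example.rules -- hypothetical, not from this host.
    polkit.addRule(function(action, subject) {
        // Allow members of "wheel" to manage systemd units without extra authentication.
        if (action.id == "org.freedesktop.systemd1.manage-units" &&
            subject.isInGroup("wheel")) {
            return polkit.Result.YES;
        }
    });
    """
    print(rule_js)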
May 27 02:48:00.864059 dbus-daemon[1966]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' May 27 02:48:00.867625 polkitd[2175]: Acquired the name org.freedesktop.PolicyKit1 on the system bus May 27 02:48:00.923956 systemd-hostnamed[2023]: Hostname set to (transient) May 27 02:48:00.925079 systemd-resolved[1826]: System hostname changed to 'ip-172-31-28-205'. May 27 02:48:00.954766 amazon-ssm-agent[2138]: 2025-05-27 02:48:00.4378 INFO [amazon-ssm-agent] amazon-ssm-agent - v3.3.0.0 May 27 02:48:01.053909 amazon-ssm-agent[2138]: 2025-05-27 02:48:00.4378 INFO [amazon-ssm-agent] OS: linux, Arch: arm64 May 27 02:48:01.137759 sshd_keygen[2007]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 27 02:48:01.153981 amazon-ssm-agent[2138]: 2025-05-27 02:48:00.4378 INFO [amazon-ssm-agent] Starting Core Agent May 27 02:48:01.244900 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 27 02:48:01.253237 amazon-ssm-agent[2138]: 2025-05-27 02:48:00.4378 INFO [amazon-ssm-agent] Registrar detected. Attempting registration May 27 02:48:01.256623 systemd[1]: Starting issuegen.service - Generate /run/issue... May 27 02:48:01.261561 systemd[1]: Started sshd@0-172.31.28.205:22-139.178.68.195:56966.service - OpenSSH per-connection server daemon (139.178.68.195:56966). May 27 02:48:01.322900 systemd[1]: issuegen.service: Deactivated successfully. May 27 02:48:01.323419 systemd[1]: Finished issuegen.service - Generate /run/issue. May 27 02:48:01.333436 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 27 02:48:01.354287 amazon-ssm-agent[2138]: 2025-05-27 02:48:00.4378 INFO [Registrar] Starting registrar module May 27 02:48:01.386494 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 27 02:48:01.395557 systemd[1]: Started getty@tty1.service - Getty on tty1. May 27 02:48:01.402530 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 27 02:48:01.406921 systemd[1]: Reached target getty.target - Login Prompts. May 27 02:48:01.454123 amazon-ssm-agent[2138]: 2025-05-27 02:48:00.4398 INFO [EC2Identity] Checking disk for registration info May 27 02:48:01.553561 amazon-ssm-agent[2138]: 2025-05-27 02:48:00.4399 INFO [EC2Identity] No registration info found for ec2 instance, attempting registration May 27 02:48:01.555150 tar[1986]: linux-arm64/README.md May 27 02:48:01.574395 sshd[2212]: Accepted publickey for core from 139.178.68.195 port 56966 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:48:01.584218 sshd-session[2212]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:48:01.590884 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 27 02:48:01.612934 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 27 02:48:01.617502 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 27 02:48:01.653457 systemd-logind[1978]: New session 1 of user core. May 27 02:48:01.658782 amazon-ssm-agent[2138]: 2025-05-27 02:48:00.4399 INFO [EC2Identity] Generating registration keypair May 27 02:48:01.674778 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 27 02:48:01.686472 systemd[1]: Starting user@500.service - User Manager for UID 500... May 27 02:48:01.711344 (systemd)[2226]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 27 02:48:01.721386 systemd-logind[1978]: New session c1 of user core. 
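The "Accepted publickey for core ... RSA SHA256:wB7D..." entry above identifies the key by its OpenSSH fingerprint: the SHA-256 digest of the raw public-key blob, base64-encoded with trailing padding stripped. A small sketch reproducing that format from an authorized_keys line:

    import base64, hashlib

    def ssh_fingerprint(authorized_keys_line: str) -> str:
        # authorized_keys format: "<key-type> <base64-blob> [comment]"
        blob_b64 = authorized_keys_line.split()[1]
        digest = hashlib.sha256(base64.b64decode(blob_b64)).digest()
        return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

    # Example with a made-up key line (the key itself is not from this log):
    # print(ssh_fingerprint("ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGx0... core@host"))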
May 27 02:48:01.922146 ntpd[1971]: Listen normally on 6 eth0 [fe80::40f:74ff:fe6e:4281%2]:123 May 27 02:48:01.923086 ntpd[1971]: 27 May 02:48:01 ntpd[1971]: Listen normally on 6 eth0 [fe80::40f:74ff:fe6e:4281%2]:123 May 27 02:48:01.930771 amazon-ssm-agent[2138]: 2025-05-27 02:48:01.9305 INFO [EC2Identity] Checking write access before registering May 27 02:48:01.985750 amazon-ssm-agent[2138]: 2025/05/27 02:48:01 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. May 27 02:48:01.985750 amazon-ssm-agent[2138]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. May 27 02:48:01.985750 amazon-ssm-agent[2138]: 2025/05/27 02:48:01 processing appconfig overrides May 27 02:48:02.020112 amazon-ssm-agent[2138]: 2025-05-27 02:48:01.9332 INFO [EC2Identity] Registering EC2 instance with Systems Manager May 27 02:48:02.020112 amazon-ssm-agent[2138]: 2025-05-27 02:48:01.9837 INFO [EC2Identity] EC2 registration was successful. May 27 02:48:02.020112 amazon-ssm-agent[2138]: 2025-05-27 02:48:01.9848 INFO [amazon-ssm-agent] Registration attempted. Resuming core agent startup. May 27 02:48:02.020112 amazon-ssm-agent[2138]: 2025-05-27 02:48:01.9850 INFO [CredentialRefresher] credentialRefresher has started May 27 02:48:02.020112 amazon-ssm-agent[2138]: 2025-05-27 02:48:01.9850 INFO [CredentialRefresher] Starting credentials refresher loop May 27 02:48:02.020112 amazon-ssm-agent[2138]: 2025-05-27 02:48:02.0179 INFO EC2RoleProvider Successfully connected with instance profile role credentials May 27 02:48:02.020112 amazon-ssm-agent[2138]: 2025-05-27 02:48:02.0196 INFO [CredentialRefresher] Credentials ready May 27 02:48:02.031597 amazon-ssm-agent[2138]: 2025-05-27 02:48:02.0199 INFO [CredentialRefresher] Next credential rotation will be in 29.9999672803 minutes May 27 02:48:02.060387 systemd[2226]: Queued start job for default target default.target. May 27 02:48:02.068752 systemd[2226]: Created slice app.slice - User Application Slice. May 27 02:48:02.068816 systemd[2226]: Reached target paths.target - Paths. May 27 02:48:02.068904 systemd[2226]: Reached target timers.target - Timers. May 27 02:48:02.073184 systemd[2226]: Starting dbus.socket - D-Bus User Message Bus Socket... May 27 02:48:02.092644 systemd[2226]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 27 02:48:02.092772 systemd[2226]: Reached target sockets.target - Sockets. May 27 02:48:02.092856 systemd[2226]: Reached target basic.target - Basic System. May 27 02:48:02.092938 systemd[2226]: Reached target default.target - Main User Target. May 27 02:48:02.092998 systemd[2226]: Startup finished in 350ms. May 27 02:48:02.094202 systemd[1]: Started user@500.service - User Manager for UID 500. May 27 02:48:02.102328 systemd[1]: Started session-1.scope - Session 1 of User core. May 27 02:48:02.266588 systemd[1]: Started sshd@1-172.31.28.205:22-139.178.68.195:56974.service - OpenSSH per-connection server daemon (139.178.68.195:56974). May 27 02:48:02.399194 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 02:48:02.402497 systemd[1]: Reached target multi-user.target - Multi-User System. May 27 02:48:02.405542 systemd[1]: Startup finished in 3.802s (kernel) + 8.989s (initrd) + 8.791s (userspace) = 21.583s. 
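The "Startup finished" line above gives per-phase times and a total. systemd derives the total from the unrounded microsecond counters, so the rounded per-phase figures need not add up to the last millisecond, which is likely why they sum to 21.582s here rather than the reported 21.583s:

    phases = {"kernel": 3.802, "initrd": 8.989, "userspace": 8.791}   # seconds, as printed
    print(round(sum(phases.values()), 3))                             # 21.582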
May 27 02:48:02.424242 (kubelet)[2245]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 02:48:02.493736 sshd[2238]: Accepted publickey for core from 139.178.68.195 port 56974 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:48:02.496875 sshd-session[2238]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:48:02.508777 systemd-logind[1978]: New session 2 of user core. May 27 02:48:02.514326 systemd[1]: Started session-2.scope - Session 2 of User core. May 27 02:48:02.645308 sshd[2250]: Connection closed by 139.178.68.195 port 56974 May 27 02:48:02.644291 sshd-session[2238]: pam_unix(sshd:session): session closed for user core May 27 02:48:02.652496 systemd[1]: sshd@1-172.31.28.205:22-139.178.68.195:56974.service: Deactivated successfully. May 27 02:48:02.655866 systemd[1]: session-2.scope: Deactivated successfully. May 27 02:48:02.657833 systemd-logind[1978]: Session 2 logged out. Waiting for processes to exit. May 27 02:48:02.662260 systemd-logind[1978]: Removed session 2. May 27 02:48:02.677441 systemd[1]: Started sshd@2-172.31.28.205:22-139.178.68.195:56986.service - OpenSSH per-connection server daemon (139.178.68.195:56986). May 27 02:48:02.912844 sshd[2260]: Accepted publickey for core from 139.178.68.195 port 56986 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:48:02.915192 sshd-session[2260]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:48:02.924703 systemd-logind[1978]: New session 3 of user core. May 27 02:48:02.936308 systemd[1]: Started session-3.scope - Session 3 of User core. May 27 02:48:03.054994 sshd[2262]: Connection closed by 139.178.68.195 port 56986 May 27 02:48:03.055755 sshd-session[2260]: pam_unix(sshd:session): session closed for user core May 27 02:48:03.057465 amazon-ssm-agent[2138]: 2025-05-27 02:48:03.0571 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker is not running, starting worker process May 27 02:48:03.065762 systemd[1]: sshd@2-172.31.28.205:22-139.178.68.195:56986.service: Deactivated successfully. May 27 02:48:03.067117 systemd-logind[1978]: Session 3 logged out. Waiting for processes to exit. May 27 02:48:03.071886 systemd[1]: session-3.scope: Deactivated successfully. May 27 02:48:03.096152 systemd-logind[1978]: Removed session 3. May 27 02:48:03.096585 systemd[1]: Started sshd@3-172.31.28.205:22-139.178.68.195:56992.service - OpenSSH per-connection server daemon (139.178.68.195:56992). May 27 02:48:03.160041 amazon-ssm-agent[2138]: 2025-05-27 02:48:03.0688 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker (pid:2266) started May 27 02:48:03.260332 amazon-ssm-agent[2138]: 2025-05-27 02:48:03.0702 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] Monitor long running worker health every 60 seconds May 27 02:48:03.307656 sshd[2271]: Accepted publickey for core from 139.178.68.195 port 56992 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:48:03.310809 sshd-session[2271]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:48:03.322103 systemd-logind[1978]: New session 4 of user core. May 27 02:48:03.328321 systemd[1]: Started session-4.scope - Session 4 of User core. 
May 27 02:48:03.456140 kubelet[2245]: E0527 02:48:03.456079 2245 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 02:48:03.460107 sshd[2280]: Connection closed by 139.178.68.195 port 56992 May 27 02:48:03.459637 sshd-session[2271]: pam_unix(sshd:session): session closed for user core May 27 02:48:03.461604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 02:48:03.463313 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 02:48:03.468654 systemd[1]: kubelet.service: Consumed 1.419s CPU time, 258M memory peak. May 27 02:48:03.475813 systemd[1]: sshd@3-172.31.28.205:22-139.178.68.195:56992.service: Deactivated successfully. May 27 02:48:03.479950 systemd[1]: session-4.scope: Deactivated successfully. May 27 02:48:03.503349 systemd-logind[1978]: Session 4 logged out. Waiting for processes to exit. May 27 02:48:03.505335 systemd[1]: Started sshd@4-172.31.28.205:22-139.178.68.195:57008.service - OpenSSH per-connection server daemon (139.178.68.195:57008). May 27 02:48:03.507941 systemd-logind[1978]: Removed session 4. May 27 02:48:03.699675 sshd[2290]: Accepted publickey for core from 139.178.68.195 port 57008 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:48:03.702150 sshd-session[2290]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:48:03.709947 systemd-logind[1978]: New session 5 of user core. May 27 02:48:03.721260 systemd[1]: Started session-5.scope - Session 5 of User core. May 27 02:48:03.834943 sudo[2293]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 27 02:48:03.836180 sudo[2293]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 02:48:03.854507 sudo[2293]: pam_unix(sudo:session): session closed for user root May 27 02:48:03.877056 sshd[2292]: Connection closed by 139.178.68.195 port 57008 May 27 02:48:03.877283 sshd-session[2290]: pam_unix(sshd:session): session closed for user core May 27 02:48:03.884448 systemd[1]: sshd@4-172.31.28.205:22-139.178.68.195:57008.service: Deactivated successfully. May 27 02:48:03.887974 systemd[1]: session-5.scope: Deactivated successfully. May 27 02:48:03.892126 systemd-logind[1978]: Session 5 logged out. Waiting for processes to exit. May 27 02:48:03.894992 systemd-logind[1978]: Removed session 5. May 27 02:48:03.918433 systemd[1]: Started sshd@5-172.31.28.205:22-139.178.68.195:46266.service - OpenSSH per-connection server daemon (139.178.68.195:46266). May 27 02:48:04.119258 sshd[2299]: Accepted publickey for core from 139.178.68.195 port 46266 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:48:04.121748 sshd-session[2299]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:48:04.129588 systemd-logind[1978]: New session 6 of user core. May 27 02:48:04.149299 systemd[1]: Started session-6.scope - Session 6 of User core. 
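The kubelet failure above is the expected state before cluster bootstrap: /var/lib/kubelet/config.yaml does not exist yet, and on a kubeadm-style node it is normally written by kubeadm init or kubeadm join, which has not run at this point in the boot. For orientation, a minimal, purely illustrative KubeletConfiguration of the general shape that ends up at that path (values are generic placeholders, not this node's eventual config):

    import textwrap

    # Illustrative only -- not taken from this host; the real file comes from the bootstrap tooling.
    config_yaml = textwrap.dedent("""\
        apiVersion: kubelet.config.k8s.io/v1beta1
        kind: KubeletConfiguration
        cgroupDriver: systemd            # matches the driver the CRI runtime reports later in this log
        staticPodPath: /etc/kubernetes/manifests
        clusterDomain: cluster.local
        clusterDNS:
          - 10.96.0.10
        authentication:
          anonymous:
            enabled: false
        """)
    print(config_yaml)                   # on a real node this would be written to /var/lib/kubelet/config.yaml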
May 27 02:48:04.253952 sudo[2303]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 27 02:48:04.254595 sudo[2303]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 02:48:04.263292 sudo[2303]: pam_unix(sudo:session): session closed for user root May 27 02:48:04.272885 sudo[2302]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 27 02:48:04.273982 sudo[2302]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 02:48:04.290646 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 27 02:48:04.360920 augenrules[2325]: No rules May 27 02:48:04.363802 systemd[1]: audit-rules.service: Deactivated successfully. May 27 02:48:04.365181 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 27 02:48:04.368183 sudo[2302]: pam_unix(sudo:session): session closed for user root May 27 02:48:04.392501 sshd[2301]: Connection closed by 139.178.68.195 port 46266 May 27 02:48:04.393476 sshd-session[2299]: pam_unix(sshd:session): session closed for user core May 27 02:48:04.400612 systemd-logind[1978]: Session 6 logged out. Waiting for processes to exit. May 27 02:48:04.401860 systemd[1]: sshd@5-172.31.28.205:22-139.178.68.195:46266.service: Deactivated successfully. May 27 02:48:04.404688 systemd[1]: session-6.scope: Deactivated successfully. May 27 02:48:04.412132 systemd-logind[1978]: Removed session 6. May 27 02:48:04.426956 systemd[1]: Started sshd@6-172.31.28.205:22-139.178.68.195:46272.service - OpenSSH per-connection server daemon (139.178.68.195:46272). May 27 02:48:04.634371 sshd[2334]: Accepted publickey for core from 139.178.68.195 port 46272 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:48:04.637288 sshd-session[2334]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:48:04.646637 systemd-logind[1978]: New session 7 of user core. May 27 02:48:04.653297 systemd[1]: Started session-7.scope - Session 7 of User core. May 27 02:48:04.755242 sudo[2337]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 27 02:48:04.755819 sudo[2337]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 02:48:05.258767 systemd[1]: Starting docker.service - Docker Application Container Engine... May 27 02:48:05.289516 (dockerd)[2355]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 27 02:48:05.657208 dockerd[2355]: time="2025-05-27T02:48:05.655428413Z" level=info msg="Starting up" May 27 02:48:05.661848 dockerd[2355]: time="2025-05-27T02:48:05.661755857Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 27 02:48:05.725875 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1949227-merged.mount: Deactivated successfully. May 27 02:48:05.857368 dockerd[2355]: time="2025-05-27T02:48:05.856968258Z" level=info msg="Loading containers: start." May 27 02:48:05.874271 kernel: Initializing XFRM netlink socket May 27 02:48:05.579715 systemd-resolved[1826]: Clock change detected. Flushing caches. May 27 02:48:05.587589 systemd-journald[1523]: Time jumped backwards, rotating. May 27 02:48:05.840834 (udev-worker)[2381]: Network interface NamePolicy= disabled on kernel command line. 
May 27 02:48:05.915191 systemd-networkd[1823]: docker0: Link UP May 27 02:48:05.919828 dockerd[2355]: time="2025-05-27T02:48:05.919740433Z" level=info msg="Loading containers: done." May 27 02:48:05.945813 dockerd[2355]: time="2025-05-27T02:48:05.945493609Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 27 02:48:05.945813 dockerd[2355]: time="2025-05-27T02:48:05.945620401Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 27 02:48:05.946071 dockerd[2355]: time="2025-05-27T02:48:05.946045549Z" level=info msg="Initializing buildkit" May 27 02:48:05.987970 dockerd[2355]: time="2025-05-27T02:48:05.987910538Z" level=info msg="Completed buildkit initialization" May 27 02:48:06.005453 dockerd[2355]: time="2025-05-27T02:48:06.005361010Z" level=info msg="Daemon has completed initialization" May 27 02:48:06.006824 dockerd[2355]: time="2025-05-27T02:48:06.005628214Z" level=info msg="API listen on /run/docker.sock" May 27 02:48:06.007129 systemd[1]: Started docker.service - Docker Application Container Engine. May 27 02:48:06.372637 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck4100597521-merged.mount: Deactivated successfully. May 27 02:48:07.119550 containerd[2012]: time="2025-05-27T02:48:07.119478623Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.5\"" May 27 02:48:07.731059 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount457449795.mount: Deactivated successfully. May 27 02:48:08.969710 containerd[2012]: time="2025-05-27T02:48:08.968172604Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:08.970227 containerd[2012]: time="2025-05-27T02:48:08.970057576Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.5: active requests=0, bytes read=26326311" May 27 02:48:08.970606 containerd[2012]: time="2025-05-27T02:48:08.970568068Z" level=info msg="ImageCreate event name:\"sha256:42968274c3d27c41cdc146f5442f122c1c74960e299c13e2f348d2fe835a9134\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:08.975327 containerd[2012]: time="2025-05-27T02:48:08.975277024Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0bee1bf751fe06009678c0cde7545443ba3a8d2edf71cea4c69cbb5774b9bf47\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:08.977267 containerd[2012]: time="2025-05-27T02:48:08.977196892Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.5\" with image id \"sha256:42968274c3d27c41cdc146f5442f122c1c74960e299c13e2f348d2fe835a9134\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0bee1bf751fe06009678c0cde7545443ba3a8d2edf71cea4c69cbb5774b9bf47\", size \"26323111\" in 1.857653805s" May 27 02:48:08.977371 containerd[2012]: time="2025-05-27T02:48:08.977269432Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.5\" returns image reference \"sha256:42968274c3d27c41cdc146f5442f122c1c74960e299c13e2f348d2fe835a9134\"" May 27 02:48:08.978030 containerd[2012]: time="2025-05-27T02:48:08.977980636Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.5\"" May 27 02:48:10.305644 containerd[2012]: 
time="2025-05-27T02:48:10.305565435Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:10.308719 containerd[2012]: time="2025-05-27T02:48:10.308665491Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.5: active requests=0, bytes read=22530547" May 27 02:48:10.309647 containerd[2012]: time="2025-05-27T02:48:10.309607227Z" level=info msg="ImageCreate event name:\"sha256:82042044d6ea1f1e5afda9c7351883800adbde447314786c4e5a2fd9e42aab09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:10.315612 containerd[2012]: time="2025-05-27T02:48:10.315485451Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:79bcf2f5e614c336c02dcea9dfcdf485d7297aed6a21239a99c87f7164f9baca\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:10.317123 containerd[2012]: time="2025-05-27T02:48:10.316916667Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.5\" with image id \"sha256:82042044d6ea1f1e5afda9c7351883800adbde447314786c4e5a2fd9e42aab09\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:79bcf2f5e614c336c02dcea9dfcdf485d7297aed6a21239a99c87f7164f9baca\", size \"24066313\" in 1.338879091s" May 27 02:48:10.317123 containerd[2012]: time="2025-05-27T02:48:10.316966695Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.5\" returns image reference \"sha256:82042044d6ea1f1e5afda9c7351883800adbde447314786c4e5a2fd9e42aab09\"" May 27 02:48:10.318018 containerd[2012]: time="2025-05-27T02:48:10.317888511Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.5\"" May 27 02:48:11.478888 containerd[2012]: time="2025-05-27T02:48:11.478813805Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:11.481784 containerd[2012]: time="2025-05-27T02:48:11.481708853Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.5: active requests=0, bytes read=17484190" May 27 02:48:11.483934 containerd[2012]: time="2025-05-27T02:48:11.483864497Z" level=info msg="ImageCreate event name:\"sha256:e149336437f90109dad736c8a42e4b73c137a66579be8f3b9a456bcc62af3f9b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:11.487514 containerd[2012]: time="2025-05-27T02:48:11.487468445Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f0f39d8b9808c407cacb3a46a5a9ce4d4a4a7cf3b674ba4bd221f5bc90051d2a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:11.489603 containerd[2012]: time="2025-05-27T02:48:11.489409445Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.5\" with image id \"sha256:e149336437f90109dad736c8a42e4b73c137a66579be8f3b9a456bcc62af3f9b\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f0f39d8b9808c407cacb3a46a5a9ce4d4a4a7cf3b674ba4bd221f5bc90051d2a\", size \"19019974\" in 1.17144825s" May 27 02:48:11.489603 containerd[2012]: time="2025-05-27T02:48:11.489462533Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.5\" returns image reference \"sha256:e149336437f90109dad736c8a42e4b73c137a66579be8f3b9a456bcc62af3f9b\"" May 27 02:48:11.490372 containerd[2012]: time="2025-05-27T02:48:11.490012205Z" 
level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.5\"" May 27 02:48:12.790697 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2022261907.mount: Deactivated successfully. May 27 02:48:13.324582 containerd[2012]: time="2025-05-27T02:48:13.324498954Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:13.326378 containerd[2012]: time="2025-05-27T02:48:13.326312994Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.5: active requests=0, bytes read=27377375" May 27 02:48:13.327988 containerd[2012]: time="2025-05-27T02:48:13.327907458Z" level=info msg="ImageCreate event name:\"sha256:69b7afc06f22edcae3b6a7d80cdacb488a5415fd605e89534679e5ebc41375fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:13.331499 containerd[2012]: time="2025-05-27T02:48:13.331423278Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:9dc6553459c3319525ba4090a780db1a133d5dee68c08e07f9b9d6ba83b42a0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:13.332690 containerd[2012]: time="2025-05-27T02:48:13.332496462Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.5\" with image id \"sha256:69b7afc06f22edcae3b6a7d80cdacb488a5415fd605e89534679e5ebc41375fc\", repo tag \"registry.k8s.io/kube-proxy:v1.32.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:9dc6553459c3319525ba4090a780db1a133d5dee68c08e07f9b9d6ba83b42a0b\", size \"27376394\" in 1.842435657s" May 27 02:48:13.332690 containerd[2012]: time="2025-05-27T02:48:13.332551374Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.5\" returns image reference \"sha256:69b7afc06f22edcae3b6a7d80cdacb488a5415fd605e89534679e5ebc41375fc\"" May 27 02:48:13.333425 containerd[2012]: time="2025-05-27T02:48:13.333098886Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" May 27 02:48:13.369261 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 27 02:48:13.371928 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 02:48:13.724291 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 02:48:13.738259 (kubelet)[2641]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 02:48:13.827609 kubelet[2641]: E0527 02:48:13.827502 2641 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 02:48:13.836636 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 02:48:13.838069 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 02:48:13.839097 systemd[1]: kubelet.service: Consumed 314ms CPU time, 105.9M memory peak. May 27 02:48:13.875743 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3030810524.mount: Deactivated successfully. 
May 27 02:48:15.098095 containerd[2012]: time="2025-05-27T02:48:15.098037055Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:15.100368 containerd[2012]: time="2025-05-27T02:48:15.100323967Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951622" May 27 02:48:15.102169 containerd[2012]: time="2025-05-27T02:48:15.102128731Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:15.108002 containerd[2012]: time="2025-05-27T02:48:15.107945623Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:15.114602 containerd[2012]: time="2025-05-27T02:48:15.114508783Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.781358393s" May 27 02:48:15.114602 containerd[2012]: time="2025-05-27T02:48:15.114599719Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" May 27 02:48:15.116510 containerd[2012]: time="2025-05-27T02:48:15.116204827Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 27 02:48:15.637748 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3650450608.mount: Deactivated successfully. 
May 27 02:48:15.652264 containerd[2012]: time="2025-05-27T02:48:15.652183486Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 02:48:15.653784 containerd[2012]: time="2025-05-27T02:48:15.653688634Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703" May 27 02:48:15.656744 containerd[2012]: time="2025-05-27T02:48:15.656667862Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 02:48:15.662813 containerd[2012]: time="2025-05-27T02:48:15.662690986Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 02:48:15.664798 containerd[2012]: time="2025-05-27T02:48:15.664139086Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 547.609323ms" May 27 02:48:15.664798 containerd[2012]: time="2025-05-27T02:48:15.664195006Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" May 27 02:48:15.665061 containerd[2012]: time="2025-05-27T02:48:15.665014486Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" May 27 02:48:16.240143 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1914552989.mount: Deactivated successfully. 
May 27 02:48:18.268743 containerd[2012]: time="2025-05-27T02:48:18.268663031Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:18.271300 containerd[2012]: time="2025-05-27T02:48:18.271231907Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67812469" May 27 02:48:18.274085 containerd[2012]: time="2025-05-27T02:48:18.274011515Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:18.279733 containerd[2012]: time="2025-05-27T02:48:18.279653915Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:18.281941 containerd[2012]: time="2025-05-27T02:48:18.281718575Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 2.616651385s" May 27 02:48:18.281941 containerd[2012]: time="2025-05-27T02:48:18.281795027Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" May 27 02:48:23.993214 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 27 02:48:23.996416 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 02:48:24.028314 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 27 02:48:24.028493 systemd[1]: kubelet.service: Failed with result 'signal'. May 27 02:48:24.030825 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 02:48:24.035490 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 02:48:24.085855 systemd[1]: Reload requested from client PID 2789 ('systemctl') (unit session-7.scope)... May 27 02:48:24.086039 systemd[1]: Reloading... May 27 02:48:24.327821 zram_generator::config[2837]: No configuration found. May 27 02:48:24.525000 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 02:48:24.787346 systemd[1]: Reloading finished in 700 ms. May 27 02:48:24.887849 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 27 02:48:24.888187 systemd[1]: kubelet.service: Failed with result 'signal'. May 27 02:48:24.888851 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 02:48:24.889072 systemd[1]: kubelet.service: Consumed 218ms CPU time, 94.9M memory peak. May 27 02:48:24.892467 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 02:48:25.231294 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 27 02:48:25.247338 (kubelet)[2897]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 02:48:25.321780 kubelet[2897]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 02:48:25.322219 kubelet[2897]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 27 02:48:25.322219 kubelet[2897]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 02:48:25.322219 kubelet[2897]: I0527 02:48:25.322005 2897 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 02:48:26.908897 kubelet[2897]: I0527 02:48:26.908829 2897 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" May 27 02:48:26.909848 kubelet[2897]: I0527 02:48:26.909502 2897 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 02:48:26.910226 kubelet[2897]: I0527 02:48:26.910181 2897 server.go:954] "Client rotation is on, will bootstrap in background" May 27 02:48:26.962761 kubelet[2897]: E0527 02:48:26.962692 2897 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://172.31.28.205:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.31.28.205:6443: connect: connection refused" logger="UnhandledError" May 27 02:48:26.965003 kubelet[2897]: I0527 02:48:26.964807 2897 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 02:48:26.978526 kubelet[2897]: I0527 02:48:26.978431 2897 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 02:48:26.991128 kubelet[2897]: I0527 02:48:26.990564 2897 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 27 02:48:26.992161 kubelet[2897]: I0527 02:48:26.992086 2897 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 02:48:26.992453 kubelet[2897]: I0527 02:48:26.992158 2897 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-28-205","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 02:48:26.992616 kubelet[2897]: I0527 02:48:26.992489 2897 topology_manager.go:138] "Creating topology manager with none policy" May 27 02:48:26.992616 kubelet[2897]: I0527 02:48:26.992509 2897 container_manager_linux.go:304] "Creating device plugin manager" May 27 02:48:26.992770 kubelet[2897]: I0527 02:48:26.992730 2897 state_mem.go:36] "Initialized new in-memory state store" May 27 02:48:26.998503 kubelet[2897]: I0527 02:48:26.998434 2897 kubelet.go:446] "Attempting to sync node with API server" May 27 02:48:26.999093 kubelet[2897]: I0527 02:48:26.998648 2897 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 02:48:26.999093 kubelet[2897]: I0527 02:48:26.998698 2897 kubelet.go:352] "Adding apiserver pod source" May 27 02:48:26.999093 kubelet[2897]: I0527 02:48:26.998723 2897 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 02:48:27.005383 kubelet[2897]: W0527 02:48:27.005288 2897 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.28.205:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-28-205&limit=500&resourceVersion=0": dial tcp 172.31.28.205:6443: connect: connection refused May 27 02:48:27.005535 kubelet[2897]: E0527 02:48:27.005396 2897 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.31.28.205:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-28-205&limit=500&resourceVersion=0\": dial tcp 172.31.28.205:6443: connect: connection refused" logger="UnhandledError" May 27 02:48:27.006718 kubelet[2897]: I0527 
02:48:27.006668 2897 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 02:48:27.007527 kubelet[2897]: I0527 02:48:27.007484 2897 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 27 02:48:27.008449 kubelet[2897]: W0527 02:48:27.007622 2897 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 27 02:48:27.008842 kubelet[2897]: I0527 02:48:27.008808 2897 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 27 02:48:27.008926 kubelet[2897]: I0527 02:48:27.008870 2897 server.go:1287] "Started kubelet" May 27 02:48:27.008992 kubelet[2897]: W0527 02:48:27.008940 2897 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.28.205:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.31.28.205:6443: connect: connection refused May 27 02:48:27.009094 kubelet[2897]: E0527 02:48:27.009023 2897 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.31.28.205:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.28.205:6443: connect: connection refused" logger="UnhandledError" May 27 02:48:27.017827 kubelet[2897]: E0527 02:48:27.017285 2897 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.31.28.205:6443/api/v1/namespaces/default/events\": dial tcp 172.31.28.205:6443: connect: connection refused" event="&Event{ObjectMeta:{ip-172-31-28-205.1843426ad810e80e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-28-205,UID:ip-172-31-28-205,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-28-205,},FirstTimestamp:2025-05-27 02:48:27.008837646 +0000 UTC m=+1.755754198,LastTimestamp:2025-05-27 02:48:27.008837646 +0000 UTC m=+1.755754198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-28-205,}" May 27 02:48:27.018649 kubelet[2897]: I0527 02:48:27.018551 2897 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 27 02:48:27.021799 kubelet[2897]: I0527 02:48:27.019859 2897 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 02:48:27.021799 kubelet[2897]: I0527 02:48:27.020303 2897 server.go:479] "Adding debug handlers to kubelet server" May 27 02:48:27.022312 kubelet[2897]: I0527 02:48:27.022237 2897 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 02:48:27.022724 kubelet[2897]: I0527 02:48:27.022695 2897 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 02:48:27.027065 kubelet[2897]: I0527 02:48:27.027027 2897 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 02:48:27.031676 kubelet[2897]: E0527 02:48:27.031637 2897 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ip-172-31-28-205\" not found" May 27 02:48:27.032003 kubelet[2897]: I0527 02:48:27.031980 
2897 volume_manager.go:297] "Starting Kubelet Volume Manager" May 27 02:48:27.032635 kubelet[2897]: I0527 02:48:27.032516 2897 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 27 02:48:27.032847 kubelet[2897]: I0527 02:48:27.032828 2897 reconciler.go:26] "Reconciler: start to sync state" May 27 02:48:27.034037 kubelet[2897]: E0527 02:48:27.033893 2897 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.28.205:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-28-205?timeout=10s\": dial tcp 172.31.28.205:6443: connect: connection refused" interval="200ms" May 27 02:48:27.034361 kubelet[2897]: W0527 02:48:27.034302 2897 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.28.205:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.28.205:6443: connect: connection refused May 27 02:48:27.034577 kubelet[2897]: E0527 02:48:27.034516 2897 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.31.28.205:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.28.205:6443: connect: connection refused" logger="UnhandledError" May 27 02:48:27.035595 kubelet[2897]: I0527 02:48:27.035558 2897 factory.go:221] Registration of the systemd container factory successfully May 27 02:48:27.036091 kubelet[2897]: I0527 02:48:27.036001 2897 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 02:48:27.037917 kubelet[2897]: E0527 02:48:27.037883 2897 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 27 02:48:27.038316 kubelet[2897]: I0527 02:48:27.038291 2897 factory.go:221] Registration of the containerd container factory successfully May 27 02:48:27.053021 kubelet[2897]: I0527 02:48:27.052945 2897 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 27 02:48:27.055174 kubelet[2897]: I0527 02:48:27.055116 2897 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 27 02:48:27.055321 kubelet[2897]: I0527 02:48:27.055192 2897 status_manager.go:227] "Starting to sync pod status with apiserver" May 27 02:48:27.055321 kubelet[2897]: I0527 02:48:27.055229 2897 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
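While the API server at 172.31.28.205:6443 is still refusing connections, the lease controller logs "Failed to ensure lease exists, will retry" with an interval that doubles on each failure: 200ms here, then 400ms and 800ms further down. A minimal sketch of that doubling-retry pattern; the starting interval matches the log, while the cap and attempt count are placeholders chosen for illustration:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// ensureLease stands in for the real API call; here it always fails, the way
// the calls above do while the API server is unreachable.
func ensureLease() error { return errors.New("connect: connection refused") }

func main() {
	interval := 200 * time.Millisecond // first retry interval seen in the log
	maxInterval := 7 * time.Second     // cap chosen for illustration only
	for attempt := 1; attempt <= 4; attempt++ {
		if err := ensureLease(); err != nil {
			fmt.Printf("attempt %d failed (%v), will retry in %v\n", attempt, err, interval)
			time.Sleep(interval)
			interval *= 2
			if interval > maxInterval {
				interval = maxInterval
			}
			continue
		}
		fmt.Println("lease ensured")
		return
	}
}
```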
May 27 02:48:27.055321 kubelet[2897]: I0527 02:48:27.055244 2897 kubelet.go:2382] "Starting kubelet main sync loop" May 27 02:48:27.055456 kubelet[2897]: E0527 02:48:27.055310 2897 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 02:48:27.071588 kubelet[2897]: W0527 02:48:27.071277 2897 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.28.205:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.28.205:6443: connect: connection refused May 27 02:48:27.071588 kubelet[2897]: E0527 02:48:27.071377 2897 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.31.28.205:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.28.205:6443: connect: connection refused" logger="UnhandledError" May 27 02:48:27.081302 kubelet[2897]: I0527 02:48:27.081269 2897 cpu_manager.go:221] "Starting CPU manager" policy="none" May 27 02:48:27.082013 kubelet[2897]: I0527 02:48:27.081548 2897 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 27 02:48:27.082013 kubelet[2897]: I0527 02:48:27.081581 2897 state_mem.go:36] "Initialized new in-memory state store" May 27 02:48:27.085413 kubelet[2897]: I0527 02:48:27.085385 2897 policy_none.go:49] "None policy: Start" May 27 02:48:27.085563 kubelet[2897]: I0527 02:48:27.085544 2897 memory_manager.go:186] "Starting memorymanager" policy="None" May 27 02:48:27.085657 kubelet[2897]: I0527 02:48:27.085641 2897 state_mem.go:35] "Initializing new in-memory state store" May 27 02:48:27.096157 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 27 02:48:27.117372 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 27 02:48:27.132674 kubelet[2897]: E0527 02:48:27.132627 2897 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ip-172-31-28-205\" not found" May 27 02:48:27.139471 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 27 02:48:27.143219 kubelet[2897]: I0527 02:48:27.142455 2897 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 27 02:48:27.143219 kubelet[2897]: I0527 02:48:27.142738 2897 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 02:48:27.143219 kubelet[2897]: I0527 02:48:27.142757 2897 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 02:48:27.143219 kubelet[2897]: I0527 02:48:27.143139 2897 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 02:48:27.147830 kubelet[2897]: E0527 02:48:27.147796 2897 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" May 27 02:48:27.148046 kubelet[2897]: E0527 02:48:27.147994 2897 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-172-31-28-205\" not found" May 27 02:48:27.180334 systemd[1]: Created slice kubepods-burstable-podabfd19643ff8189c1addc853bacdf831.slice - libcontainer container kubepods-burstable-podabfd19643ff8189c1addc853bacdf831.slice. 
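The kubepods-burstable-pod<uid>.slice unit above is the systemd cgroup the kubelet creates for a static pod in the Burstable QoS class, parented under kubepods-burstable.slice and kubepods.slice; later in the log the kube-proxy pod gets kubepods-besteffort-podb894626e_6c3f_4b1e_9535_2449777e0dd0.slice, with the dashes in its UID escaped to underscores. A sketch of that naming scheme, derived from the unit names in this log rather than from kubelet source:

```go
package main

import (
	"fmt"
	"strings"
)

// podSliceName reproduces the unit names seen in this log: a per-QoS parent
// slice under kubepods.slice plus a "pod<uid>" child with a systemd-escaped UID.
func podSliceName(qos, uid string) string {
	escaped := strings.ReplaceAll(uid, "-", "_") // systemd escapes '-' inside unit names
	if qos == "guaranteed" {
		// Guaranteed pods (none appear in this log) sit directly under kubepods.slice.
		return fmt.Sprintf("kubepods-pod%s.slice", escaped)
	}
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qos, escaped)
}

func main() {
	fmt.Println(podSliceName("burstable", "abfd19643ff8189c1addc853bacdf831"))
	fmt.Println(podSliceName("besteffort", "b894626e-6c3f-4b1e-9535-2449777e0dd0"))
}
```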
May 27 02:48:27.193682 kubelet[2897]: E0527 02:48:27.193318 2897 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-205\" not found" node="ip-172-31-28-205" May 27 02:48:27.198401 systemd[1]: Created slice kubepods-burstable-pode0aaf5f0c9e97457ee767475b70bb115.slice - libcontainer container kubepods-burstable-pode0aaf5f0c9e97457ee767475b70bb115.slice. May 27 02:48:27.208642 kubelet[2897]: E0527 02:48:27.208596 2897 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-205\" not found" node="ip-172-31-28-205" May 27 02:48:27.214989 systemd[1]: Created slice kubepods-burstable-pod09870cd5e8f27cf098ce0a841fe2b08a.slice - libcontainer container kubepods-burstable-pod09870cd5e8f27cf098ce0a841fe2b08a.slice. May 27 02:48:27.219796 kubelet[2897]: E0527 02:48:27.219724 2897 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-205\" not found" node="ip-172-31-28-205" May 27 02:48:27.234851 kubelet[2897]: I0527 02:48:27.234750 2897 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/abfd19643ff8189c1addc853bacdf831-ca-certs\") pod \"kube-apiserver-ip-172-31-28-205\" (UID: \"abfd19643ff8189c1addc853bacdf831\") " pod="kube-system/kube-apiserver-ip-172-31-28-205" May 27 02:48:27.234851 kubelet[2897]: I0527 02:48:27.234839 2897 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e0aaf5f0c9e97457ee767475b70bb115-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-28-205\" (UID: \"e0aaf5f0c9e97457ee767475b70bb115\") " pod="kube-system/kube-controller-manager-ip-172-31-28-205" May 27 02:48:27.235003 kubelet[2897]: I0527 02:48:27.234883 2897 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/09870cd5e8f27cf098ce0a841fe2b08a-kubeconfig\") pod \"kube-scheduler-ip-172-31-28-205\" (UID: \"09870cd5e8f27cf098ce0a841fe2b08a\") " pod="kube-system/kube-scheduler-ip-172-31-28-205" May 27 02:48:27.235003 kubelet[2897]: I0527 02:48:27.234922 2897 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e0aaf5f0c9e97457ee767475b70bb115-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-28-205\" (UID: \"e0aaf5f0c9e97457ee767475b70bb115\") " pod="kube-system/kube-controller-manager-ip-172-31-28-205" May 27 02:48:27.235003 kubelet[2897]: I0527 02:48:27.234962 2897 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/abfd19643ff8189c1addc853bacdf831-k8s-certs\") pod \"kube-apiserver-ip-172-31-28-205\" (UID: \"abfd19643ff8189c1addc853bacdf831\") " pod="kube-system/kube-apiserver-ip-172-31-28-205" May 27 02:48:27.235003 kubelet[2897]: I0527 02:48:27.234998 2897 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/abfd19643ff8189c1addc853bacdf831-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-28-205\" (UID: \"abfd19643ff8189c1addc853bacdf831\") " 
pod="kube-system/kube-apiserver-ip-172-31-28-205" May 27 02:48:27.235214 kubelet[2897]: I0527 02:48:27.235032 2897 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e0aaf5f0c9e97457ee767475b70bb115-ca-certs\") pod \"kube-controller-manager-ip-172-31-28-205\" (UID: \"e0aaf5f0c9e97457ee767475b70bb115\") " pod="kube-system/kube-controller-manager-ip-172-31-28-205" May 27 02:48:27.235214 kubelet[2897]: I0527 02:48:27.235066 2897 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e0aaf5f0c9e97457ee767475b70bb115-k8s-certs\") pod \"kube-controller-manager-ip-172-31-28-205\" (UID: \"e0aaf5f0c9e97457ee767475b70bb115\") " pod="kube-system/kube-controller-manager-ip-172-31-28-205" May 27 02:48:27.235214 kubelet[2897]: I0527 02:48:27.235106 2897 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e0aaf5f0c9e97457ee767475b70bb115-kubeconfig\") pod \"kube-controller-manager-ip-172-31-28-205\" (UID: \"e0aaf5f0c9e97457ee767475b70bb115\") " pod="kube-system/kube-controller-manager-ip-172-31-28-205" May 27 02:48:27.235846 kubelet[2897]: E0527 02:48:27.235801 2897 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.28.205:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-28-205?timeout=10s\": dial tcp 172.31.28.205:6443: connect: connection refused" interval="400ms" May 27 02:48:27.246056 kubelet[2897]: I0527 02:48:27.246006 2897 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-28-205" May 27 02:48:27.246581 kubelet[2897]: E0527 02:48:27.246535 2897 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.28.205:6443/api/v1/nodes\": dial tcp 172.31.28.205:6443: connect: connection refused" node="ip-172-31-28-205" May 27 02:48:27.449658 kubelet[2897]: I0527 02:48:27.449531 2897 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-28-205" May 27 02:48:27.450364 kubelet[2897]: E0527 02:48:27.450059 2897 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.28.205:6443/api/v1/nodes\": dial tcp 172.31.28.205:6443: connect: connection refused" node="ip-172-31-28-205" May 27 02:48:27.495843 containerd[2012]: time="2025-05-27T02:48:27.495470000Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-28-205,Uid:abfd19643ff8189c1addc853bacdf831,Namespace:kube-system,Attempt:0,}" May 27 02:48:27.510649 containerd[2012]: time="2025-05-27T02:48:27.510337856Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-28-205,Uid:e0aaf5f0c9e97457ee767475b70bb115,Namespace:kube-system,Attempt:0,}" May 27 02:48:27.522283 containerd[2012]: time="2025-05-27T02:48:27.522214280Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-28-205,Uid:09870cd5e8f27cf098ce0a841fe2b08a,Namespace:kube-system,Attempt:0,}" May 27 02:48:27.559243 containerd[2012]: time="2025-05-27T02:48:27.559046913Z" level=info msg="connecting to shim d1869da27d30876a5ce41bfae42e09e653f47f5adfd692c47788ca8e6176a3b7" address="unix:///run/containerd/s/27958709c7e7befc82fb06ef87538e83a5801e2b43e8db586b5ea1a08c8c87da" namespace=k8s.io protocol=ttrpc version=3 May 27 02:48:27.602983 
containerd[2012]: time="2025-05-27T02:48:27.602442405Z" level=info msg="connecting to shim 9fcde17a0530e58e75bde8018707680f11012cdb1e30230cfe2c034aec0f74fb" address="unix:///run/containerd/s/cb3a11e1cb7d7551724d9472b218c3ba88a0ee1a5a28aa0d1bf79e1b4bd84483" namespace=k8s.io protocol=ttrpc version=3 May 27 02:48:27.636888 kubelet[2897]: E0527 02:48:27.636839 2897 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.28.205:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-28-205?timeout=10s\": dial tcp 172.31.28.205:6443: connect: connection refused" interval="800ms" May 27 02:48:27.664296 systemd[1]: Started cri-containerd-d1869da27d30876a5ce41bfae42e09e653f47f5adfd692c47788ca8e6176a3b7.scope - libcontainer container d1869da27d30876a5ce41bfae42e09e653f47f5adfd692c47788ca8e6176a3b7. May 27 02:48:27.698407 containerd[2012]: time="2025-05-27T02:48:27.698337969Z" level=info msg="connecting to shim bb2c63e27bb233ad89f478c2d6b6386efd2c9105d87215da3ca9908d48e6394d" address="unix:///run/containerd/s/65cd7e68e4433789da97c612f867eea37d17e3b9ccf917229870c2175b45eeaa" namespace=k8s.io protocol=ttrpc version=3 May 27 02:48:27.701368 systemd[1]: Started cri-containerd-9fcde17a0530e58e75bde8018707680f11012cdb1e30230cfe2c034aec0f74fb.scope - libcontainer container 9fcde17a0530e58e75bde8018707680f11012cdb1e30230cfe2c034aec0f74fb. May 27 02:48:27.756107 systemd[1]: Started cri-containerd-bb2c63e27bb233ad89f478c2d6b6386efd2c9105d87215da3ca9908d48e6394d.scope - libcontainer container bb2c63e27bb233ad89f478c2d6b6386efd2c9105d87215da3ca9908d48e6394d. May 27 02:48:27.824800 containerd[2012]: time="2025-05-27T02:48:27.824360482Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-28-205,Uid:abfd19643ff8189c1addc853bacdf831,Namespace:kube-system,Attempt:0,} returns sandbox id \"d1869da27d30876a5ce41bfae42e09e653f47f5adfd692c47788ca8e6176a3b7\"" May 27 02:48:27.842044 containerd[2012]: time="2025-05-27T02:48:27.841977250Z" level=info msg="CreateContainer within sandbox \"d1869da27d30876a5ce41bfae42e09e653f47f5adfd692c47788ca8e6176a3b7\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 27 02:48:27.854794 kubelet[2897]: I0527 02:48:27.854084 2897 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-28-205" May 27 02:48:27.854794 kubelet[2897]: E0527 02:48:27.854590 2897 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://172.31.28.205:6443/api/v1/nodes\": dial tcp 172.31.28.205:6443: connect: connection refused" node="ip-172-31-28-205" May 27 02:48:27.882127 containerd[2012]: time="2025-05-27T02:48:27.881525458Z" level=info msg="Container be0cf02e06ae195368516432314abd1165b2dec1e50b3b8f1e2ca5e853b199f2: CDI devices from CRI Config.CDIDevices: []" May 27 02:48:27.882434 containerd[2012]: time="2025-05-27T02:48:27.882215206Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-28-205,Uid:e0aaf5f0c9e97457ee767475b70bb115,Namespace:kube-system,Attempt:0,} returns sandbox id \"9fcde17a0530e58e75bde8018707680f11012cdb1e30230cfe2c034aec0f74fb\"" May 27 02:48:27.886670 containerd[2012]: time="2025-05-27T02:48:27.885437578Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-28-205,Uid:09870cd5e8f27cf098ce0a841fe2b08a,Namespace:kube-system,Attempt:0,} returns sandbox id \"bb2c63e27bb233ad89f478c2d6b6386efd2c9105d87215da3ca9908d48e6394d\"" May 27 02:48:27.888190 containerd[2012]: 
time="2025-05-27T02:48:27.888128386Z" level=info msg="CreateContainer within sandbox \"9fcde17a0530e58e75bde8018707680f11012cdb1e30230cfe2c034aec0f74fb\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 27 02:48:27.892014 containerd[2012]: time="2025-05-27T02:48:27.891956986Z" level=info msg="CreateContainer within sandbox \"bb2c63e27bb233ad89f478c2d6b6386efd2c9105d87215da3ca9908d48e6394d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 27 02:48:27.901074 containerd[2012]: time="2025-05-27T02:48:27.900931918Z" level=info msg="CreateContainer within sandbox \"d1869da27d30876a5ce41bfae42e09e653f47f5adfd692c47788ca8e6176a3b7\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"be0cf02e06ae195368516432314abd1165b2dec1e50b3b8f1e2ca5e853b199f2\"" May 27 02:48:27.902846 containerd[2012]: time="2025-05-27T02:48:27.902802718Z" level=info msg="StartContainer for \"be0cf02e06ae195368516432314abd1165b2dec1e50b3b8f1e2ca5e853b199f2\"" May 27 02:48:27.905223 containerd[2012]: time="2025-05-27T02:48:27.905170954Z" level=info msg="connecting to shim be0cf02e06ae195368516432314abd1165b2dec1e50b3b8f1e2ca5e853b199f2" address="unix:///run/containerd/s/27958709c7e7befc82fb06ef87538e83a5801e2b43e8db586b5ea1a08c8c87da" protocol=ttrpc version=3 May 27 02:48:27.919088 containerd[2012]: time="2025-05-27T02:48:27.919012474Z" level=info msg="Container 378c1b678befa560897250fcaeb60cd38293ced5dc93e1d61be601f642cdfadb: CDI devices from CRI Config.CDIDevices: []" May 27 02:48:27.927504 containerd[2012]: time="2025-05-27T02:48:27.927232931Z" level=info msg="Container 184d38ab91666e98e3b76b3aa42ffb592fc29aa6320d2365b662fa7dd589adaf: CDI devices from CRI Config.CDIDevices: []" May 27 02:48:27.939362 containerd[2012]: time="2025-05-27T02:48:27.939292187Z" level=info msg="CreateContainer within sandbox \"9fcde17a0530e58e75bde8018707680f11012cdb1e30230cfe2c034aec0f74fb\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"378c1b678befa560897250fcaeb60cd38293ced5dc93e1d61be601f642cdfadb\"" May 27 02:48:27.941977 containerd[2012]: time="2025-05-27T02:48:27.940998767Z" level=info msg="StartContainer for \"378c1b678befa560897250fcaeb60cd38293ced5dc93e1d61be601f642cdfadb\"" May 27 02:48:27.950573 containerd[2012]: time="2025-05-27T02:48:27.950522987Z" level=info msg="connecting to shim 378c1b678befa560897250fcaeb60cd38293ced5dc93e1d61be601f642cdfadb" address="unix:///run/containerd/s/cb3a11e1cb7d7551724d9472b218c3ba88a0ee1a5a28aa0d1bf79e1b4bd84483" protocol=ttrpc version=3 May 27 02:48:27.952385 systemd[1]: Started cri-containerd-be0cf02e06ae195368516432314abd1165b2dec1e50b3b8f1e2ca5e853b199f2.scope - libcontainer container be0cf02e06ae195368516432314abd1165b2dec1e50b3b8f1e2ca5e853b199f2. 
May 27 02:48:27.956947 containerd[2012]: time="2025-05-27T02:48:27.955015283Z" level=info msg="CreateContainer within sandbox \"bb2c63e27bb233ad89f478c2d6b6386efd2c9105d87215da3ca9908d48e6394d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"184d38ab91666e98e3b76b3aa42ffb592fc29aa6320d2365b662fa7dd589adaf\"" May 27 02:48:27.956947 containerd[2012]: time="2025-05-27T02:48:27.956035811Z" level=info msg="StartContainer for \"184d38ab91666e98e3b76b3aa42ffb592fc29aa6320d2365b662fa7dd589adaf\"" May 27 02:48:27.960463 containerd[2012]: time="2025-05-27T02:48:27.960342803Z" level=info msg="connecting to shim 184d38ab91666e98e3b76b3aa42ffb592fc29aa6320d2365b662fa7dd589adaf" address="unix:///run/containerd/s/65cd7e68e4433789da97c612f867eea37d17e3b9ccf917229870c2175b45eeaa" protocol=ttrpc version=3 May 27 02:48:28.019014 systemd[1]: Started cri-containerd-378c1b678befa560897250fcaeb60cd38293ced5dc93e1d61be601f642cdfadb.scope - libcontainer container 378c1b678befa560897250fcaeb60cd38293ced5dc93e1d61be601f642cdfadb. May 27 02:48:28.031144 systemd[1]: Started cri-containerd-184d38ab91666e98e3b76b3aa42ffb592fc29aa6320d2365b662fa7dd589adaf.scope - libcontainer container 184d38ab91666e98e3b76b3aa42ffb592fc29aa6320d2365b662fa7dd589adaf. May 27 02:48:28.118949 containerd[2012]: time="2025-05-27T02:48:28.118865587Z" level=info msg="StartContainer for \"be0cf02e06ae195368516432314abd1165b2dec1e50b3b8f1e2ca5e853b199f2\" returns successfully" May 27 02:48:28.205510 containerd[2012]: time="2025-05-27T02:48:28.204484700Z" level=info msg="StartContainer for \"378c1b678befa560897250fcaeb60cd38293ced5dc93e1d61be601f642cdfadb\" returns successfully" May 27 02:48:28.214346 containerd[2012]: time="2025-05-27T02:48:28.214286768Z" level=info msg="StartContainer for \"184d38ab91666e98e3b76b3aa42ffb592fc29aa6320d2365b662fa7dd589adaf\" returns successfully" May 27 02:48:28.658733 kubelet[2897]: I0527 02:48:28.658688 2897 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-28-205" May 27 02:48:29.109349 kubelet[2897]: E0527 02:48:29.109040 2897 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-205\" not found" node="ip-172-31-28-205" May 27 02:48:29.114745 kubelet[2897]: E0527 02:48:29.114698 2897 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-205\" not found" node="ip-172-31-28-205" May 27 02:48:29.121791 kubelet[2897]: E0527 02:48:29.121629 2897 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-205\" not found" node="ip-172-31-28-205" May 27 02:48:30.125799 kubelet[2897]: E0527 02:48:30.125730 2897 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-205\" not found" node="ip-172-31-28-205" May 27 02:48:30.128553 kubelet[2897]: E0527 02:48:30.128481 2897 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-205\" not found" node="ip-172-31-28-205" May 27 02:48:30.128973 kubelet[2897]: E0527 02:48:30.128936 2897 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-205\" not found" node="ip-172-31-28-205" May 27 02:48:30.617417 systemd[1]: systemd-hostnamed.service: Deactivated successfully. 
May 27 02:48:31.126636 kubelet[2897]: E0527 02:48:31.126582 2897 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-205\" not found" node="ip-172-31-28-205" May 27 02:48:31.127187 kubelet[2897]: E0527 02:48:31.127135 2897 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-205\" not found" node="ip-172-31-28-205" May 27 02:48:31.130476 kubelet[2897]: E0527 02:48:31.130430 2897 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-205\" not found" node="ip-172-31-28-205" May 27 02:48:31.473885 kubelet[2897]: E0527 02:48:31.473355 2897 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-172-31-28-205\" not found" node="ip-172-31-28-205" May 27 02:48:31.551708 kubelet[2897]: I0527 02:48:31.551390 2897 kubelet_node_status.go:78] "Successfully registered node" node="ip-172-31-28-205" May 27 02:48:31.634484 kubelet[2897]: I0527 02:48:31.634412 2897 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-28-205" May 27 02:48:31.675965 kubelet[2897]: E0527 02:48:31.675903 2897 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ip-172-31-28-205\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ip-172-31-28-205" May 27 02:48:31.675965 kubelet[2897]: I0527 02:48:31.675952 2897 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-28-205" May 27 02:48:31.683353 kubelet[2897]: E0527 02:48:31.683299 2897 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ip-172-31-28-205\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ip-172-31-28-205" May 27 02:48:31.683353 kubelet[2897]: I0527 02:48:31.683346 2897 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-28-205" May 27 02:48:31.706063 kubelet[2897]: E0527 02:48:31.705969 2897 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ip-172-31-28-205\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ip-172-31-28-205" May 27 02:48:32.009716 kubelet[2897]: I0527 02:48:32.009585 2897 apiserver.go:52] "Watching apiserver" May 27 02:48:32.033787 kubelet[2897]: I0527 02:48:32.033656 2897 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 27 02:48:32.526978 kubelet[2897]: I0527 02:48:32.526525 2897 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-28-205" May 27 02:48:33.723804 systemd[1]: Reload requested from client PID 3177 ('systemctl') (unit session-7.scope)... May 27 02:48:33.723848 systemd[1]: Reloading... May 27 02:48:33.934836 zram_generator::config[3230]: No configuration found. May 27 02:48:34.105997 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 02:48:34.433946 systemd[1]: Reloading finished in 709 ms. May 27 02:48:34.489975 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... 
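Once the API server comes up, registration succeeds ("Successfully registered node") even though the first mirror-pod attempts above are rejected until the system-node-critical PriorityClass exists. A small client-go sketch for confirming the Node object from an admin credential; the kubeconfig path is an assumption based on the /etc/kubernetes layout visible in this log:

```go
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Admin kubeconfig path assumed (typical kubeadm-style layout).
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf")
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}

	node, err := cs.CoreV1().Nodes().Get(context.Background(), "ip-172-31-28-205", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	for _, c := range node.Status.Conditions {
		// Ready is expected to flip to True only after a CNI config is in place.
		fmt.Printf("%-20s %s\n", c.Type, c.Status)
	}
}
```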
May 27 02:48:34.510139 systemd[1]: kubelet.service: Deactivated successfully. May 27 02:48:34.510749 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 02:48:34.510926 systemd[1]: kubelet.service: Consumed 2.443s CPU time, 125.9M memory peak. May 27 02:48:34.518346 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 02:48:34.904712 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 02:48:34.927186 (kubelet)[3281]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 02:48:35.021944 kubelet[3281]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 02:48:35.021944 kubelet[3281]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 27 02:48:35.021944 kubelet[3281]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 02:48:35.022453 kubelet[3281]: I0527 02:48:35.022058 3281 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 02:48:35.047395 kubelet[3281]: I0527 02:48:35.047302 3281 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" May 27 02:48:35.047395 kubelet[3281]: I0527 02:48:35.047356 3281 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 02:48:35.048410 kubelet[3281]: I0527 02:48:35.048359 3281 server.go:954] "Client rotation is on, will bootstrap in background" May 27 02:48:35.053930 kubelet[3281]: I0527 02:48:35.053382 3281 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 27 02:48:35.064265 kubelet[3281]: I0527 02:48:35.063428 3281 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 02:48:35.077404 kubelet[3281]: I0527 02:48:35.077351 3281 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 02:48:35.082579 kubelet[3281]: I0527 02:48:35.082516 3281 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 27 02:48:35.082994 kubelet[3281]: I0527 02:48:35.082932 3281 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 02:48:35.083287 kubelet[3281]: I0527 02:48:35.082989 3281 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-28-205","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 02:48:35.084483 kubelet[3281]: I0527 02:48:35.083301 3281 topology_manager.go:138] "Creating topology manager with none policy" May 27 02:48:35.084483 kubelet[3281]: I0527 02:48:35.083321 3281 container_manager_linux.go:304] "Creating device plugin manager" May 27 02:48:35.084483 kubelet[3281]: I0527 02:48:35.083399 3281 state_mem.go:36] "Initialized new in-memory state store" May 27 02:48:35.084483 kubelet[3281]: I0527 02:48:35.083694 3281 kubelet.go:446] "Attempting to sync node with API server" May 27 02:48:35.084877 kubelet[3281]: I0527 02:48:35.084547 3281 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 02:48:35.084877 kubelet[3281]: I0527 02:48:35.084605 3281 kubelet.go:352] "Adding apiserver pod source" May 27 02:48:35.084877 kubelet[3281]: I0527 02:48:35.084638 3281 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 02:48:35.088728 kubelet[3281]: I0527 02:48:35.088561 3281 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 02:48:35.091073 kubelet[3281]: I0527 02:48:35.089653 3281 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 27 02:48:35.091330 kubelet[3281]: I0527 02:48:35.091304 3281 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 27 02:48:35.091472 kubelet[3281]: I0527 02:48:35.091455 3281 server.go:1287] "Started kubelet" May 27 02:48:35.098885 kubelet[3281]: I0527 02:48:35.098846 3281 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 02:48:35.108432 kubelet[3281]: I0527 02:48:35.108359 3281 server.go:169] "Starting to 
listen" address="0.0.0.0" port=10250 May 27 02:48:35.130103 kubelet[3281]: I0527 02:48:35.130061 3281 server.go:479] "Adding debug handlers to kubelet server" May 27 02:48:35.134504 kubelet[3281]: I0527 02:48:35.134395 3281 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 02:48:35.147707 kubelet[3281]: I0527 02:48:35.147246 3281 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 02:48:35.147707 kubelet[3281]: I0527 02:48:35.143090 3281 volume_manager.go:297] "Starting Kubelet Volume Manager" May 27 02:48:35.154422 kubelet[3281]: E0527 02:48:35.154360 3281 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ip-172-31-28-205\" not found" May 27 02:48:35.174358 kubelet[3281]: I0527 02:48:35.137450 3281 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 02:48:35.174358 kubelet[3281]: I0527 02:48:35.143115 3281 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 27 02:48:35.175190 kubelet[3281]: I0527 02:48:35.174480 3281 reconciler.go:26] "Reconciler: start to sync state" May 27 02:48:35.180184 kubelet[3281]: I0527 02:48:35.179526 3281 factory.go:221] Registration of the systemd container factory successfully May 27 02:48:35.180184 kubelet[3281]: I0527 02:48:35.179682 3281 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 02:48:35.197752 kubelet[3281]: I0527 02:48:35.197609 3281 factory.go:221] Registration of the containerd container factory successfully May 27 02:48:35.220710 kubelet[3281]: E0527 02:48:35.220362 3281 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 27 02:48:35.233803 kubelet[3281]: I0527 02:48:35.232941 3281 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 27 02:48:35.252204 kubelet[3281]: I0527 02:48:35.251677 3281 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 27 02:48:35.253645 kubelet[3281]: I0527 02:48:35.253175 3281 status_manager.go:227] "Starting to sync pod status with apiserver" May 27 02:48:35.254066 kubelet[3281]: I0527 02:48:35.253914 3281 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
May 27 02:48:35.254066 kubelet[3281]: I0527 02:48:35.254004 3281 kubelet.go:2382] "Starting kubelet main sync loop" May 27 02:48:35.257864 kubelet[3281]: E0527 02:48:35.257363 3281 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 02:48:35.361216 kubelet[3281]: I0527 02:48:35.360484 3281 cpu_manager.go:221] "Starting CPU manager" policy="none" May 27 02:48:35.361216 kubelet[3281]: I0527 02:48:35.360516 3281 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 27 02:48:35.361216 kubelet[3281]: I0527 02:48:35.360549 3281 state_mem.go:36] "Initialized new in-memory state store" May 27 02:48:35.362419 kubelet[3281]: I0527 02:48:35.361935 3281 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 27 02:48:35.362419 kubelet[3281]: I0527 02:48:35.361971 3281 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 27 02:48:35.362419 kubelet[3281]: I0527 02:48:35.362010 3281 policy_none.go:49] "None policy: Start" May 27 02:48:35.362419 kubelet[3281]: I0527 02:48:35.362029 3281 memory_manager.go:186] "Starting memorymanager" policy="None" May 27 02:48:35.362419 kubelet[3281]: I0527 02:48:35.362054 3281 state_mem.go:35] "Initializing new in-memory state store" May 27 02:48:35.362419 kubelet[3281]: I0527 02:48:35.362262 3281 state_mem.go:75] "Updated machine memory state" May 27 02:48:35.363789 kubelet[3281]: E0527 02:48:35.363724 3281 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 27 02:48:35.375803 kubelet[3281]: I0527 02:48:35.375422 3281 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 27 02:48:35.376412 kubelet[3281]: I0527 02:48:35.376370 3281 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 02:48:35.376605 kubelet[3281]: I0527 02:48:35.376406 3281 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 02:48:35.379001 kubelet[3281]: I0527 02:48:35.378954 3281 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 02:48:35.381883 kubelet[3281]: E0527 02:48:35.381817 3281 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" May 27 02:48:35.503885 kubelet[3281]: I0527 02:48:35.503838 3281 kubelet_node_status.go:75] "Attempting to register node" node="ip-172-31-28-205" May 27 02:48:35.526208 kubelet[3281]: I0527 02:48:35.526157 3281 kubelet_node_status.go:124] "Node was previously registered" node="ip-172-31-28-205" May 27 02:48:35.526362 kubelet[3281]: I0527 02:48:35.526274 3281 kubelet_node_status.go:78] "Successfully registered node" node="ip-172-31-28-205" May 27 02:48:35.565986 kubelet[3281]: I0527 02:48:35.565887 3281 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-28-205" May 27 02:48:35.566585 kubelet[3281]: I0527 02:48:35.566496 3281 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-28-205" May 27 02:48:35.567498 kubelet[3281]: I0527 02:48:35.567456 3281 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-28-205" May 27 02:48:35.576995 kubelet[3281]: I0527 02:48:35.576930 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e0aaf5f0c9e97457ee767475b70bb115-ca-certs\") pod \"kube-controller-manager-ip-172-31-28-205\" (UID: \"e0aaf5f0c9e97457ee767475b70bb115\") " pod="kube-system/kube-controller-manager-ip-172-31-28-205" May 27 02:48:35.577865 kubelet[3281]: I0527 02:48:35.577009 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e0aaf5f0c9e97457ee767475b70bb115-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-28-205\" (UID: \"e0aaf5f0c9e97457ee767475b70bb115\") " pod="kube-system/kube-controller-manager-ip-172-31-28-205" May 27 02:48:35.577865 kubelet[3281]: I0527 02:48:35.577057 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e0aaf5f0c9e97457ee767475b70bb115-kubeconfig\") pod \"kube-controller-manager-ip-172-31-28-205\" (UID: \"e0aaf5f0c9e97457ee767475b70bb115\") " pod="kube-system/kube-controller-manager-ip-172-31-28-205" May 27 02:48:35.577865 kubelet[3281]: I0527 02:48:35.577113 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e0aaf5f0c9e97457ee767475b70bb115-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-28-205\" (UID: \"e0aaf5f0c9e97457ee767475b70bb115\") " pod="kube-system/kube-controller-manager-ip-172-31-28-205" May 27 02:48:35.577865 kubelet[3281]: I0527 02:48:35.577175 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e0aaf5f0c9e97457ee767475b70bb115-k8s-certs\") pod \"kube-controller-manager-ip-172-31-28-205\" (UID: \"e0aaf5f0c9e97457ee767475b70bb115\") " pod="kube-system/kube-controller-manager-ip-172-31-28-205" May 27 02:48:35.577865 kubelet[3281]: I0527 02:48:35.577244 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/09870cd5e8f27cf098ce0a841fe2b08a-kubeconfig\") pod \"kube-scheduler-ip-172-31-28-205\" (UID: \"09870cd5e8f27cf098ce0a841fe2b08a\") " pod="kube-system/kube-scheduler-ip-172-31-28-205" May 27 02:48:35.578652 kubelet[3281]: 
I0527 02:48:35.578203 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/abfd19643ff8189c1addc853bacdf831-ca-certs\") pod \"kube-apiserver-ip-172-31-28-205\" (UID: \"abfd19643ff8189c1addc853bacdf831\") " pod="kube-system/kube-apiserver-ip-172-31-28-205" May 27 02:48:35.578652 kubelet[3281]: I0527 02:48:35.578311 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/abfd19643ff8189c1addc853bacdf831-k8s-certs\") pod \"kube-apiserver-ip-172-31-28-205\" (UID: \"abfd19643ff8189c1addc853bacdf831\") " pod="kube-system/kube-apiserver-ip-172-31-28-205" May 27 02:48:35.578652 kubelet[3281]: I0527 02:48:35.578376 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/abfd19643ff8189c1addc853bacdf831-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-28-205\" (UID: \"abfd19643ff8189c1addc853bacdf831\") " pod="kube-system/kube-apiserver-ip-172-31-28-205" May 27 02:48:35.587221 kubelet[3281]: E0527 02:48:35.587177 3281 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ip-172-31-28-205\" already exists" pod="kube-system/kube-apiserver-ip-172-31-28-205" May 27 02:48:36.087937 kubelet[3281]: I0527 02:48:36.087865 3281 apiserver.go:52] "Watching apiserver" May 27 02:48:36.175443 kubelet[3281]: I0527 02:48:36.175369 3281 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 27 02:48:36.308479 kubelet[3281]: I0527 02:48:36.308430 3281 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-28-205" May 27 02:48:36.324575 kubelet[3281]: E0527 02:48:36.324264 3281 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ip-172-31-28-205\" already exists" pod="kube-system/kube-controller-manager-ip-172-31-28-205" May 27 02:48:36.406116 kubelet[3281]: I0527 02:48:36.405944 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ip-172-31-28-205" podStartSLOduration=1.405922157 podStartE2EDuration="1.405922157s" podCreationTimestamp="2025-05-27 02:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 02:48:36.35411086 +0000 UTC m=+1.418983124" watchObservedRunningTime="2025-05-27 02:48:36.405922157 +0000 UTC m=+1.470794421" May 27 02:48:36.446753 kubelet[3281]: I0527 02:48:36.446670 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ip-172-31-28-205" podStartSLOduration=1.446649989 podStartE2EDuration="1.446649989s" podCreationTimestamp="2025-05-27 02:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 02:48:36.405863873 +0000 UTC m=+1.470736173" watchObservedRunningTime="2025-05-27 02:48:36.446649989 +0000 UTC m=+1.511522241" May 27 02:48:36.491793 kubelet[3281]: I0527 02:48:36.491627 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ip-172-31-28-205" podStartSLOduration=4.491604161 podStartE2EDuration="4.491604161s" podCreationTimestamp="2025-05-27 02:48:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 02:48:36.447631349 +0000 UTC m=+1.512503649" watchObservedRunningTime="2025-05-27 02:48:36.491604161 +0000 UTC m=+1.556476425" May 27 02:48:38.904781 kubelet[3281]: I0527 02:48:38.904717 3281 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 27 02:48:38.905847 kubelet[3281]: I0527 02:48:38.905648 3281 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 27 02:48:38.905961 containerd[2012]: time="2025-05-27T02:48:38.905316873Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 27 02:48:39.020553 systemd[1]: Created slice kubepods-besteffort-podb894626e_6c3f_4b1e_9535_2449777e0dd0.slice - libcontainer container kubepods-besteffort-podb894626e_6c3f_4b1e_9535_2449777e0dd0.slice. May 27 02:48:39.101778 kubelet[3281]: I0527 02:48:39.101702 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtmkw\" (UniqueName: \"kubernetes.io/projected/b894626e-6c3f-4b1e-9535-2449777e0dd0-kube-api-access-xtmkw\") pod \"kube-proxy-xkch8\" (UID: \"b894626e-6c3f-4b1e-9535-2449777e0dd0\") " pod="kube-system/kube-proxy-xkch8" May 27 02:48:39.101930 kubelet[3281]: I0527 02:48:39.101810 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b894626e-6c3f-4b1e-9535-2449777e0dd0-xtables-lock\") pod \"kube-proxy-xkch8\" (UID: \"b894626e-6c3f-4b1e-9535-2449777e0dd0\") " pod="kube-system/kube-proxy-xkch8" May 27 02:48:39.101930 kubelet[3281]: I0527 02:48:39.101854 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/b894626e-6c3f-4b1e-9535-2449777e0dd0-kube-proxy\") pod \"kube-proxy-xkch8\" (UID: \"b894626e-6c3f-4b1e-9535-2449777e0dd0\") " pod="kube-system/kube-proxy-xkch8" May 27 02:48:39.101930 kubelet[3281]: I0527 02:48:39.101893 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b894626e-6c3f-4b1e-9535-2449777e0dd0-lib-modules\") pod \"kube-proxy-xkch8\" (UID: \"b894626e-6c3f-4b1e-9535-2449777e0dd0\") " pod="kube-system/kube-proxy-xkch8" May 27 02:48:39.213799 kubelet[3281]: E0527 02:48:39.213624 3281 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found May 27 02:48:39.213799 kubelet[3281]: E0527 02:48:39.213680 3281 projected.go:194] Error preparing data for projected volume kube-api-access-xtmkw for pod kube-system/kube-proxy-xkch8: configmap "kube-root-ca.crt" not found May 27 02:48:39.214263 kubelet[3281]: E0527 02:48:39.214219 3281 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b894626e-6c3f-4b1e-9535-2449777e0dd0-kube-api-access-xtmkw podName:b894626e-6c3f-4b1e-9535-2449777e0dd0 nodeName:}" failed. No retries permitted until 2025-05-27 02:48:39.714162555 +0000 UTC m=+4.779034819 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xtmkw" (UniqueName: "kubernetes.io/projected/b894626e-6c3f-4b1e-9535-2449777e0dd0-kube-api-access-xtmkw") pod "kube-proxy-xkch8" (UID: "b894626e-6c3f-4b1e-9535-2449777e0dd0") : configmap "kube-root-ca.crt" not found May 27 02:48:39.806167 kubelet[3281]: E0527 02:48:39.806093 3281 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found May 27 02:48:39.806167 kubelet[3281]: E0527 02:48:39.806138 3281 projected.go:194] Error preparing data for projected volume kube-api-access-xtmkw for pod kube-system/kube-proxy-xkch8: configmap "kube-root-ca.crt" not found May 27 02:48:39.806615 kubelet[3281]: E0527 02:48:39.806228 3281 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b894626e-6c3f-4b1e-9535-2449777e0dd0-kube-api-access-xtmkw podName:b894626e-6c3f-4b1e-9535-2449777e0dd0 nodeName:}" failed. No retries permitted until 2025-05-27 02:48:40.806200246 +0000 UTC m=+5.871072510 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xtmkw" (UniqueName: "kubernetes.io/projected/b894626e-6c3f-4b1e-9535-2449777e0dd0-kube-api-access-xtmkw") pod "kube-proxy-xkch8" (UID: "b894626e-6c3f-4b1e-9535-2449777e0dd0") : configmap "kube-root-ca.crt" not found May 27 02:48:39.997573 kubelet[3281]: W0527 02:48:39.997086 3281 reflector.go:569] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ip-172-31-28-205" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ip-172-31-28-205' and this object May 27 02:48:39.997573 kubelet[3281]: E0527 02:48:39.997156 3281 reflector.go:166] "Unhandled Error" err="object-\"tigera-operator\"/\"kubernetes-services-endpoint\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kubernetes-services-endpoint\" is forbidden: User \"system:node:ip-172-31-28-205\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'ip-172-31-28-205' and this object" logger="UnhandledError" May 27 02:48:40.000035 kubelet[3281]: I0527 02:48:39.998848 3281 status_manager.go:890] "Failed to get status for pod" podUID="f85d0e03-f590-48f1-ba05-542041e1ebc3" pod="tigera-operator/tigera-operator-844669ff44-spfvs" err="pods \"tigera-operator-844669ff44-spfvs\" is forbidden: User \"system:node:ip-172-31-28-205\" cannot get resource \"pods\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'ip-172-31-28-205' and this object" May 27 02:48:40.007870 kubelet[3281]: I0527 02:48:40.007743 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f85d0e03-f590-48f1-ba05-542041e1ebc3-var-lib-calico\") pod \"tigera-operator-844669ff44-spfvs\" (UID: \"f85d0e03-f590-48f1-ba05-542041e1ebc3\") " pod="tigera-operator/tigera-operator-844669ff44-spfvs" May 27 02:48:40.007870 kubelet[3281]: I0527 02:48:40.007837 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cztfv\" (UniqueName: \"kubernetes.io/projected/f85d0e03-f590-48f1-ba05-542041e1ebc3-kube-api-access-cztfv\") pod \"tigera-operator-844669ff44-spfvs\" (UID: \"f85d0e03-f590-48f1-ba05-542041e1ebc3\") " 
pod="tigera-operator/tigera-operator-844669ff44-spfvs" May 27 02:48:40.011508 systemd[1]: Created slice kubepods-besteffort-podf85d0e03_f590_48f1_ba05_542041e1ebc3.slice - libcontainer container kubepods-besteffort-podf85d0e03_f590_48f1_ba05_542041e1ebc3.slice. May 27 02:48:40.318048 containerd[2012]: time="2025-05-27T02:48:40.317920472Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-spfvs,Uid:f85d0e03-f590-48f1-ba05-542041e1ebc3,Namespace:tigera-operator,Attempt:0,}" May 27 02:48:40.360493 containerd[2012]: time="2025-05-27T02:48:40.360413780Z" level=info msg="connecting to shim 9401e3567d9e274441b0c21a1ad5c87ad2d604bb170ac0ec0649dcb765ad19da" address="unix:///run/containerd/s/ce32d09926bbaaee38fb866e55ebeaf036e7cd4c414b369fe43cb25b036f6f6d" namespace=k8s.io protocol=ttrpc version=3 May 27 02:48:40.404062 systemd[1]: Started cri-containerd-9401e3567d9e274441b0c21a1ad5c87ad2d604bb170ac0ec0649dcb765ad19da.scope - libcontainer container 9401e3567d9e274441b0c21a1ad5c87ad2d604bb170ac0ec0649dcb765ad19da. May 27 02:48:40.475840 containerd[2012]: time="2025-05-27T02:48:40.475706301Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-spfvs,Uid:f85d0e03-f590-48f1-ba05-542041e1ebc3,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"9401e3567d9e274441b0c21a1ad5c87ad2d604bb170ac0ec0649dcb765ad19da\"" May 27 02:48:40.481694 containerd[2012]: time="2025-05-27T02:48:40.481639353Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\"" May 27 02:48:40.833292 containerd[2012]: time="2025-05-27T02:48:40.832924763Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-xkch8,Uid:b894626e-6c3f-4b1e-9535-2449777e0dd0,Namespace:kube-system,Attempt:0,}" May 27 02:48:40.876581 containerd[2012]: time="2025-05-27T02:48:40.876511151Z" level=info msg="connecting to shim b3acfc2410ac19cb43ae70642aadd536ba283a72abf188f4f61d23a3daf87180" address="unix:///run/containerd/s/24481619351ad32f7f11a4e690d24f8cc05d3f07e67c408b9868cee3008d91ce" namespace=k8s.io protocol=ttrpc version=3 May 27 02:48:40.918073 systemd[1]: Started cri-containerd-b3acfc2410ac19cb43ae70642aadd536ba283a72abf188f4f61d23a3daf87180.scope - libcontainer container b3acfc2410ac19cb43ae70642aadd536ba283a72abf188f4f61d23a3daf87180. 
May 27 02:48:40.968723 containerd[2012]: time="2025-05-27T02:48:40.968569847Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-xkch8,Uid:b894626e-6c3f-4b1e-9535-2449777e0dd0,Namespace:kube-system,Attempt:0,} returns sandbox id \"b3acfc2410ac19cb43ae70642aadd536ba283a72abf188f4f61d23a3daf87180\"" May 27 02:48:40.975054 containerd[2012]: time="2025-05-27T02:48:40.975005099Z" level=info msg="CreateContainer within sandbox \"b3acfc2410ac19cb43ae70642aadd536ba283a72abf188f4f61d23a3daf87180\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 27 02:48:40.999232 containerd[2012]: time="2025-05-27T02:48:40.999181751Z" level=info msg="Container f363132a4adfb9b80062f6a8bbdd29a98ae8361fe808a7db539a016c3998ed5e: CDI devices from CRI Config.CDIDevices: []" May 27 02:48:41.017435 containerd[2012]: time="2025-05-27T02:48:41.017377628Z" level=info msg="CreateContainer within sandbox \"b3acfc2410ac19cb43ae70642aadd536ba283a72abf188f4f61d23a3daf87180\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"f363132a4adfb9b80062f6a8bbdd29a98ae8361fe808a7db539a016c3998ed5e\"" May 27 02:48:41.019280 containerd[2012]: time="2025-05-27T02:48:41.018870080Z" level=info msg="StartContainer for \"f363132a4adfb9b80062f6a8bbdd29a98ae8361fe808a7db539a016c3998ed5e\"" May 27 02:48:41.022295 containerd[2012]: time="2025-05-27T02:48:41.022213148Z" level=info msg="connecting to shim f363132a4adfb9b80062f6a8bbdd29a98ae8361fe808a7db539a016c3998ed5e" address="unix:///run/containerd/s/24481619351ad32f7f11a4e690d24f8cc05d3f07e67c408b9868cee3008d91ce" protocol=ttrpc version=3 May 27 02:48:41.056066 systemd[1]: Started cri-containerd-f363132a4adfb9b80062f6a8bbdd29a98ae8361fe808a7db539a016c3998ed5e.scope - libcontainer container f363132a4adfb9b80062f6a8bbdd29a98ae8361fe808a7db539a016c3998ed5e. May 27 02:48:41.156881 containerd[2012]: time="2025-05-27T02:48:41.156712880Z" level=info msg="StartContainer for \"f363132a4adfb9b80062f6a8bbdd29a98ae8361fe808a7db539a016c3998ed5e\" returns successfully" May 27 02:48:41.381106 kubelet[3281]: I0527 02:48:41.380737 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-xkch8" podStartSLOduration=3.380714469 podStartE2EDuration="3.380714469s" podCreationTimestamp="2025-05-27 02:48:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 02:48:41.361030329 +0000 UTC m=+6.425902617" watchObservedRunningTime="2025-05-27 02:48:41.380714469 +0000 UTC m=+6.445586745" May 27 02:48:41.978555 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3929842618.mount: Deactivated successfully. 
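The kube-proxy lines above trace the usual CRI sequence, RunPodSandbox then CreateContainer then StartContainer, with containerd answering on its socket and spawning one shim per sandbox (the "connecting to shim" addresses). Below is a minimal sketch of driving the same three calls directly against the CRI v1 gRPC API; the socket path and pod metadata are copied from the log, the kube-proxy image tag is an assumption, and this is how an external client could exercise the flow, not how kubelet is wired internally.

```go
// cri_flow.go - hedged sketch of the RunPodSandbox/CreateContainer/StartContainer
// sequence visible in the log, issued against containerd's CRI endpoint.
package main

import (
	"context"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	rt := runtimeapi.NewRuntimeServiceClient(conn)
	ctx := context.TODO()

	// Sandbox metadata copied from the RunPodSandbox entry above.
	sandboxCfg := &runtimeapi.PodSandboxConfig{
		Metadata: &runtimeapi.PodSandboxMetadata{
			Name:      "kube-proxy-xkch8",
			Uid:       "b894626e-6c3f-4b1e-9535-2449777e0dd0",
			Namespace: "kube-system",
			Attempt:   0,
		},
	}
	sb, err := rt.RunPodSandbox(ctx, &runtimeapi.RunPodSandboxRequest{Config: sandboxCfg})
	if err != nil {
		log.Fatal(err)
	}

	ctr, err := rt.CreateContainer(ctx, &runtimeapi.CreateContainerRequest{
		PodSandboxId: sb.PodSandboxId,
		Config: &runtimeapi.ContainerConfig{
			Metadata: &runtimeapi.ContainerMetadata{Name: "kube-proxy", Attempt: 0},
			// Image tag assumed; the log does not record the kube-proxy version.
			Image: &runtimeapi.ImageSpec{Image: "registry.k8s.io/kube-proxy:v1.32.0"},
		},
		SandboxConfig: sandboxCfg,
	})
	if err != nil {
		log.Fatal(err)
	}

	if _, err := rt.StartContainer(ctx, &runtimeapi.StartContainerRequest{ContainerId: ctr.ContainerId}); err != nil {
		log.Fatal(err)
	}
	log.Printf("sandbox=%s container=%s started", sb.PodSandboxId, ctr.ContainerId)
}
```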
May 27 02:48:43.330562 containerd[2012]: time="2025-05-27T02:48:43.330437147Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:43.332912 containerd[2012]: time="2025-05-27T02:48:43.332871899Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.0: active requests=0, bytes read=22143480" May 27 02:48:43.334737 containerd[2012]: time="2025-05-27T02:48:43.334686419Z" level=info msg="ImageCreate event name:\"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:43.341203 containerd[2012]: time="2025-05-27T02:48:43.340683971Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:43.348607 containerd[2012]: time="2025-05-27T02:48:43.348531635Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.0\" with image id \"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\", repo tag \"quay.io/tigera/operator:v1.38.0\", repo digest \"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\", size \"22139475\" in 2.866444658s" May 27 02:48:43.349260 containerd[2012]: time="2025-05-27T02:48:43.349017815Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\" returns image reference \"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\"" May 27 02:48:43.358378 containerd[2012]: time="2025-05-27T02:48:43.356943659Z" level=info msg="CreateContainer within sandbox \"9401e3567d9e274441b0c21a1ad5c87ad2d604bb170ac0ec0649dcb765ad19da\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 27 02:48:43.380444 containerd[2012]: time="2025-05-27T02:48:43.380370395Z" level=info msg="Container 148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac: CDI devices from CRI Config.CDIDevices: []" May 27 02:48:43.394941 containerd[2012]: time="2025-05-27T02:48:43.394878491Z" level=info msg="CreateContainer within sandbox \"9401e3567d9e274441b0c21a1ad5c87ad2d604bb170ac0ec0649dcb765ad19da\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac\"" May 27 02:48:43.396121 containerd[2012]: time="2025-05-27T02:48:43.396060263Z" level=info msg="StartContainer for \"148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac\"" May 27 02:48:43.398230 containerd[2012]: time="2025-05-27T02:48:43.398098115Z" level=info msg="connecting to shim 148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac" address="unix:///run/containerd/s/ce32d09926bbaaee38fb866e55ebeaf036e7cd4c414b369fe43cb25b036f6f6d" protocol=ttrpc version=3 May 27 02:48:43.439084 systemd[1]: Started cri-containerd-148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac.scope - libcontainer container 148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac. May 27 02:48:43.497032 containerd[2012]: time="2025-05-27T02:48:43.496912164Z" level=info msg="StartContainer for \"148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac\" returns successfully" May 27 02:48:44.479912 update_engine[1979]: I20250527 02:48:44.479820 1979 update_attempter.cc:509] Updating boot flags... 
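The ImageCreate and "Pulled image" entries above record the tigera/operator image arriving under both its tag and its repo digest, with the pull completing in roughly 2.87s. A minimal sketch of the equivalent pull through the containerd Go client in the k8s.io namespace that the CRI plugin uses; anything beyond the socket path, namespace and image reference shown in the log is an assumption.

```go
// pull_image.go - hedged sketch: pull the operator image the way the log shows
// containerd doing it, then print the resolved digest and size.
package main

import (
	"context"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// The CRI plugin keeps Kubernetes images in the "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	img, err := client.Pull(ctx, "quay.io/tigera/operator:v1.38.0", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}

	size, err := img.Size(ctx)
	if err != nil {
		log.Fatal(err)
	}
	// The digest should match the repo digest reported in the log
	// (quay.io/tigera/operator@sha256:e0a34b265aebce...).
	log.Printf("pulled %s digest=%s size=%d bytes", img.Name(), img.Target().Digest, size)
}
```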
May 27 02:48:52.734521 sudo[2337]: pam_unix(sudo:session): session closed for user root May 27 02:48:52.758149 sshd[2336]: Connection closed by 139.178.68.195 port 46272 May 27 02:48:52.759186 sshd-session[2334]: pam_unix(sshd:session): session closed for user core May 27 02:48:52.771544 systemd[1]: sshd@6-172.31.28.205:22-139.178.68.195:46272.service: Deactivated successfully. May 27 02:48:52.778275 systemd[1]: session-7.scope: Deactivated successfully. May 27 02:48:52.780469 systemd[1]: session-7.scope: Consumed 9.225s CPU time, 231M memory peak. May 27 02:48:52.789288 systemd-logind[1978]: Session 7 logged out. Waiting for processes to exit. May 27 02:48:52.797065 systemd-logind[1978]: Removed session 7. May 27 02:48:59.863140 kubelet[3281]: I0527 02:48:59.862996 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-844669ff44-spfvs" podStartSLOduration=17.991141767 podStartE2EDuration="20.862915565s" podCreationTimestamp="2025-05-27 02:48:39 +0000 UTC" firstStartedPulling="2025-05-27 02:48:40.479293725 +0000 UTC m=+5.544166001" lastFinishedPulling="2025-05-27 02:48:43.351067535 +0000 UTC m=+8.415939799" observedRunningTime="2025-05-27 02:48:44.373225272 +0000 UTC m=+9.438097548" watchObservedRunningTime="2025-05-27 02:48:59.862915565 +0000 UTC m=+24.927787841" May 27 02:48:59.883086 systemd[1]: Created slice kubepods-besteffort-pod84908685_e8be_4851_99d5_2e0086b096dc.slice - libcontainer container kubepods-besteffort-pod84908685_e8be_4851_99d5_2e0086b096dc.slice. May 27 02:48:59.953803 kubelet[3281]: I0527 02:48:59.953420 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/84908685-e8be-4851-99d5-2e0086b096dc-typha-certs\") pod \"calico-typha-794d9876bb-d7dfn\" (UID: \"84908685-e8be-4851-99d5-2e0086b096dc\") " pod="calico-system/calico-typha-794d9876bb-d7dfn" May 27 02:48:59.954297 kubelet[3281]: I0527 02:48:59.954179 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5d2t\" (UniqueName: \"kubernetes.io/projected/84908685-e8be-4851-99d5-2e0086b096dc-kube-api-access-j5d2t\") pod \"calico-typha-794d9876bb-d7dfn\" (UID: \"84908685-e8be-4851-99d5-2e0086b096dc\") " pod="calico-system/calico-typha-794d9876bb-d7dfn" May 27 02:48:59.954565 kubelet[3281]: I0527 02:48:59.954452 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84908685-e8be-4851-99d5-2e0086b096dc-tigera-ca-bundle\") pod \"calico-typha-794d9876bb-d7dfn\" (UID: \"84908685-e8be-4851-99d5-2e0086b096dc\") " pod="calico-system/calico-typha-794d9876bb-d7dfn" May 27 02:49:00.147419 systemd[1]: Created slice kubepods-besteffort-pode0237c44_d785_435a_b313_e957a0a5fa3a.slice - libcontainer container kubepods-besteffort-pode0237c44_d785_435a_b313_e957a0a5fa3a.slice. 
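The pod_startup_latency_tracker entry above for tigera-operator reports podStartE2EDuration="20.862915565s" but podStartSLOduration=17.991141767; the two appear to differ by exactly the image pull window (lastFinishedPulling minus firstStartedPulling, about 2.87s), i.e. the SLO figure excludes time spent pulling, while pods that never pulled an image (zero-valued pull timestamps, as with the static pods and kube-proxy earlier) report the two values as equal. A small sketch that recomputes both numbers from the timestamps and monotonic m=+ offsets in that log entry:

```go
// startup_latency.go - hedged sketch: reproduce the SLO and E2E durations
// reported above from values copied out of the log entry.
package main

import "fmt"

func main() {
	// Wall-clock seconds within the 02:48/02:49 window, from the log.
	created := 39.0          // podCreationTimestamp 02:48:39
	observed := 59.862915565 // watchObservedRunningTime 02:48:59.862915565
	// Monotonic m=+ offsets bracketing the image pull, from the log.
	pullStart := 5.544166001 // firstStartedPulling  m=+5.544166001
	pullEnd := 8.415939799   // lastFinishedPulling  m=+8.415939799

	e2e := observed - created          // 20.862915565s, the reported podStartE2EDuration
	slo := e2e - (pullEnd - pullStart) // 17.991141767s, the reported podStartSLOduration
	fmt.Printf("e2e=%.9fs slo=%.9fs\n", e2e, slo)
}
```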
May 27 02:49:00.156857 kubelet[3281]: I0527 02:49:00.156703 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/e0237c44-d785-435a-b313-e957a0a5fa3a-cni-log-dir\") pod \"calico-node-pfjmw\" (UID: \"e0237c44-d785-435a-b313-e957a0a5fa3a\") " pod="calico-system/calico-node-pfjmw" May 27 02:49:00.157293 kubelet[3281]: I0527 02:49:00.157191 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e0237c44-d785-435a-b313-e957a0a5fa3a-var-lib-calico\") pod \"calico-node-pfjmw\" (UID: \"e0237c44-d785-435a-b313-e957a0a5fa3a\") " pod="calico-system/calico-node-pfjmw" May 27 02:49:00.157678 kubelet[3281]: I0527 02:49:00.157584 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/e0237c44-d785-435a-b313-e957a0a5fa3a-var-run-calico\") pod \"calico-node-pfjmw\" (UID: \"e0237c44-d785-435a-b313-e957a0a5fa3a\") " pod="calico-system/calico-node-pfjmw" May 27 02:49:00.158064 kubelet[3281]: I0527 02:49:00.157982 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0237c44-d785-435a-b313-e957a0a5fa3a-lib-modules\") pod \"calico-node-pfjmw\" (UID: \"e0237c44-d785-435a-b313-e957a0a5fa3a\") " pod="calico-system/calico-node-pfjmw" May 27 02:49:00.158438 kubelet[3281]: I0527 02:49:00.158301 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/e0237c44-d785-435a-b313-e957a0a5fa3a-policysync\") pod \"calico-node-pfjmw\" (UID: \"e0237c44-d785-435a-b313-e957a0a5fa3a\") " pod="calico-system/calico-node-pfjmw" May 27 02:49:00.158814 kubelet[3281]: I0527 02:49:00.158686 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/e0237c44-d785-435a-b313-e957a0a5fa3a-node-certs\") pod \"calico-node-pfjmw\" (UID: \"e0237c44-d785-435a-b313-e957a0a5fa3a\") " pod="calico-system/calico-node-pfjmw" May 27 02:49:00.159413 kubelet[3281]: I0527 02:49:00.159311 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e0237c44-d785-435a-b313-e957a0a5fa3a-xtables-lock\") pod \"calico-node-pfjmw\" (UID: \"e0237c44-d785-435a-b313-e957a0a5fa3a\") " pod="calico-system/calico-node-pfjmw" May 27 02:49:00.159803 kubelet[3281]: I0527 02:49:00.159602 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/e0237c44-d785-435a-b313-e957a0a5fa3a-cni-bin-dir\") pod \"calico-node-pfjmw\" (UID: \"e0237c44-d785-435a-b313-e957a0a5fa3a\") " pod="calico-system/calico-node-pfjmw" May 27 02:49:00.160737 kubelet[3281]: I0527 02:49:00.160146 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/e0237c44-d785-435a-b313-e957a0a5fa3a-cni-net-dir\") pod \"calico-node-pfjmw\" (UID: \"e0237c44-d785-435a-b313-e957a0a5fa3a\") " pod="calico-system/calico-node-pfjmw" May 27 02:49:00.161137 kubelet[3281]: I0527 02:49:00.161008 3281 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/e0237c44-d785-435a-b313-e957a0a5fa3a-flexvol-driver-host\") pod \"calico-node-pfjmw\" (UID: \"e0237c44-d785-435a-b313-e957a0a5fa3a\") " pod="calico-system/calico-node-pfjmw" May 27 02:49:00.161355 kubelet[3281]: I0527 02:49:00.161302 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0237c44-d785-435a-b313-e957a0a5fa3a-tigera-ca-bundle\") pod \"calico-node-pfjmw\" (UID: \"e0237c44-d785-435a-b313-e957a0a5fa3a\") " pod="calico-system/calico-node-pfjmw" May 27 02:49:00.161623 kubelet[3281]: I0527 02:49:00.161493 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj74j\" (UniqueName: \"kubernetes.io/projected/e0237c44-d785-435a-b313-e957a0a5fa3a-kube-api-access-jj74j\") pod \"calico-node-pfjmw\" (UID: \"e0237c44-d785-435a-b313-e957a0a5fa3a\") " pod="calico-system/calico-node-pfjmw" May 27 02:49:00.202916 containerd[2012]: time="2025-05-27T02:49:00.202456203Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-794d9876bb-d7dfn,Uid:84908685-e8be-4851-99d5-2e0086b096dc,Namespace:calico-system,Attempt:0,}" May 27 02:49:00.264961 kubelet[3281]: E0527 02:49:00.264387 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.264961 kubelet[3281]: W0527 02:49:00.264431 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.264961 kubelet[3281]: E0527 02:49:00.264469 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.266104 kubelet[3281]: E0527 02:49:00.266056 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.266104 kubelet[3281]: W0527 02:49:00.266093 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.266353 kubelet[3281]: E0527 02:49:00.266139 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.268111 kubelet[3281]: E0527 02:49:00.268062 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.268111 kubelet[3281]: W0527 02:49:00.268099 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.268613 kubelet[3281]: E0527 02:49:00.268286 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:49:00.268750 kubelet[3281]: E0527 02:49:00.268689 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.268750 kubelet[3281]: W0527 02:49:00.268720 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.268750 kubelet[3281]: E0527 02:49:00.268858 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.270598 kubelet[3281]: E0527 02:49:00.270540 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.270598 kubelet[3281]: W0527 02:49:00.270583 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.271468 kubelet[3281]: E0527 02:49:00.270951 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.272322 kubelet[3281]: E0527 02:49:00.272276 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.272322 kubelet[3281]: W0527 02:49:00.272313 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.272590 kubelet[3281]: E0527 02:49:00.272411 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.275015 kubelet[3281]: E0527 02:49:00.274965 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.275015 kubelet[3281]: W0527 02:49:00.275003 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.275397 kubelet[3281]: E0527 02:49:00.275167 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.276191 kubelet[3281]: E0527 02:49:00.276148 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.276424 kubelet[3281]: W0527 02:49:00.276381 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.276424 kubelet[3281]: E0527 02:49:00.276457 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:49:00.277954 kubelet[3281]: E0527 02:49:00.277906 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.277954 kubelet[3281]: W0527 02:49:00.277944 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.278396 kubelet[3281]: E0527 02:49:00.278081 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.280796 kubelet[3281]: E0527 02:49:00.280493 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.280796 kubelet[3281]: W0527 02:49:00.280533 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.281148 kubelet[3281]: E0527 02:49:00.281053 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.284132 kubelet[3281]: E0527 02:49:00.283945 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.284132 kubelet[3281]: W0527 02:49:00.283985 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.284132 kubelet[3281]: E0527 02:49:00.284055 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.284858 kubelet[3281]: E0527 02:49:00.284552 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.284858 kubelet[3281]: W0527 02:49:00.284572 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.285018 containerd[2012]: time="2025-05-27T02:49:00.284915727Z" level=info msg="connecting to shim 0b6b1c4f98c5e89f6b53dd6fc919248ef38bed4fbcd52820ec1baab8473a0d04" address="unix:///run/containerd/s/12e2788da401c21422cf135e0b3b250874a80bbf5259a92c02484d3777bb7c04" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:00.285456 kubelet[3281]: E0527 02:49:00.284649 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:49:00.286138 kubelet[3281]: E0527 02:49:00.286090 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.286138 kubelet[3281]: W0527 02:49:00.286124 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.286626 kubelet[3281]: E0527 02:49:00.286197 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.288733 kubelet[3281]: E0527 02:49:00.288680 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.288733 kubelet[3281]: W0527 02:49:00.288717 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.289376 kubelet[3281]: E0527 02:49:00.289216 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.289748 kubelet[3281]: E0527 02:49:00.289618 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.289748 kubelet[3281]: W0527 02:49:00.289652 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.289748 kubelet[3281]: E0527 02:49:00.289710 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.291057 kubelet[3281]: E0527 02:49:00.290999 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.291057 kubelet[3281]: W0527 02:49:00.291039 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.291057 kubelet[3281]: E0527 02:49:00.291105 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.292888 kubelet[3281]: E0527 02:49:00.291627 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.292888 kubelet[3281]: W0527 02:49:00.291658 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.292888 kubelet[3281]: E0527 02:49:00.292726 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:49:00.293162 kubelet[3281]: E0527 02:49:00.293069 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.293162 kubelet[3281]: W0527 02:49:00.293094 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.293706 kubelet[3281]: E0527 02:49:00.293557 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.295369 kubelet[3281]: E0527 02:49:00.295312 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.295369 kubelet[3281]: W0527 02:49:00.295352 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.295369 kubelet[3281]: E0527 02:49:00.295421 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.295369 kubelet[3281]: E0527 02:49:00.295704 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.295369 kubelet[3281]: W0527 02:49:00.295722 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.296253 kubelet[3281]: E0527 02:49:00.296042 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.296253 kubelet[3281]: W0527 02:49:00.296059 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.296366 kubelet[3281]: E0527 02:49:00.296283 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.296366 kubelet[3281]: W0527 02:49:00.296297 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.297796 kubelet[3281]: E0527 02:49:00.297018 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.297796 kubelet[3281]: E0527 02:49:00.297092 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.297796 kubelet[3281]: E0527 02:49:00.297126 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:49:00.297796 kubelet[3281]: E0527 02:49:00.297197 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.297796 kubelet[3281]: W0527 02:49:00.297215 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.297796 kubelet[3281]: E0527 02:49:00.297379 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.298672 kubelet[3281]: E0527 02:49:00.298016 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.298672 kubelet[3281]: W0527 02:49:00.298041 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.298672 kubelet[3281]: E0527 02:49:00.298247 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.299715 kubelet[3281]: E0527 02:49:00.299672 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.299715 kubelet[3281]: W0527 02:49:00.299709 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.300234 kubelet[3281]: E0527 02:49:00.299812 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.301247 kubelet[3281]: E0527 02:49:00.301190 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.301247 kubelet[3281]: W0527 02:49:00.301229 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.301247 kubelet[3281]: E0527 02:49:00.301294 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.303233 kubelet[3281]: E0527 02:49:00.303185 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.303233 kubelet[3281]: W0527 02:49:00.303222 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.303785 kubelet[3281]: E0527 02:49:00.303508 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:49:00.303864 kubelet[3281]: E0527 02:49:00.303848 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.303914 kubelet[3281]: W0527 02:49:00.303868 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.306791 kubelet[3281]: E0527 02:49:00.305911 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.306791 kubelet[3281]: W0527 02:49:00.305952 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.308191 kubelet[3281]: E0527 02:49:00.308128 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.308559 kubelet[3281]: E0527 02:49:00.308523 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.344264 kubelet[3281]: E0527 02:49:00.343634 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.344264 kubelet[3281]: W0527 02:49:00.343669 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.344264 kubelet[3281]: E0527 02:49:00.343699 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.377051 systemd[1]: Started cri-containerd-0b6b1c4f98c5e89f6b53dd6fc919248ef38bed4fbcd52820ec1baab8473a0d04.scope - libcontainer container 0b6b1c4f98c5e89f6b53dd6fc919248ef38bed4fbcd52820ec1baab8473a0d04. May 27 02:49:00.391547 kubelet[3281]: E0527 02:49:00.390219 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.391547 kubelet[3281]: W0527 02:49:00.391543 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.392861 kubelet[3281]: E0527 02:49:00.392821 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:49:00.452812 kubelet[3281]: E0527 02:49:00.451056 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9dsg5" podUID="62399f24-73bb-46f6-a534-e7c43ecd7271" May 27 02:49:00.458518 containerd[2012]: time="2025-05-27T02:49:00.458055016Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-pfjmw,Uid:e0237c44-d785-435a-b313-e957a0a5fa3a,Namespace:calico-system,Attempt:0,}" May 27 02:49:00.512161 containerd[2012]: time="2025-05-27T02:49:00.512079232Z" level=info msg="connecting to shim 37e3d6b408b7f493b855091b4990651afc1e1fe6dbbe5bdb124128155a56dc8f" address="unix:///run/containerd/s/6e5fa9fcf01a1df88c226e0c1eb640b0088629019c6a5f33952c5fbf90da400b" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:00.540437 kubelet[3281]: E0527 02:49:00.540380 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.540907 kubelet[3281]: W0527 02:49:00.540422 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.541012 kubelet[3281]: E0527 02:49:00.540916 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.541510 kubelet[3281]: E0527 02:49:00.541467 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.542957 kubelet[3281]: W0527 02:49:00.542826 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.543080 kubelet[3281]: E0527 02:49:00.542954 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.543623 kubelet[3281]: E0527 02:49:00.543581 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.543718 kubelet[3281]: W0527 02:49:00.543614 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.543718 kubelet[3281]: E0527 02:49:00.543665 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:49:00.546040 kubelet[3281]: E0527 02:49:00.544490 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.546040 kubelet[3281]: W0527 02:49:00.544601 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.546040 kubelet[3281]: E0527 02:49:00.544751 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.546040 kubelet[3281]: E0527 02:49:00.545661 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.546040 kubelet[3281]: W0527 02:49:00.545689 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.546040 kubelet[3281]: E0527 02:49:00.545740 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.547410 kubelet[3281]: E0527 02:49:00.546456 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.547410 kubelet[3281]: W0527 02:49:00.546492 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.547410 kubelet[3281]: E0527 02:49:00.546524 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.547976 kubelet[3281]: E0527 02:49:00.547932 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.547976 kubelet[3281]: W0527 02:49:00.547970 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.548133 kubelet[3281]: E0527 02:49:00.548005 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.549076 kubelet[3281]: E0527 02:49:00.548982 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.549076 kubelet[3281]: W0527 02:49:00.549019 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.549076 kubelet[3281]: E0527 02:49:00.549077 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:49:00.550091 kubelet[3281]: E0527 02:49:00.550042 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.550191 kubelet[3281]: W0527 02:49:00.550101 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.550191 kubelet[3281]: E0527 02:49:00.550133 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.551398 kubelet[3281]: E0527 02:49:00.551029 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.551398 kubelet[3281]: W0527 02:49:00.551093 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.551398 kubelet[3281]: E0527 02:49:00.551197 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.552851 kubelet[3281]: E0527 02:49:00.552073 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.552851 kubelet[3281]: W0527 02:49:00.552223 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.552851 kubelet[3281]: E0527 02:49:00.552254 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.553229 kubelet[3281]: E0527 02:49:00.553130 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.553229 kubelet[3281]: W0527 02:49:00.553210 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.553344 kubelet[3281]: E0527 02:49:00.553240 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:00.554169 kubelet[3281]: E0527 02:49:00.554120 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:00.554169 kubelet[3281]: W0527 02:49:00.554156 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:00.554658 kubelet[3281]: E0527 02:49:00.554185 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
May 27 02:49:00.555103 kubelet[3281]: E0527 02:49:00.555063 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 02:49:00.555103 kubelet[3281]: W0527 02:49:00.555098 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 02:49:00.555940 kubelet[3281]: E0527 02:49:00.555229 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 02:49:00.568537 kubelet[3281]: I0527 02:49:00.568421 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/62399f24-73bb-46f6-a534-e7c43ecd7271-registration-dir\") pod \"csi-node-driver-9dsg5\" (UID: \"62399f24-73bb-46f6-a534-e7c43ecd7271\") " pod="calico-system/csi-node-driver-9dsg5"
May 27 02:49:00.570241 kubelet[3281]: I0527 02:49:00.569078 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jl5n\" (UniqueName: \"kubernetes.io/projected/62399f24-73bb-46f6-a534-e7c43ecd7271-kube-api-access-5jl5n\") pod \"csi-node-driver-9dsg5\" (UID: \"62399f24-73bb-46f6-a534-e7c43ecd7271\") " pod="calico-system/csi-node-driver-9dsg5"
May 27 02:49:00.570241 kubelet[3281]: I0527 02:49:00.569723 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/62399f24-73bb-46f6-a534-e7c43ecd7271-varrun\") pod \"csi-node-driver-9dsg5\" (UID: \"62399f24-73bb-46f6-a534-e7c43ecd7271\") " pod="calico-system/csi-node-driver-9dsg5"
May 27 02:49:00.577561 kubelet[3281]: I0527 02:49:00.577408 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/62399f24-73bb-46f6-a534-e7c43ecd7271-kubelet-dir\") pod \"csi-node-driver-9dsg5\" (UID: \"62399f24-73bb-46f6-a534-e7c43ecd7271\") " pod="calico-system/csi-node-driver-9dsg5"
May 27 02:49:00.583952 kubelet[3281]: I0527 02:49:00.582701 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/62399f24-73bb-46f6-a534-e7c43ecd7271-socket-dir\") pod \"csi-node-driver-9dsg5\" (UID: \"62399f24-73bb-46f6-a534-e7c43ecd7271\") " pod="calico-system/csi-node-driver-9dsg5"
May 27 02:49:00.616048 systemd[1]: Started cri-containerd-37e3d6b408b7f493b855091b4990651afc1e1fe6dbbe5bdb124128155a56dc8f.scope - libcontainer container 37e3d6b408b7f493b855091b4990651afc1e1fe6dbbe5bdb124128155a56dc8f.
May 27 02:49:00.748819 containerd[2012]: time="2025-05-27T02:49:00.748637286Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-794d9876bb-d7dfn,Uid:84908685-e8be-4851-99d5-2e0086b096dc,Namespace:calico-system,Attempt:0,} returns sandbox id \"0b6b1c4f98c5e89f6b53dd6fc919248ef38bed4fbcd52820ec1baab8473a0d04\""
May 27 02:49:00.753163 containerd[2012]: time="2025-05-27T02:49:00.753094134Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\""
May 27 02:49:00.831226 containerd[2012]: time="2025-05-27T02:49:00.831093954Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-pfjmw,Uid:e0237c44-d785-435a-b313-e957a0a5fa3a,Namespace:calico-system,Attempt:0,} returns sandbox id \"37e3d6b408b7f493b855091b4990651afc1e1fe6dbbe5bdb124128155a56dc8f\""
May 27 02:49:02.097303 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3195494748.mount: Deactivated successfully.
May 27 02:49:02.255845 kubelet[3281]: E0527 02:49:02.255741 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9dsg5" podUID="62399f24-73bb-46f6-a534-e7c43ecd7271"
May 27 02:49:03.129412 containerd[2012]: time="2025-05-27T02:49:03.129333929Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 02:49:03.130651 containerd[2012]: time="2025-05-27T02:49:03.130526261Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.0: active requests=0, bytes read=33020269"
May 27 02:49:03.131448 containerd[2012]: time="2025-05-27T02:49:03.131389073Z" level=info msg="ImageCreate event name:\"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 02:49:03.134795 containerd[2012]: time="2025-05-27T02:49:03.134646089Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 02:49:03.138375 containerd[2012]: time="2025-05-27T02:49:03.138015113Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.0\" with image id \"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\", size \"33020123\" in 2.384714855s"
May 27 02:49:03.138375 containerd[2012]: time="2025-05-27T02:49:03.138102389Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\" returns image reference \"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\""
May 27 02:49:03.142383 containerd[2012]: time="2025-05-27T02:49:03.142309625Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\""
May 27 02:49:03.175949 containerd[2012]: time="2025-05-27T02:49:03.175886298Z" level=info msg="CreateContainer within sandbox \"0b6b1c4f98c5e89f6b53dd6fc919248ef38bed4fbcd52820ec1baab8473a0d04\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
May 27 02:49:03.190074 containerd[2012]: time="2025-05-27T02:49:03.189143118Z" level=info msg="Container 00b1bf48415341ff503cd4ccfa8eec95f90d2c975ee25efbffbf2ed95dd3578d: CDI devices from CRI Config.CDIDevices: []"
May 27 02:49:03.194502 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3137902935.mount: Deactivated successfully.
May 27 02:49:03.208619 containerd[2012]: time="2025-05-27T02:49:03.208464534Z" level=info msg="CreateContainer within sandbox \"0b6b1c4f98c5e89f6b53dd6fc919248ef38bed4fbcd52820ec1baab8473a0d04\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"00b1bf48415341ff503cd4ccfa8eec95f90d2c975ee25efbffbf2ed95dd3578d\""
May 27 02:49:03.209662 containerd[2012]: time="2025-05-27T02:49:03.209486418Z" level=info msg="StartContainer for \"00b1bf48415341ff503cd4ccfa8eec95f90d2c975ee25efbffbf2ed95dd3578d\""
May 27 02:49:03.212997 containerd[2012]: time="2025-05-27T02:49:03.212943546Z" level=info msg="connecting to shim 00b1bf48415341ff503cd4ccfa8eec95f90d2c975ee25efbffbf2ed95dd3578d" address="unix:///run/containerd/s/12e2788da401c21422cf135e0b3b250874a80bbf5259a92c02484d3777bb7c04" protocol=ttrpc version=3
May 27 02:49:03.261885 systemd[1]: Started cri-containerd-00b1bf48415341ff503cd4ccfa8eec95f90d2c975ee25efbffbf2ed95dd3578d.scope - libcontainer container 00b1bf48415341ff503cd4ccfa8eec95f90d2c975ee25efbffbf2ed95dd3578d.
May 27 02:49:03.418182 containerd[2012]: time="2025-05-27T02:49:03.417947143Z" level=info msg="StartContainer for \"00b1bf48415341ff503cd4ccfa8eec95f90d2c975ee25efbffbf2ed95dd3578d\" returns successfully"
May 27 02:49:03.482567 kubelet[3281]: E0527 02:49:03.482527 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 02:49:03.483832 kubelet[3281]: W0527 02:49:03.483197 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 02:49:03.483832 kubelet[3281]: E0527 02:49:03.483242 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 02:49:03.514509 kubelet[3281]: I0527 02:49:03.514020 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-794d9876bb-d7dfn" podStartSLOduration=2.126555328 podStartE2EDuration="4.513993187s" podCreationTimestamp="2025-05-27 02:48:59 +0000 UTC" firstStartedPulling="2025-05-27 02:49:00.752370858 +0000 UTC m=+25.817243122" lastFinishedPulling="2025-05-27 02:49:03.139808729 +0000 UTC m=+28.204680981" observedRunningTime="2025-05-27 02:49:03.507725731 +0000 UTC m=+28.572597995" watchObservedRunningTime="2025-05-27 02:49:03.513993187 +0000 UTC m=+28.578865475"
May 27 02:49:04.255028 kubelet[3281]: E0527 02:49:04.254959 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9dsg5" podUID="62399f24-73bb-46f6-a534-e7c43ecd7271"
May 27 02:49:04.445117 containerd[2012]: time="2025-05-27T02:49:04.445037564Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 02:49:04.447021 containerd[2012]: time="2025-05-27T02:49:04.446947916Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0: active requests=0, bytes read=4264304"
May 27 02:49:04.449320 containerd[2012]: time="2025-05-27T02:49:04.449243408Z" level=info msg="ImageCreate event name:\"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 02:49:04.454990 containerd[2012]: time="2025-05-27T02:49:04.453467384Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 02:49:04.455789 containerd[2012]: time="2025-05-27T02:49:04.455225024Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" with image id \"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\", size \"5633505\" in 1.312645807s"
May 27 02:49:04.456069 containerd[2012]: time="2025-05-27T02:49:04.456031076Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" returns image reference \"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\""
May 27 02:49:04.463784 containerd[2012]: time="2025-05-27T02:49:04.463699892Z" level=info msg="CreateContainer within sandbox \"37e3d6b408b7f493b855091b4990651afc1e1fe6dbbe5bdb124128155a56dc8f\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
May 27 02:49:04.466524 kubelet[3281]: I0527 02:49:04.466279 3281 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 27 02:49:04.484796 containerd[2012]: time="2025-05-27T02:49:04.482966924Z" level=info msg="Container c662b3b36b5417ab58a8c7de48640f223ce5c7c4f6d1725ec596917adc4caaca: CDI devices from CRI Config.CDIDevices: []"
May 27 02:49:04.504967 containerd[2012]: time="2025-05-27T02:49:04.504882764Z" level=info msg="CreateContainer within sandbox \"37e3d6b408b7f493b855091b4990651afc1e1fe6dbbe5bdb124128155a56dc8f\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"c662b3b36b5417ab58a8c7de48640f223ce5c7c4f6d1725ec596917adc4caaca\""
May 27 02:49:04.507515 containerd[2012]: time="2025-05-27T02:49:04.506003912Z" level=info msg="StartContainer for \"c662b3b36b5417ab58a8c7de48640f223ce5c7c4f6d1725ec596917adc4caaca\""
May 27 02:49:04.514253 containerd[2012]: time="2025-05-27T02:49:04.514110800Z" level=info msg="connecting to shim c662b3b36b5417ab58a8c7de48640f223ce5c7c4f6d1725ec596917adc4caaca" address="unix:///run/containerd/s/6e5fa9fcf01a1df88c226e0c1eb640b0088629019c6a5f33952c5fbf90da400b" protocol=ttrpc version=3
May 27 02:49:04.517049 kubelet[3281]: E0527 02:49:04.516990 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 02:49:04.517462 kubelet[3281]: W0527 02:49:04.517239 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 02:49:04.517742 kubelet[3281]: E0527 02:49:04.517285 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:49:04.519188 kubelet[3281]: E0527 02:49:04.519129 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.519188 kubelet[3281]: W0527 02:49:04.519168 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.519638 kubelet[3281]: E0527 02:49:04.519201 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.521006 kubelet[3281]: E0527 02:49:04.520951 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.521006 kubelet[3281]: W0527 02:49:04.520995 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.521402 kubelet[3281]: E0527 02:49:04.521259 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.525978 kubelet[3281]: E0527 02:49:04.525925 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.527198 kubelet[3281]: W0527 02:49:04.525967 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.527288 kubelet[3281]: E0527 02:49:04.527209 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.531168 kubelet[3281]: E0527 02:49:04.529585 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.531168 kubelet[3281]: W0527 02:49:04.529624 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.531168 kubelet[3281]: E0527 02:49:04.529657 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.538909 kubelet[3281]: E0527 02:49:04.538862 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.538909 kubelet[3281]: W0527 02:49:04.538899 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.539095 kubelet[3281]: E0527 02:49:04.538934 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:49:04.539607 kubelet[3281]: E0527 02:49:04.539569 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.539607 kubelet[3281]: W0527 02:49:04.539601 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.540481 kubelet[3281]: E0527 02:49:04.539632 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.540481 kubelet[3281]: E0527 02:49:04.540102 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.540481 kubelet[3281]: W0527 02:49:04.540123 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.540481 kubelet[3281]: E0527 02:49:04.540149 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.542157 kubelet[3281]: E0527 02:49:04.542023 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.542157 kubelet[3281]: W0527 02:49:04.542061 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.542157 kubelet[3281]: E0527 02:49:04.542094 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.543364 kubelet[3281]: E0527 02:49:04.543266 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.543620 kubelet[3281]: W0527 02:49:04.543305 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.543620 kubelet[3281]: E0527 02:49:04.543577 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.545252 kubelet[3281]: E0527 02:49:04.544474 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.545252 kubelet[3281]: W0527 02:49:04.544863 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.545252 kubelet[3281]: E0527 02:49:04.544946 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:49:04.546036 kubelet[3281]: E0527 02:49:04.545987 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.546229 kubelet[3281]: W0527 02:49:04.546160 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.546292 kubelet[3281]: E0527 02:49:04.546237 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.547320 kubelet[3281]: E0527 02:49:04.546846 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.547320 kubelet[3281]: W0527 02:49:04.547181 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.547320 kubelet[3281]: E0527 02:49:04.547248 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.548051 kubelet[3281]: E0527 02:49:04.548011 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.548236 kubelet[3281]: W0527 02:49:04.548155 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.548297 kubelet[3281]: E0527 02:49:04.548241 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.548897 kubelet[3281]: E0527 02:49:04.548847 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.549117 kubelet[3281]: W0527 02:49:04.549034 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.549117 kubelet[3281]: E0527 02:49:04.549101 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.549789 kubelet[3281]: E0527 02:49:04.549719 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.549909 kubelet[3281]: W0527 02:49:04.549885 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.550093 kubelet[3281]: E0527 02:49:04.550023 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:49:04.550633 kubelet[3281]: E0527 02:49:04.550583 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.550848 kubelet[3281]: W0527 02:49:04.550734 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.551088 kubelet[3281]: E0527 02:49:04.550997 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.551243 kubelet[3281]: E0527 02:49:04.551211 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.551386 kubelet[3281]: W0527 02:49:04.551238 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.551386 kubelet[3281]: E0527 02:49:04.551290 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.551830 kubelet[3281]: E0527 02:49:04.551687 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.551830 kubelet[3281]: W0527 02:49:04.551715 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.552187 kubelet[3281]: E0527 02:49:04.552063 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.552187 kubelet[3281]: W0527 02:49:04.552091 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.552187 kubelet[3281]: E0527 02:49:04.552120 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.552657 kubelet[3281]: E0527 02:49:04.552547 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.552657 kubelet[3281]: W0527 02:49:04.552575 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.552657 kubelet[3281]: E0527 02:49:04.552600 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:49:04.553495 kubelet[3281]: E0527 02:49:04.553365 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.553495 kubelet[3281]: W0527 02:49:04.553408 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.553495 kubelet[3281]: E0527 02:49:04.553433 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.553859 kubelet[3281]: E0527 02:49:04.553364 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.554018 kubelet[3281]: E0527 02:49:04.553859 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.554018 kubelet[3281]: W0527 02:49:04.553876 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.554018 kubelet[3281]: E0527 02:49:04.553911 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.554956 kubelet[3281]: E0527 02:49:04.554924 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.555164 kubelet[3281]: W0527 02:49:04.554955 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.555164 kubelet[3281]: E0527 02:49:04.554988 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.555853 kubelet[3281]: E0527 02:49:04.555746 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.555853 kubelet[3281]: W0527 02:49:04.555785 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.555853 kubelet[3281]: E0527 02:49:04.555811 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:49:04.556942 kubelet[3281]: E0527 02:49:04.556894 3281 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:49:04.556942 kubelet[3281]: W0527 02:49:04.556937 3281 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:49:04.557136 kubelet[3281]: E0527 02:49:04.556965 3281 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:49:04.578133 systemd[1]: Started cri-containerd-c662b3b36b5417ab58a8c7de48640f223ce5c7c4f6d1725ec596917adc4caaca.scope - libcontainer container c662b3b36b5417ab58a8c7de48640f223ce5c7c4f6d1725ec596917adc4caaca. May 27 02:49:04.659843 containerd[2012]: time="2025-05-27T02:49:04.659427033Z" level=info msg="StartContainer for \"c662b3b36b5417ab58a8c7de48640f223ce5c7c4f6d1725ec596917adc4caaca\" returns successfully" May 27 02:49:04.683182 systemd[1]: cri-containerd-c662b3b36b5417ab58a8c7de48640f223ce5c7c4f6d1725ec596917adc4caaca.scope: Deactivated successfully. May 27 02:49:04.690818 containerd[2012]: time="2025-05-27T02:49:04.690713985Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c662b3b36b5417ab58a8c7de48640f223ce5c7c4f6d1725ec596917adc4caaca\" id:\"c662b3b36b5417ab58a8c7de48640f223ce5c7c4f6d1725ec596917adc4caaca\" pid:4196 exited_at:{seconds:1748314144 nanos:690066441}" May 27 02:49:04.690818 containerd[2012]: time="2025-05-27T02:49:04.690743733Z" level=info msg="received exit event container_id:\"c662b3b36b5417ab58a8c7de48640f223ce5c7c4f6d1725ec596917adc4caaca\" id:\"c662b3b36b5417ab58a8c7de48640f223ce5c7c4f6d1725ec596917adc4caaca\" pid:4196 exited_at:{seconds:1748314144 nanos:690066441}" May 27 02:49:04.741339 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c662b3b36b5417ab58a8c7de48640f223ce5c7c4f6d1725ec596917adc4caaca-rootfs.mount: Deactivated successfully. 
May 27 02:49:05.476803 containerd[2012]: time="2025-05-27T02:49:05.476168805Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\"" May 27 02:49:06.255532 kubelet[3281]: E0527 02:49:06.255432 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9dsg5" podUID="62399f24-73bb-46f6-a534-e7c43ecd7271" May 27 02:49:07.958489 kubelet[3281]: I0527 02:49:07.958444 3281 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 02:49:08.255856 kubelet[3281]: E0527 02:49:08.255797 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9dsg5" podUID="62399f24-73bb-46f6-a534-e7c43ecd7271" May 27 02:49:08.332232 containerd[2012]: time="2025-05-27T02:49:08.332178635Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:08.334573 containerd[2012]: time="2025-05-27T02:49:08.334491251Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.0: active requests=0, bytes read=65748976" May 27 02:49:08.336810 containerd[2012]: time="2025-05-27T02:49:08.335954459Z" level=info msg="ImageCreate event name:\"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:08.342870 containerd[2012]: time="2025-05-27T02:49:08.340955543Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:08.348803 containerd[2012]: time="2025-05-27T02:49:08.348342191Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.0\" with image id \"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\", size \"67118217\" in 2.872111598s" May 27 02:49:08.348803 containerd[2012]: time="2025-05-27T02:49:08.348400043Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\" returns image reference \"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\"" May 27 02:49:08.358629 containerd[2012]: time="2025-05-27T02:49:08.357497975Z" level=info msg="CreateContainer within sandbox \"37e3d6b408b7f493b855091b4990651afc1e1fe6dbbe5bdb124128155a56dc8f\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 27 02:49:08.381084 containerd[2012]: time="2025-05-27T02:49:08.377229359Z" level=info msg="Container eaca45c06fc06eb8279afd816711ca7b6b0da021583abd25152cb8893889404c: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:08.386335 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1463956705.mount: Deactivated successfully. 
May 27 02:49:08.394701 containerd[2012]: time="2025-05-27T02:49:08.394646868Z" level=info msg="CreateContainer within sandbox \"37e3d6b408b7f493b855091b4990651afc1e1fe6dbbe5bdb124128155a56dc8f\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"eaca45c06fc06eb8279afd816711ca7b6b0da021583abd25152cb8893889404c\"" May 27 02:49:08.396807 containerd[2012]: time="2025-05-27T02:49:08.396061536Z" level=info msg="StartContainer for \"eaca45c06fc06eb8279afd816711ca7b6b0da021583abd25152cb8893889404c\"" May 27 02:49:08.399583 containerd[2012]: time="2025-05-27T02:49:08.399410208Z" level=info msg="connecting to shim eaca45c06fc06eb8279afd816711ca7b6b0da021583abd25152cb8893889404c" address="unix:///run/containerd/s/6e5fa9fcf01a1df88c226e0c1eb640b0088629019c6a5f33952c5fbf90da400b" protocol=ttrpc version=3 May 27 02:49:08.442078 systemd[1]: Started cri-containerd-eaca45c06fc06eb8279afd816711ca7b6b0da021583abd25152cb8893889404c.scope - libcontainer container eaca45c06fc06eb8279afd816711ca7b6b0da021583abd25152cb8893889404c. May 27 02:49:08.537837 containerd[2012]: time="2025-05-27T02:49:08.537587520Z" level=info msg="StartContainer for \"eaca45c06fc06eb8279afd816711ca7b6b0da021583abd25152cb8893889404c\" returns successfully" May 27 02:49:09.522060 systemd[1]: cri-containerd-eaca45c06fc06eb8279afd816711ca7b6b0da021583abd25152cb8893889404c.scope: Deactivated successfully. May 27 02:49:09.522634 systemd[1]: cri-containerd-eaca45c06fc06eb8279afd816711ca7b6b0da021583abd25152cb8893889404c.scope: Consumed 916ms CPU time, 184.5M memory peak, 165.5M written to disk. May 27 02:49:09.527186 containerd[2012]: time="2025-05-27T02:49:09.526448005Z" level=info msg="TaskExit event in podsandbox handler container_id:\"eaca45c06fc06eb8279afd816711ca7b6b0da021583abd25152cb8893889404c\" id:\"eaca45c06fc06eb8279afd816711ca7b6b0da021583abd25152cb8893889404c\" pid:4257 exited_at:{seconds:1748314149 nanos:524637733}" May 27 02:49:09.527186 containerd[2012]: time="2025-05-27T02:49:09.526614457Z" level=info msg="received exit event container_id:\"eaca45c06fc06eb8279afd816711ca7b6b0da021583abd25152cb8893889404c\" id:\"eaca45c06fc06eb8279afd816711ca7b6b0da021583abd25152cb8893889404c\" pid:4257 exited_at:{seconds:1748314149 nanos:524637733}" May 27 02:49:09.591214 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-eaca45c06fc06eb8279afd816711ca7b6b0da021583abd25152cb8893889404c-rootfs.mount: Deactivated successfully. 
May 27 02:49:09.611088 kubelet[3281]: I0527 02:49:09.611019 3281 kubelet_node_status.go:501] "Fast updating node status as it just became ready" May 27 02:49:09.693086 kubelet[3281]: I0527 02:49:09.693001 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnvhj\" (UniqueName: \"kubernetes.io/projected/0f05cece-e259-4298-8a92-3ea1b39f2107-kube-api-access-gnvhj\") pod \"coredns-668d6bf9bc-98wcf\" (UID: \"0f05cece-e259-4298-8a92-3ea1b39f2107\") " pod="kube-system/coredns-668d6bf9bc-98wcf" May 27 02:49:09.693337 kubelet[3281]: I0527 02:49:09.693092 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24c42b60-4e50-4df2-b057-56dcd0d2d906-tigera-ca-bundle\") pod \"calico-kube-controllers-6f5d8dffc5-jknpx\" (UID: \"24c42b60-4e50-4df2-b057-56dcd0d2d906\") " pod="calico-system/calico-kube-controllers-6f5d8dffc5-jknpx" May 27 02:49:09.693337 kubelet[3281]: I0527 02:49:09.693157 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf8t2\" (UniqueName: \"kubernetes.io/projected/24c42b60-4e50-4df2-b057-56dcd0d2d906-kube-api-access-gf8t2\") pod \"calico-kube-controllers-6f5d8dffc5-jknpx\" (UID: \"24c42b60-4e50-4df2-b057-56dcd0d2d906\") " pod="calico-system/calico-kube-controllers-6f5d8dffc5-jknpx" May 27 02:49:09.693337 kubelet[3281]: I0527 02:49:09.693207 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f05cece-e259-4298-8a92-3ea1b39f2107-config-volume\") pod \"coredns-668d6bf9bc-98wcf\" (UID: \"0f05cece-e259-4298-8a92-3ea1b39f2107\") " pod="kube-system/coredns-668d6bf9bc-98wcf" May 27 02:49:09.700843 systemd[1]: Created slice kubepods-burstable-pod0f05cece_e259_4298_8a92_3ea1b39f2107.slice - libcontainer container kubepods-burstable-pod0f05cece_e259_4298_8a92_3ea1b39f2107.slice. May 27 02:49:09.733129 systemd[1]: Created slice kubepods-besteffort-pod24c42b60_4e50_4df2_b057_56dcd0d2d906.slice - libcontainer container kubepods-besteffort-pod24c42b60_4e50_4df2_b057_56dcd0d2d906.slice. May 27 02:49:09.761078 systemd[1]: Created slice kubepods-besteffort-pod62b7cf93_c2e5_4b36_8a42_519d840f57ac.slice - libcontainer container kubepods-besteffort-pod62b7cf93_c2e5_4b36_8a42_519d840f57ac.slice. May 27 02:49:09.788051 systemd[1]: Created slice kubepods-besteffort-pod01f59b83_1b36_4034_ac04_b88b89b8506a.slice - libcontainer container kubepods-besteffort-pod01f59b83_1b36_4034_ac04_b88b89b8506a.slice. 
May 27 02:49:09.793631 kubelet[3281]: I0527 02:49:09.793572 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f5c6704c-4387-4f61-8657-91f8646bee6a-calico-apiserver-certs\") pod \"calico-apiserver-65948d5f-rqgjr\" (UID: \"f5c6704c-4387-4f61-8657-91f8646bee6a\") " pod="calico-apiserver/calico-apiserver-65948d5f-rqgjr" May 27 02:49:09.793960 kubelet[3281]: I0527 02:49:09.793666 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5drb\" (UniqueName: \"kubernetes.io/projected/01f59b83-1b36-4034-ac04-b88b89b8506a-kube-api-access-c5drb\") pod \"goldmane-78d55f7ddc-hkfqv\" (UID: \"01f59b83-1b36-4034-ac04-b88b89b8506a\") " pod="calico-system/goldmane-78d55f7ddc-hkfqv" May 27 02:49:09.793960 kubelet[3281]: I0527 02:49:09.793780 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/058da1c6-0f5e-41a7-b744-7dec794621a0-config-volume\") pod \"coredns-668d6bf9bc-jxm2b\" (UID: \"058da1c6-0f5e-41a7-b744-7dec794621a0\") " pod="kube-system/coredns-668d6bf9bc-jxm2b" May 27 02:49:09.793960 kubelet[3281]: I0527 02:49:09.793911 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj747\" (UniqueName: \"kubernetes.io/projected/832199df-a524-40b6-bb28-904e7566a390-kube-api-access-zj747\") pod \"calico-apiserver-65948d5f-rkrb7\" (UID: \"832199df-a524-40b6-bb28-904e7566a390\") " pod="calico-apiserver/calico-apiserver-65948d5f-rkrb7" May 27 02:49:09.794245 kubelet[3281]: I0527 02:49:09.793988 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/62b7cf93-c2e5-4b36-8a42-519d840f57ac-whisker-backend-key-pair\") pod \"whisker-7f79bdf7f6-84ngc\" (UID: \"62b7cf93-c2e5-4b36-8a42-519d840f57ac\") " pod="calico-system/whisker-7f79bdf7f6-84ngc" May 27 02:49:09.794245 kubelet[3281]: I0527 02:49:09.794030 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27dnf\" (UniqueName: \"kubernetes.io/projected/058da1c6-0f5e-41a7-b744-7dec794621a0-kube-api-access-27dnf\") pod \"coredns-668d6bf9bc-jxm2b\" (UID: \"058da1c6-0f5e-41a7-b744-7dec794621a0\") " pod="kube-system/coredns-668d6bf9bc-jxm2b" May 27 02:49:09.794245 kubelet[3281]: I0527 02:49:09.794068 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/832199df-a524-40b6-bb28-904e7566a390-calico-apiserver-certs\") pod \"calico-apiserver-65948d5f-rkrb7\" (UID: \"832199df-a524-40b6-bb28-904e7566a390\") " pod="calico-apiserver/calico-apiserver-65948d5f-rkrb7" May 27 02:49:09.794245 kubelet[3281]: I0527 02:49:09.794197 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62b7cf93-c2e5-4b36-8a42-519d840f57ac-whisker-ca-bundle\") pod \"whisker-7f79bdf7f6-84ngc\" (UID: \"62b7cf93-c2e5-4b36-8a42-519d840f57ac\") " pod="calico-system/whisker-7f79bdf7f6-84ngc" May 27 02:49:09.794245 kubelet[3281]: I0527 02:49:09.794238 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01f59b83-1b36-4034-ac04-b88b89b8506a-config\") pod \"goldmane-78d55f7ddc-hkfqv\" (UID: \"01f59b83-1b36-4034-ac04-b88b89b8506a\") " pod="calico-system/goldmane-78d55f7ddc-hkfqv" May 27 02:49:09.796625 kubelet[3281]: I0527 02:49:09.795155 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/01f59b83-1b36-4034-ac04-b88b89b8506a-goldmane-key-pair\") pod \"goldmane-78d55f7ddc-hkfqv\" (UID: \"01f59b83-1b36-4034-ac04-b88b89b8506a\") " pod="calico-system/goldmane-78d55f7ddc-hkfqv" May 27 02:49:09.796625 kubelet[3281]: I0527 02:49:09.795220 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npv7t\" (UniqueName: \"kubernetes.io/projected/f5c6704c-4387-4f61-8657-91f8646bee6a-kube-api-access-npv7t\") pod \"calico-apiserver-65948d5f-rqgjr\" (UID: \"f5c6704c-4387-4f61-8657-91f8646bee6a\") " pod="calico-apiserver/calico-apiserver-65948d5f-rqgjr" May 27 02:49:09.796625 kubelet[3281]: I0527 02:49:09.795307 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85zz2\" (UniqueName: \"kubernetes.io/projected/62b7cf93-c2e5-4b36-8a42-519d840f57ac-kube-api-access-85zz2\") pod \"whisker-7f79bdf7f6-84ngc\" (UID: \"62b7cf93-c2e5-4b36-8a42-519d840f57ac\") " pod="calico-system/whisker-7f79bdf7f6-84ngc" May 27 02:49:09.796625 kubelet[3281]: I0527 02:49:09.795347 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01f59b83-1b36-4034-ac04-b88b89b8506a-goldmane-ca-bundle\") pod \"goldmane-78d55f7ddc-hkfqv\" (UID: \"01f59b83-1b36-4034-ac04-b88b89b8506a\") " pod="calico-system/goldmane-78d55f7ddc-hkfqv" May 27 02:49:09.819180 systemd[1]: Created slice kubepods-burstable-pod058da1c6_0f5e_41a7_b744_7dec794621a0.slice - libcontainer container kubepods-burstable-pod058da1c6_0f5e_41a7_b744_7dec794621a0.slice. May 27 02:49:09.886226 systemd[1]: Created slice kubepods-besteffort-pod832199df_a524_40b6_bb28_904e7566a390.slice - libcontainer container kubepods-besteffort-pod832199df_a524_40b6_bb28_904e7566a390.slice. May 27 02:49:09.909878 systemd[1]: Created slice kubepods-besteffort-podf5c6704c_4387_4f61_8657_91f8646bee6a.slice - libcontainer container kubepods-besteffort-podf5c6704c_4387_4f61_8657_91f8646bee6a.slice. 
May 27 02:49:10.024182 containerd[2012]: time="2025-05-27T02:49:10.024095556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-98wcf,Uid:0f05cece-e259-4298-8a92-3ea1b39f2107,Namespace:kube-system,Attempt:0,}" May 27 02:49:10.050024 containerd[2012]: time="2025-05-27T02:49:10.049888272Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6f5d8dffc5-jknpx,Uid:24c42b60-4e50-4df2-b057-56dcd0d2d906,Namespace:calico-system,Attempt:0,}" May 27 02:49:10.072247 containerd[2012]: time="2025-05-27T02:49:10.072105732Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7f79bdf7f6-84ngc,Uid:62b7cf93-c2e5-4b36-8a42-519d840f57ac,Namespace:calico-system,Attempt:0,}" May 27 02:49:10.110720 containerd[2012]: time="2025-05-27T02:49:10.110640156Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-hkfqv,Uid:01f59b83-1b36-4034-ac04-b88b89b8506a,Namespace:calico-system,Attempt:0,}" May 27 02:49:10.139366 containerd[2012]: time="2025-05-27T02:49:10.138971676Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-jxm2b,Uid:058da1c6-0f5e-41a7-b744-7dec794621a0,Namespace:kube-system,Attempt:0,}" May 27 02:49:10.198634 containerd[2012]: time="2025-05-27T02:49:10.198533736Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-65948d5f-rkrb7,Uid:832199df-a524-40b6-bb28-904e7566a390,Namespace:calico-apiserver,Attempt:0,}" May 27 02:49:10.258402 containerd[2012]: time="2025-05-27T02:49:10.258319741Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-65948d5f-rqgjr,Uid:f5c6704c-4387-4f61-8657-91f8646bee6a,Namespace:calico-apiserver,Attempt:0,}" May 27 02:49:10.272365 systemd[1]: Created slice kubepods-besteffort-pod62399f24_73bb_46f6_a534_e7c43ecd7271.slice - libcontainer container kubepods-besteffort-pod62399f24_73bb_46f6_a534_e7c43ecd7271.slice. 
May 27 02:49:10.280681 containerd[2012]: time="2025-05-27T02:49:10.280611133Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9dsg5,Uid:62399f24-73bb-46f6-a534-e7c43ecd7271,Namespace:calico-system,Attempt:0,}" May 27 02:49:10.554554 containerd[2012]: time="2025-05-27T02:49:10.553930946Z" level=error msg="Failed to destroy network for sandbox \"ec519177bf47511bb0302bdec4f41ebb2cd25a567e70df1002079340ab1384ed\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.565827 containerd[2012]: time="2025-05-27T02:49:10.564211394Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\"" May 27 02:49:10.583226 containerd[2012]: time="2025-05-27T02:49:10.583141226Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7f79bdf7f6-84ngc,Uid:62b7cf93-c2e5-4b36-8a42-519d840f57ac,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec519177bf47511bb0302bdec4f41ebb2cd25a567e70df1002079340ab1384ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.593392 kubelet[3281]: E0527 02:49:10.593313 3281 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec519177bf47511bb0302bdec4f41ebb2cd25a567e70df1002079340ab1384ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.593552 kubelet[3281]: E0527 02:49:10.593416 3281 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec519177bf47511bb0302bdec4f41ebb2cd25a567e70df1002079340ab1384ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7f79bdf7f6-84ngc" May 27 02:49:10.593552 kubelet[3281]: E0527 02:49:10.593451 3281 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec519177bf47511bb0302bdec4f41ebb2cd25a567e70df1002079340ab1384ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7f79bdf7f6-84ngc" May 27 02:49:10.593552 kubelet[3281]: E0527 02:49:10.593516 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7f79bdf7f6-84ngc_calico-system(62b7cf93-c2e5-4b36-8a42-519d840f57ac)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7f79bdf7f6-84ngc_calico-system(62b7cf93-c2e5-4b36-8a42-519d840f57ac)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ec519177bf47511bb0302bdec4f41ebb2cd25a567e70df1002079340ab1384ed\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7f79bdf7f6-84ngc" 
podUID="62b7cf93-c2e5-4b36-8a42-519d840f57ac" May 27 02:49:10.628903 containerd[2012]: time="2025-05-27T02:49:10.628813371Z" level=error msg="Failed to destroy network for sandbox \"0d558fc612eb77f6d0aead233b4ebbed8d11b7fe077f0a6a78cbec178cbdbe5e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.637909 containerd[2012]: time="2025-05-27T02:49:10.635196147Z" level=error msg="Failed to destroy network for sandbox \"313667ebd19040c1b850d97f1ef4fb63a177b51c8a18a5e79986abda292a789b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.641380 systemd[1]: run-netns-cni\x2d26793c3f\x2d2791\x2d6050\x2d998f\x2d9104e05f6927.mount: Deactivated successfully. May 27 02:49:10.641574 systemd[1]: run-netns-cni\x2d3bd46c74\x2d9ec0\x2d9431\x2d99e2\x2d883770aff9ac.mount: Deactivated successfully. May 27 02:49:10.644861 containerd[2012]: time="2025-05-27T02:49:10.643905819Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-hkfqv,Uid:01f59b83-1b36-4034-ac04-b88b89b8506a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d558fc612eb77f6d0aead233b4ebbed8d11b7fe077f0a6a78cbec178cbdbe5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.647855 kubelet[3281]: E0527 02:49:10.646177 3281 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d558fc612eb77f6d0aead233b4ebbed8d11b7fe077f0a6a78cbec178cbdbe5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.647855 kubelet[3281]: E0527 02:49:10.646253 3281 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d558fc612eb77f6d0aead233b4ebbed8d11b7fe077f0a6a78cbec178cbdbe5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-hkfqv" May 27 02:49:10.647855 kubelet[3281]: E0527 02:49:10.646286 3281 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d558fc612eb77f6d0aead233b4ebbed8d11b7fe077f0a6a78cbec178cbdbe5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-hkfqv" May 27 02:49:10.648533 kubelet[3281]: E0527 02:49:10.646366 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-78d55f7ddc-hkfqv_calico-system(01f59b83-1b36-4034-ac04-b88b89b8506a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-78d55f7ddc-hkfqv_calico-system(01f59b83-1b36-4034-ac04-b88b89b8506a)\\\": rpc error: code = Unknown desc = failed to setup network for 
sandbox \\\"0d558fc612eb77f6d0aead233b4ebbed8d11b7fe077f0a6a78cbec178cbdbe5e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-78d55f7ddc-hkfqv" podUID="01f59b83-1b36-4034-ac04-b88b89b8506a" May 27 02:49:10.649087 containerd[2012]: time="2025-05-27T02:49:10.648855351Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-98wcf,Uid:0f05cece-e259-4298-8a92-3ea1b39f2107,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"313667ebd19040c1b850d97f1ef4fb63a177b51c8a18a5e79986abda292a789b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.652683 kubelet[3281]: E0527 02:49:10.652205 3281 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"313667ebd19040c1b850d97f1ef4fb63a177b51c8a18a5e79986abda292a789b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.652683 kubelet[3281]: E0527 02:49:10.652307 3281 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"313667ebd19040c1b850d97f1ef4fb63a177b51c8a18a5e79986abda292a789b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-98wcf" May 27 02:49:10.652683 kubelet[3281]: E0527 02:49:10.652339 3281 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"313667ebd19040c1b850d97f1ef4fb63a177b51c8a18a5e79986abda292a789b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-98wcf" May 27 02:49:10.652998 kubelet[3281]: E0527 02:49:10.652398 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-98wcf_kube-system(0f05cece-e259-4298-8a92-3ea1b39f2107)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-98wcf_kube-system(0f05cece-e259-4298-8a92-3ea1b39f2107)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"313667ebd19040c1b850d97f1ef4fb63a177b51c8a18a5e79986abda292a789b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-98wcf" podUID="0f05cece-e259-4298-8a92-3ea1b39f2107" May 27 02:49:10.661468 containerd[2012]: time="2025-05-27T02:49:10.661387107Z" level=error msg="Failed to destroy network for sandbox \"f100cd412b6dd0fd0cfdb690a883f5a2ef3fad31ceb11b4bbaa23ac45e8e08e9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 
02:49:10.667707 systemd[1]: run-netns-cni\x2d00b527be\x2d0b1f\x2d26d4\x2df1cf\x2d37a2bc9ac935.mount: Deactivated successfully. May 27 02:49:10.671612 containerd[2012]: time="2025-05-27T02:49:10.670714887Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-jxm2b,Uid:058da1c6-0f5e-41a7-b744-7dec794621a0,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f100cd412b6dd0fd0cfdb690a883f5a2ef3fad31ceb11b4bbaa23ac45e8e08e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.672886 kubelet[3281]: E0527 02:49:10.672568 3281 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f100cd412b6dd0fd0cfdb690a883f5a2ef3fad31ceb11b4bbaa23ac45e8e08e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.673448 kubelet[3281]: E0527 02:49:10.673002 3281 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f100cd412b6dd0fd0cfdb690a883f5a2ef3fad31ceb11b4bbaa23ac45e8e08e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-jxm2b" May 27 02:49:10.673448 kubelet[3281]: E0527 02:49:10.673076 3281 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f100cd412b6dd0fd0cfdb690a883f5a2ef3fad31ceb11b4bbaa23ac45e8e08e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-jxm2b" May 27 02:49:10.673448 kubelet[3281]: E0527 02:49:10.673377 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-jxm2b_kube-system(058da1c6-0f5e-41a7-b744-7dec794621a0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-jxm2b_kube-system(058da1c6-0f5e-41a7-b744-7dec794621a0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f100cd412b6dd0fd0cfdb690a883f5a2ef3fad31ceb11b4bbaa23ac45e8e08e9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-jxm2b" podUID="058da1c6-0f5e-41a7-b744-7dec794621a0" May 27 02:49:10.690812 containerd[2012]: time="2025-05-27T02:49:10.690675783Z" level=error msg="Failed to destroy network for sandbox \"3bb59066c0cee5fd7e0a89701eb9cc1c099e15d101f8026a2b61c87b86c924f2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.695827 containerd[2012]: time="2025-05-27T02:49:10.695255799Z" level=error msg="Failed to destroy network for sandbox \"bebca548edfb6eb6613bb0c30a9497641fe8d86fd06e122462e5b7ba4c888af5\"" error="plugin type=\"calico\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.698358 systemd[1]: run-netns-cni\x2dc2ce62b3\x2d29bc\x2d272d\x2d3bad\x2d68052b61163e.mount: Deactivated successfully. May 27 02:49:10.706077 containerd[2012]: time="2025-05-27T02:49:10.703746423Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6f5d8dffc5-jknpx,Uid:24c42b60-4e50-4df2-b057-56dcd0d2d906,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3bb59066c0cee5fd7e0a89701eb9cc1c099e15d101f8026a2b61c87b86c924f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.706630 kubelet[3281]: E0527 02:49:10.706497 3281 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3bb59066c0cee5fd7e0a89701eb9cc1c099e15d101f8026a2b61c87b86c924f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.706630 kubelet[3281]: E0527 02:49:10.706581 3281 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3bb59066c0cee5fd7e0a89701eb9cc1c099e15d101f8026a2b61c87b86c924f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6f5d8dffc5-jknpx" May 27 02:49:10.706630 kubelet[3281]: E0527 02:49:10.706618 3281 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3bb59066c0cee5fd7e0a89701eb9cc1c099e15d101f8026a2b61c87b86c924f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6f5d8dffc5-jknpx" May 27 02:49:10.706891 kubelet[3281]: E0527 02:49:10.706687 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6f5d8dffc5-jknpx_calico-system(24c42b60-4e50-4df2-b057-56dcd0d2d906)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6f5d8dffc5-jknpx_calico-system(24c42b60-4e50-4df2-b057-56dcd0d2d906)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3bb59066c0cee5fd7e0a89701eb9cc1c099e15d101f8026a2b61c87b86c924f2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6f5d8dffc5-jknpx" podUID="24c42b60-4e50-4df2-b057-56dcd0d2d906" May 27 02:49:10.707476 systemd[1]: run-netns-cni\x2d7dba87cf\x2d9fc8\x2da88c\x2daa2a\x2dc68fcab62fd0.mount: Deactivated successfully. 
May 27 02:49:10.710839 containerd[2012]: time="2025-05-27T02:49:10.709242807Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-65948d5f-rkrb7,Uid:832199df-a524-40b6-bb28-904e7566a390,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bebca548edfb6eb6613bb0c30a9497641fe8d86fd06e122462e5b7ba4c888af5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.712097 kubelet[3281]: E0527 02:49:10.711825 3281 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bebca548edfb6eb6613bb0c30a9497641fe8d86fd06e122462e5b7ba4c888af5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.712097 kubelet[3281]: E0527 02:49:10.711914 3281 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bebca548edfb6eb6613bb0c30a9497641fe8d86fd06e122462e5b7ba4c888af5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-65948d5f-rkrb7" May 27 02:49:10.712097 kubelet[3281]: E0527 02:49:10.711958 3281 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bebca548edfb6eb6613bb0c30a9497641fe8d86fd06e122462e5b7ba4c888af5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-65948d5f-rkrb7" May 27 02:49:10.712337 kubelet[3281]: E0527 02:49:10.712043 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-65948d5f-rkrb7_calico-apiserver(832199df-a524-40b6-bb28-904e7566a390)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-65948d5f-rkrb7_calico-apiserver(832199df-a524-40b6-bb28-904e7566a390)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bebca548edfb6eb6613bb0c30a9497641fe8d86fd06e122462e5b7ba4c888af5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-65948d5f-rkrb7" podUID="832199df-a524-40b6-bb28-904e7566a390" May 27 02:49:10.738899 containerd[2012]: time="2025-05-27T02:49:10.738819111Z" level=error msg="Failed to destroy network for sandbox \"a9dd763088ddbbc6c2c8f4dd988ff8ec2db39e446d4bb1e8e80ff13b1c903152\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.741897 containerd[2012]: time="2025-05-27T02:49:10.741814635Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9dsg5,Uid:62399f24-73bb-46f6-a534-e7c43ecd7271,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"a9dd763088ddbbc6c2c8f4dd988ff8ec2db39e446d4bb1e8e80ff13b1c903152\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.742259 kubelet[3281]: E0527 02:49:10.742121 3281 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9dd763088ddbbc6c2c8f4dd988ff8ec2db39e446d4bb1e8e80ff13b1c903152\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.742259 kubelet[3281]: E0527 02:49:10.742195 3281 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9dd763088ddbbc6c2c8f4dd988ff8ec2db39e446d4bb1e8e80ff13b1c903152\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9dsg5" May 27 02:49:10.742380 kubelet[3281]: E0527 02:49:10.742233 3281 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9dd763088ddbbc6c2c8f4dd988ff8ec2db39e446d4bb1e8e80ff13b1c903152\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9dsg5" May 27 02:49:10.742747 kubelet[3281]: E0527 02:49:10.742391 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9dsg5_calico-system(62399f24-73bb-46f6-a534-e7c43ecd7271)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9dsg5_calico-system(62399f24-73bb-46f6-a534-e7c43ecd7271)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a9dd763088ddbbc6c2c8f4dd988ff8ec2db39e446d4bb1e8e80ff13b1c903152\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9dsg5" podUID="62399f24-73bb-46f6-a534-e7c43ecd7271" May 27 02:49:10.744826 containerd[2012]: time="2025-05-27T02:49:10.744162255Z" level=error msg="Failed to destroy network for sandbox \"49b72d2c2402f360ebda9d3ce8a61ed2eadab4c256789e6f9e43ea99820a41c8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.747052 containerd[2012]: time="2025-05-27T02:49:10.746934303Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-65948d5f-rqgjr,Uid:f5c6704c-4387-4f61-8657-91f8646bee6a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"49b72d2c2402f360ebda9d3ce8a61ed2eadab4c256789e6f9e43ea99820a41c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.747994 kubelet[3281]: E0527 02:49:10.747921 3281 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49b72d2c2402f360ebda9d3ce8a61ed2eadab4c256789e6f9e43ea99820a41c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:49:10.748125 kubelet[3281]: E0527 02:49:10.748007 3281 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49b72d2c2402f360ebda9d3ce8a61ed2eadab4c256789e6f9e43ea99820a41c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-65948d5f-rqgjr" May 27 02:49:10.748125 kubelet[3281]: E0527 02:49:10.748051 3281 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49b72d2c2402f360ebda9d3ce8a61ed2eadab4c256789e6f9e43ea99820a41c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-65948d5f-rqgjr" May 27 02:49:10.748241 kubelet[3281]: E0527 02:49:10.748112 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-65948d5f-rqgjr_calico-apiserver(f5c6704c-4387-4f61-8657-91f8646bee6a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-65948d5f-rqgjr_calico-apiserver(f5c6704c-4387-4f61-8657-91f8646bee6a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"49b72d2c2402f360ebda9d3ce8a61ed2eadab4c256789e6f9e43ea99820a41c8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-65948d5f-rqgjr" podUID="f5c6704c-4387-4f61-8657-91f8646bee6a" May 27 02:49:11.590288 systemd[1]: run-netns-cni\x2da006c779\x2deb61\x2d7141\x2d1122\x2deee13c42548d.mount: Deactivated successfully. May 27 02:49:11.590644 systemd[1]: run-netns-cni\x2de8d5baf4\x2d6b93\x2d0fed\x2de543\x2d59f684c2539f.mount: Deactivated successfully. May 27 02:49:16.602150 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2612727086.mount: Deactivated successfully. 
May 27 02:49:16.659823 containerd[2012]: time="2025-05-27T02:49:16.659448849Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:16.661862 containerd[2012]: time="2025-05-27T02:49:16.661791501Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.0: active requests=0, bytes read=150465379" May 27 02:49:16.664164 containerd[2012]: time="2025-05-27T02:49:16.664054953Z" level=info msg="ImageCreate event name:\"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:16.668908 containerd[2012]: time="2025-05-27T02:49:16.668810097Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:16.670245 containerd[2012]: time="2025-05-27T02:49:16.669981393Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.0\" with image id \"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\", size \"150465241\" in 6.105666391s" May 27 02:49:16.670245 containerd[2012]: time="2025-05-27T02:49:16.670036161Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\" returns image reference \"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\"" May 27 02:49:16.705000 containerd[2012]: time="2025-05-27T02:49:16.704118249Z" level=info msg="CreateContainer within sandbox \"37e3d6b408b7f493b855091b4990651afc1e1fe6dbbe5bdb124128155a56dc8f\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 27 02:49:16.723372 containerd[2012]: time="2025-05-27T02:49:16.723298101Z" level=info msg="Container f8837e98ab5de6363ffc6e5280f6c56c8d6cb6734edc7b51ac534e6206b8fc0e: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:16.733715 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2709072755.mount: Deactivated successfully. May 27 02:49:16.764912 containerd[2012]: time="2025-05-27T02:49:16.764848809Z" level=info msg="CreateContainer within sandbox \"37e3d6b408b7f493b855091b4990651afc1e1fe6dbbe5bdb124128155a56dc8f\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"f8837e98ab5de6363ffc6e5280f6c56c8d6cb6734edc7b51ac534e6206b8fc0e\"" May 27 02:49:16.770386 containerd[2012]: time="2025-05-27T02:49:16.770310177Z" level=info msg="StartContainer for \"f8837e98ab5de6363ffc6e5280f6c56c8d6cb6734edc7b51ac534e6206b8fc0e\"" May 27 02:49:16.776379 containerd[2012]: time="2025-05-27T02:49:16.776174073Z" level=info msg="connecting to shim f8837e98ab5de6363ffc6e5280f6c56c8d6cb6734edc7b51ac534e6206b8fc0e" address="unix:///run/containerd/s/6e5fa9fcf01a1df88c226e0c1eb640b0088629019c6a5f33952c5fbf90da400b" protocol=ttrpc version=3 May 27 02:49:16.811072 systemd[1]: Started cri-containerd-f8837e98ab5de6363ffc6e5280f6c56c8d6cb6734edc7b51ac534e6206b8fc0e.scope - libcontainer container f8837e98ab5de6363ffc6e5280f6c56c8d6cb6734edc7b51ac534e6206b8fc0e. May 27 02:49:16.898577 containerd[2012]: time="2025-05-27T02:49:16.898342030Z" level=info msg="StartContainer for \"f8837e98ab5de6363ffc6e5280f6c56c8d6cb6734edc7b51ac534e6206b8fc0e\" returns successfully" May 27 02:49:17.043832 kernel: wireguard: WireGuard 1.0.0 loaded. 
See www.wireguard.com for information. May 27 02:49:17.043973 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. May 27 02:49:17.260146 kubelet[3281]: I0527 02:49:17.260083 3281 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85zz2\" (UniqueName: \"kubernetes.io/projected/62b7cf93-c2e5-4b36-8a42-519d840f57ac-kube-api-access-85zz2\") pod \"62b7cf93-c2e5-4b36-8a42-519d840f57ac\" (UID: \"62b7cf93-c2e5-4b36-8a42-519d840f57ac\") " May 27 02:49:17.260724 kubelet[3281]: I0527 02:49:17.260157 3281 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62b7cf93-c2e5-4b36-8a42-519d840f57ac-whisker-ca-bundle\") pod \"62b7cf93-c2e5-4b36-8a42-519d840f57ac\" (UID: \"62b7cf93-c2e5-4b36-8a42-519d840f57ac\") " May 27 02:49:17.260724 kubelet[3281]: I0527 02:49:17.260217 3281 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/62b7cf93-c2e5-4b36-8a42-519d840f57ac-whisker-backend-key-pair\") pod \"62b7cf93-c2e5-4b36-8a42-519d840f57ac\" (UID: \"62b7cf93-c2e5-4b36-8a42-519d840f57ac\") " May 27 02:49:17.266223 kubelet[3281]: I0527 02:49:17.265701 3281 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b7cf93-c2e5-4b36-8a42-519d840f57ac-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "62b7cf93-c2e5-4b36-8a42-519d840f57ac" (UID: "62b7cf93-c2e5-4b36-8a42-519d840f57ac"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 27 02:49:17.281890 kubelet[3281]: I0527 02:49:17.281790 3281 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62b7cf93-c2e5-4b36-8a42-519d840f57ac-kube-api-access-85zz2" (OuterVolumeSpecName: "kube-api-access-85zz2") pod "62b7cf93-c2e5-4b36-8a42-519d840f57ac" (UID: "62b7cf93-c2e5-4b36-8a42-519d840f57ac"). InnerVolumeSpecName "kube-api-access-85zz2". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 27 02:49:17.283660 kubelet[3281]: I0527 02:49:17.283589 3281 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62b7cf93-c2e5-4b36-8a42-519d840f57ac-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "62b7cf93-c2e5-4b36-8a42-519d840f57ac" (UID: "62b7cf93-c2e5-4b36-8a42-519d840f57ac"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" May 27 02:49:17.360744 kubelet[3281]: I0527 02:49:17.360631 3281 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-85zz2\" (UniqueName: \"kubernetes.io/projected/62b7cf93-c2e5-4b36-8a42-519d840f57ac-kube-api-access-85zz2\") on node \"ip-172-31-28-205\" DevicePath \"\"" May 27 02:49:17.360744 kubelet[3281]: I0527 02:49:17.360684 3281 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62b7cf93-c2e5-4b36-8a42-519d840f57ac-whisker-ca-bundle\") on node \"ip-172-31-28-205\" DevicePath \"\"" May 27 02:49:17.360744 kubelet[3281]: I0527 02:49:17.360707 3281 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/62b7cf93-c2e5-4b36-8a42-519d840f57ac-whisker-backend-key-pair\") on node \"ip-172-31-28-205\" DevicePath \"\"" May 27 02:49:17.615508 systemd[1]: var-lib-kubelet-pods-62b7cf93\x2dc2e5\x2d4b36\x2d8a42\x2d519d840f57ac-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d85zz2.mount: Deactivated successfully. May 27 02:49:17.615715 systemd[1]: var-lib-kubelet-pods-62b7cf93\x2dc2e5\x2d4b36\x2d8a42\x2d519d840f57ac-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. May 27 02:49:17.623933 kubelet[3281]: I0527 02:49:17.623846 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-pfjmw" podStartSLOduration=1.786252054 podStartE2EDuration="17.623751681s" podCreationTimestamp="2025-05-27 02:49:00 +0000 UTC" firstStartedPulling="2025-05-27 02:49:00.833979042 +0000 UTC m=+25.898851306" lastFinishedPulling="2025-05-27 02:49:16.671478669 +0000 UTC m=+41.736350933" observedRunningTime="2025-05-27 02:49:17.619987101 +0000 UTC m=+42.684859377" watchObservedRunningTime="2025-05-27 02:49:17.623751681 +0000 UTC m=+42.688624053" May 27 02:49:17.632507 systemd[1]: Removed slice kubepods-besteffort-pod62b7cf93_c2e5_4b36_8a42_519d840f57ac.slice - libcontainer container kubepods-besteffort-pod62b7cf93_c2e5_4b36_8a42_519d840f57ac.slice. May 27 02:49:17.752963 systemd[1]: Created slice kubepods-besteffort-pode1a2f19c_cbfe_493b_984d_c722792ee820.slice - libcontainer container kubepods-besteffort-pode1a2f19c_cbfe_493b_984d_c722792ee820.slice. 
May 27 02:49:17.764755 kubelet[3281]: I0527 02:49:17.764635 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1a2f19c-cbfe-493b-984d-c722792ee820-whisker-ca-bundle\") pod \"whisker-6888687c66-w5ggn\" (UID: \"e1a2f19c-cbfe-493b-984d-c722792ee820\") " pod="calico-system/whisker-6888687c66-w5ggn" May 27 02:49:17.764755 kubelet[3281]: I0527 02:49:17.764715 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v5h6\" (UniqueName: \"kubernetes.io/projected/e1a2f19c-cbfe-493b-984d-c722792ee820-kube-api-access-4v5h6\") pod \"whisker-6888687c66-w5ggn\" (UID: \"e1a2f19c-cbfe-493b-984d-c722792ee820\") " pod="calico-system/whisker-6888687c66-w5ggn" May 27 02:49:17.764755 kubelet[3281]: I0527 02:49:17.764758 3281 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e1a2f19c-cbfe-493b-984d-c722792ee820-whisker-backend-key-pair\") pod \"whisker-6888687c66-w5ggn\" (UID: \"e1a2f19c-cbfe-493b-984d-c722792ee820\") " pod="calico-system/whisker-6888687c66-w5ggn" May 27 02:49:18.063143 containerd[2012]: time="2025-05-27T02:49:18.063054152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6888687c66-w5ggn,Uid:e1a2f19c-cbfe-493b-984d-c722792ee820,Namespace:calico-system,Attempt:0,}" May 27 02:49:18.324449 (udev-worker)[4550]: Network interface NamePolicy= disabled on kernel command line. May 27 02:49:18.328319 systemd-networkd[1823]: calibf10e1075d1: Link UP May 27 02:49:18.329689 systemd-networkd[1823]: calibf10e1075d1: Gained carrier May 27 02:49:18.363715 containerd[2012]: 2025-05-27 02:49:18.109 [INFO][4578] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 02:49:18.363715 containerd[2012]: 2025-05-27 02:49:18.190 [INFO][4578] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--28--205-k8s-whisker--6888687c66--w5ggn-eth0 whisker-6888687c66- calico-system e1a2f19c-cbfe-493b-984d-c722792ee820 876 0 2025-05-27 02:49:17 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6888687c66 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ip-172-31-28-205 whisker-6888687c66-w5ggn eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calibf10e1075d1 [] [] }} ContainerID="cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" Namespace="calico-system" Pod="whisker-6888687c66-w5ggn" WorkloadEndpoint="ip--172--31--28--205-k8s-whisker--6888687c66--w5ggn-" May 27 02:49:18.363715 containerd[2012]: 2025-05-27 02:49:18.190 [INFO][4578] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" Namespace="calico-system" Pod="whisker-6888687c66-w5ggn" WorkloadEndpoint="ip--172--31--28--205-k8s-whisker--6888687c66--w5ggn-eth0" May 27 02:49:18.363715 containerd[2012]: 2025-05-27 02:49:18.239 [INFO][4590] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" HandleID="k8s-pod-network.cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" Workload="ip--172--31--28--205-k8s-whisker--6888687c66--w5ggn-eth0" May 27 02:49:18.364116 
containerd[2012]: 2025-05-27 02:49:18.239 [INFO][4590] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" HandleID="k8s-pod-network.cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" Workload="ip--172--31--28--205-k8s-whisker--6888687c66--w5ggn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000231780), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-28-205", "pod":"whisker-6888687c66-w5ggn", "timestamp":"2025-05-27 02:49:18.239227352 +0000 UTC"}, Hostname:"ip-172-31-28-205", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 02:49:18.364116 containerd[2012]: 2025-05-27 02:49:18.239 [INFO][4590] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 02:49:18.364116 containerd[2012]: 2025-05-27 02:49:18.239 [INFO][4590] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 02:49:18.364116 containerd[2012]: 2025-05-27 02:49:18.239 [INFO][4590] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-28-205' May 27 02:49:18.364116 containerd[2012]: 2025-05-27 02:49:18.253 [INFO][4590] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" host="ip-172-31-28-205" May 27 02:49:18.364116 containerd[2012]: 2025-05-27 02:49:18.260 [INFO][4590] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-28-205" May 27 02:49:18.364116 containerd[2012]: 2025-05-27 02:49:18.267 [INFO][4590] ipam/ipam.go 511: Trying affinity for 192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:18.364116 containerd[2012]: 2025-05-27 02:49:18.274 [INFO][4590] ipam/ipam.go 158: Attempting to load block cidr=192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:18.364116 containerd[2012]: 2025-05-27 02:49:18.279 [INFO][4590] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:18.365822 containerd[2012]: 2025-05-27 02:49:18.280 [INFO][4590] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.22.64/26 handle="k8s-pod-network.cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" host="ip-172-31-28-205" May 27 02:49:18.365822 containerd[2012]: 2025-05-27 02:49:18.282 [INFO][4590] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3 May 27 02:49:18.365822 containerd[2012]: 2025-05-27 02:49:18.292 [INFO][4590] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.22.64/26 handle="k8s-pod-network.cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" host="ip-172-31-28-205" May 27 02:49:18.365822 containerd[2012]: 2025-05-27 02:49:18.303 [INFO][4590] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.22.65/26] block=192.168.22.64/26 handle="k8s-pod-network.cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" host="ip-172-31-28-205" May 27 02:49:18.365822 containerd[2012]: 2025-05-27 02:49:18.303 [INFO][4590] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.22.65/26] handle="k8s-pod-network.cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" host="ip-172-31-28-205" May 27 02:49:18.365822 containerd[2012]: 2025-05-27 02:49:18.303 [INFO][4590] 
ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 02:49:18.365822 containerd[2012]: 2025-05-27 02:49:18.303 [INFO][4590] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.22.65/26] IPv6=[] ContainerID="cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" HandleID="k8s-pod-network.cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" Workload="ip--172--31--28--205-k8s-whisker--6888687c66--w5ggn-eth0" May 27 02:49:18.368620 containerd[2012]: 2025-05-27 02:49:18.310 [INFO][4578] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" Namespace="calico-system" Pod="whisker-6888687c66-w5ggn" WorkloadEndpoint="ip--172--31--28--205-k8s-whisker--6888687c66--w5ggn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--205-k8s-whisker--6888687c66--w5ggn-eth0", GenerateName:"whisker-6888687c66-", Namespace:"calico-system", SelfLink:"", UID:"e1a2f19c-cbfe-493b-984d-c722792ee820", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 49, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6888687c66", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-205", ContainerID:"", Pod:"whisker-6888687c66-w5ggn", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.22.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calibf10e1075d1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:18.368620 containerd[2012]: 2025-05-27 02:49:18.311 [INFO][4578] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.22.65/32] ContainerID="cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" Namespace="calico-system" Pod="whisker-6888687c66-w5ggn" WorkloadEndpoint="ip--172--31--28--205-k8s-whisker--6888687c66--w5ggn-eth0" May 27 02:49:18.370510 containerd[2012]: 2025-05-27 02:49:18.311 [INFO][4578] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibf10e1075d1 ContainerID="cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" Namespace="calico-system" Pod="whisker-6888687c66-w5ggn" WorkloadEndpoint="ip--172--31--28--205-k8s-whisker--6888687c66--w5ggn-eth0" May 27 02:49:18.370510 containerd[2012]: 2025-05-27 02:49:18.329 [INFO][4578] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" Namespace="calico-system" Pod="whisker-6888687c66-w5ggn" WorkloadEndpoint="ip--172--31--28--205-k8s-whisker--6888687c66--w5ggn-eth0" May 27 02:49:18.370634 containerd[2012]: 2025-05-27 02:49:18.330 [INFO][4578] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" Namespace="calico-system" 
Pod="whisker-6888687c66-w5ggn" WorkloadEndpoint="ip--172--31--28--205-k8s-whisker--6888687c66--w5ggn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--205-k8s-whisker--6888687c66--w5ggn-eth0", GenerateName:"whisker-6888687c66-", Namespace:"calico-system", SelfLink:"", UID:"e1a2f19c-cbfe-493b-984d-c722792ee820", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 49, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6888687c66", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-205", ContainerID:"cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3", Pod:"whisker-6888687c66-w5ggn", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.22.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calibf10e1075d1", MAC:"d6:e4:b8:e7:31:37", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:18.370812 containerd[2012]: 2025-05-27 02:49:18.359 [INFO][4578] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" Namespace="calico-system" Pod="whisker-6888687c66-w5ggn" WorkloadEndpoint="ip--172--31--28--205-k8s-whisker--6888687c66--w5ggn-eth0" May 27 02:49:18.415920 containerd[2012]: time="2025-05-27T02:49:18.415846113Z" level=info msg="connecting to shim cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3" address="unix:///run/containerd/s/bb08842de5aa4818dc96ddbff1b0238503e6454eaecb6d8d9229b1e4dc105a34" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:18.454072 systemd[1]: Started cri-containerd-cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3.scope - libcontainer container cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3. 
May 27 02:49:18.530266 containerd[2012]: time="2025-05-27T02:49:18.530166742Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6888687c66-w5ggn,Uid:e1a2f19c-cbfe-493b-984d-c722792ee820,Namespace:calico-system,Attempt:0,} returns sandbox id \"cb90ceee2c9778495acb57ad4600d5de28cfe8b45baed7eb2c9b212306272bc3\"" May 27 02:49:18.534898 containerd[2012]: time="2025-05-27T02:49:18.533489386Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 02:49:18.711693 containerd[2012]: time="2025-05-27T02:49:18.710488043Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:18.713130 containerd[2012]: time="2025-05-27T02:49:18.713010527Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:18.713924 containerd[2012]: time="2025-05-27T02:49:18.713057375Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 02:49:18.714164 kubelet[3281]: E0527 02:49:18.714101 3281 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:49:18.714726 kubelet[3281]: E0527 02:49:18.714194 3281 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:49:18.723310 kubelet[3281]: E0527 02:49:18.723217 3281 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1327877524034c62a2710c5fbd6feb5f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4v5h6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6888687c66-w5ggn_calico-system(e1a2f19c-cbfe-493b-984d-c722792ee820): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:18.727251 containerd[2012]: time="2025-05-27T02:49:18.726917399Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 02:49:18.933696 containerd[2012]: time="2025-05-27T02:49:18.933636696Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:18.940055 containerd[2012]: time="2025-05-27T02:49:18.939932664Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:18.940459 containerd[2012]: time="2025-05-27T02:49:18.939990552Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 02:49:18.940722 kubelet[3281]: E0527 02:49:18.940655 3281 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 02:49:18.940889 kubelet[3281]: E0527 02:49:18.940732 3281 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 02:49:18.944470 kubelet[3281]: E0527 02:49:18.942161 3281 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4v5h6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6888687c66-w5ggn_calico-system(e1a2f19c-cbfe-493b-984d-c722792ee820): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:18.945098 kubelet[3281]: E0527 02:49:18.944983 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-6888687c66-w5ggn" podUID="e1a2f19c-cbfe-493b-984d-c722792ee820" May 27 02:49:19.028443 containerd[2012]: time="2025-05-27T02:49:19.028383200Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f8837e98ab5de6363ffc6e5280f6c56c8d6cb6734edc7b51ac534e6206b8fc0e\" id:\"636c29066421e6eded4d84d800d4544bec0046b4f35966541aa00215aadd53f2\" pid:4663 exit_status:1 exited_at:{seconds:1748314159 nanos:27018608}" May 27 02:49:19.265984 kubelet[3281]: I0527 02:49:19.265496 3281 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62b7cf93-c2e5-4b36-8a42-519d840f57ac" path="/var/lib/kubelet/pods/62b7cf93-c2e5-4b36-8a42-519d840f57ac/volumes" May 27 02:49:19.601155 kubelet[3281]: E0527 02:49:19.601063 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-6888687c66-w5ggn" podUID="e1a2f19c-cbfe-493b-984d-c722792ee820" May 27 02:49:19.711490 systemd-networkd[1823]: calibf10e1075d1: Gained IPv6LL May 27 02:49:19.799444 containerd[2012]: time="2025-05-27T02:49:19.799381056Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f8837e98ab5de6363ffc6e5280f6c56c8d6cb6734edc7b51ac534e6206b8fc0e\" id:\"f83d685e8a2ece5f25e17fc25e3bc30116603d80d84f7863992e521500c3f089\" pid:4812 exit_status:1 exited_at:{seconds:1748314159 nanos:798594060}" May 27 02:49:19.909324 systemd-networkd[1823]: vxlan.calico: Link UP May 27 02:49:19.909339 systemd-networkd[1823]: vxlan.calico: Gained carrier May 27 02:49:19.960984 (udev-worker)[4552]: Network interface NamePolicy= disabled on kernel command line. 
May 27 02:49:21.823644 systemd-networkd[1823]: vxlan.calico: Gained IPv6LL May 27 02:49:22.256539 containerd[2012]: time="2025-05-27T02:49:22.256049460Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-65948d5f-rqgjr,Uid:f5c6704c-4387-4f61-8657-91f8646bee6a,Namespace:calico-apiserver,Attempt:0,}" May 27 02:49:22.256539 containerd[2012]: time="2025-05-27T02:49:22.256104996Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9dsg5,Uid:62399f24-73bb-46f6-a534-e7c43ecd7271,Namespace:calico-system,Attempt:0,}" May 27 02:49:22.256539 containerd[2012]: time="2025-05-27T02:49:22.256292688Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-98wcf,Uid:0f05cece-e259-4298-8a92-3ea1b39f2107,Namespace:kube-system,Attempt:0,}" May 27 02:49:22.257328 containerd[2012]: time="2025-05-27T02:49:22.257243196Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-65948d5f-rkrb7,Uid:832199df-a524-40b6-bb28-904e7566a390,Namespace:calico-apiserver,Attempt:0,}" May 27 02:49:22.257590 containerd[2012]: time="2025-05-27T02:49:22.257542932Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6f5d8dffc5-jknpx,Uid:24c42b60-4e50-4df2-b057-56dcd0d2d906,Namespace:calico-system,Attempt:0,}" May 27 02:49:22.759086 (udev-worker)[4860]: Network interface NamePolicy= disabled on kernel command line. May 27 02:49:22.762674 systemd-networkd[1823]: cali9c06e7da54c: Link UP May 27 02:49:22.766195 systemd-networkd[1823]: cali9c06e7da54c: Gained carrier May 27 02:49:22.802414 containerd[2012]: 2025-05-27 02:49:22.481 [INFO][4915] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rkrb7-eth0 calico-apiserver-65948d5f- calico-apiserver 832199df-a524-40b6-bb28-904e7566a390 812 0 2025-05-27 02:48:53 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:65948d5f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-28-205 calico-apiserver-65948d5f-rkrb7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali9c06e7da54c [] [] }} ContainerID="772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" Namespace="calico-apiserver" Pod="calico-apiserver-65948d5f-rkrb7" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rkrb7-" May 27 02:49:22.802414 containerd[2012]: 2025-05-27 02:49:22.486 [INFO][4915] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" Namespace="calico-apiserver" Pod="calico-apiserver-65948d5f-rkrb7" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rkrb7-eth0" May 27 02:49:22.802414 containerd[2012]: 2025-05-27 02:49:22.646 [INFO][4960] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" HandleID="k8s-pod-network.772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" Workload="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rkrb7-eth0" May 27 02:49:22.802844 containerd[2012]: 2025-05-27 02:49:22.647 [INFO][4960] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" 
HandleID="k8s-pod-network.772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" Workload="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rkrb7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000367a90), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-28-205", "pod":"calico-apiserver-65948d5f-rkrb7", "timestamp":"2025-05-27 02:49:22.646322762 +0000 UTC"}, Hostname:"ip-172-31-28-205", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 02:49:22.802844 containerd[2012]: 2025-05-27 02:49:22.647 [INFO][4960] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 02:49:22.802844 containerd[2012]: 2025-05-27 02:49:22.647 [INFO][4960] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 02:49:22.802844 containerd[2012]: 2025-05-27 02:49:22.648 [INFO][4960] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-28-205' May 27 02:49:22.802844 containerd[2012]: 2025-05-27 02:49:22.682 [INFO][4960] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" host="ip-172-31-28-205" May 27 02:49:22.802844 containerd[2012]: 2025-05-27 02:49:22.697 [INFO][4960] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-28-205" May 27 02:49:22.802844 containerd[2012]: 2025-05-27 02:49:22.710 [INFO][4960] ipam/ipam.go 511: Trying affinity for 192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:22.802844 containerd[2012]: 2025-05-27 02:49:22.718 [INFO][4960] ipam/ipam.go 158: Attempting to load block cidr=192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:22.802844 containerd[2012]: 2025-05-27 02:49:22.725 [INFO][4960] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:22.803317 containerd[2012]: 2025-05-27 02:49:22.725 [INFO][4960] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.22.64/26 handle="k8s-pod-network.772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" host="ip-172-31-28-205" May 27 02:49:22.803317 containerd[2012]: 2025-05-27 02:49:22.729 [INFO][4960] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d May 27 02:49:22.803317 containerd[2012]: 2025-05-27 02:49:22.737 [INFO][4960] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.22.64/26 handle="k8s-pod-network.772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" host="ip-172-31-28-205" May 27 02:49:22.803317 containerd[2012]: 2025-05-27 02:49:22.747 [INFO][4960] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.22.66/26] block=192.168.22.64/26 handle="k8s-pod-network.772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" host="ip-172-31-28-205" May 27 02:49:22.803317 containerd[2012]: 2025-05-27 02:49:22.748 [INFO][4960] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.22.66/26] handle="k8s-pod-network.772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" host="ip-172-31-28-205" May 27 02:49:22.803317 containerd[2012]: 2025-05-27 02:49:22.748 [INFO][4960] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 02:49:22.803317 containerd[2012]: 2025-05-27 02:49:22.748 [INFO][4960] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.22.66/26] IPv6=[] ContainerID="772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" HandleID="k8s-pod-network.772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" Workload="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rkrb7-eth0" May 27 02:49:22.803638 containerd[2012]: 2025-05-27 02:49:22.752 [INFO][4915] cni-plugin/k8s.go 418: Populated endpoint ContainerID="772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" Namespace="calico-apiserver" Pod="calico-apiserver-65948d5f-rkrb7" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rkrb7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rkrb7-eth0", GenerateName:"calico-apiserver-65948d5f-", Namespace:"calico-apiserver", SelfLink:"", UID:"832199df-a524-40b6-bb28-904e7566a390", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"65948d5f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-205", ContainerID:"", Pod:"calico-apiserver-65948d5f-rkrb7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.22.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9c06e7da54c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:22.804431 containerd[2012]: 2025-05-27 02:49:22.752 [INFO][4915] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.22.66/32] ContainerID="772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" Namespace="calico-apiserver" Pod="calico-apiserver-65948d5f-rkrb7" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rkrb7-eth0" May 27 02:49:22.804431 containerd[2012]: 2025-05-27 02:49:22.752 [INFO][4915] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9c06e7da54c ContainerID="772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" Namespace="calico-apiserver" Pod="calico-apiserver-65948d5f-rkrb7" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rkrb7-eth0" May 27 02:49:22.804431 containerd[2012]: 2025-05-27 02:49:22.765 [INFO][4915] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" Namespace="calico-apiserver" Pod="calico-apiserver-65948d5f-rkrb7" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rkrb7-eth0" May 27 02:49:22.804649 containerd[2012]: 2025-05-27 02:49:22.772 [INFO][4915] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" Namespace="calico-apiserver" Pod="calico-apiserver-65948d5f-rkrb7" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rkrb7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rkrb7-eth0", GenerateName:"calico-apiserver-65948d5f-", Namespace:"calico-apiserver", SelfLink:"", UID:"832199df-a524-40b6-bb28-904e7566a390", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"65948d5f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-205", ContainerID:"772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d", Pod:"calico-apiserver-65948d5f-rkrb7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.22.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9c06e7da54c", MAC:"fa:75:f9:bd:87:c1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:22.805176 containerd[2012]: 2025-05-27 02:49:22.793 [INFO][4915] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" Namespace="calico-apiserver" Pod="calico-apiserver-65948d5f-rkrb7" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rkrb7-eth0" May 27 02:49:22.890968 containerd[2012]: time="2025-05-27T02:49:22.890862136Z" level=info msg="connecting to shim 772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d" address="unix:///run/containerd/s/b71a008631b1bf1423467c36227506db1e7e2e68ca78ac6818d78aa06194170a" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:22.904256 systemd-networkd[1823]: cali87992d1834a: Link UP May 27 02:49:22.909300 systemd-networkd[1823]: cali87992d1834a: Gained carrier May 27 02:49:22.961451 containerd[2012]: 2025-05-27 02:49:22.488 [INFO][4901] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--28--205-k8s-csi--node--driver--9dsg5-eth0 csi-node-driver- calico-system 62399f24-73bb-46f6-a534-e7c43ecd7271 671 0 2025-05-27 02:49:00 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:78f6f74485 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ip-172-31-28-205 csi-node-driver-9dsg5 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali87992d1834a [] [] }} ContainerID="ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" Namespace="calico-system" 
Pod="csi-node-driver-9dsg5" WorkloadEndpoint="ip--172--31--28--205-k8s-csi--node--driver--9dsg5-" May 27 02:49:22.961451 containerd[2012]: 2025-05-27 02:49:22.489 [INFO][4901] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" Namespace="calico-system" Pod="csi-node-driver-9dsg5" WorkloadEndpoint="ip--172--31--28--205-k8s-csi--node--driver--9dsg5-eth0" May 27 02:49:22.961451 containerd[2012]: 2025-05-27 02:49:22.651 [INFO][4958] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" HandleID="k8s-pod-network.ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" Workload="ip--172--31--28--205-k8s-csi--node--driver--9dsg5-eth0" May 27 02:49:22.961824 containerd[2012]: 2025-05-27 02:49:22.651 [INFO][4958] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" HandleID="k8s-pod-network.ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" Workload="ip--172--31--28--205-k8s-csi--node--driver--9dsg5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d60e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-28-205", "pod":"csi-node-driver-9dsg5", "timestamp":"2025-05-27 02:49:22.651052478 +0000 UTC"}, Hostname:"ip-172-31-28-205", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 02:49:22.961824 containerd[2012]: 2025-05-27 02:49:22.652 [INFO][4958] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 02:49:22.961824 containerd[2012]: 2025-05-27 02:49:22.748 [INFO][4958] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 02:49:22.961824 containerd[2012]: 2025-05-27 02:49:22.749 [INFO][4958] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-28-205' May 27 02:49:22.961824 containerd[2012]: 2025-05-27 02:49:22.782 [INFO][4958] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" host="ip-172-31-28-205" May 27 02:49:22.961824 containerd[2012]: 2025-05-27 02:49:22.800 [INFO][4958] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-28-205" May 27 02:49:22.961824 containerd[2012]: 2025-05-27 02:49:22.815 [INFO][4958] ipam/ipam.go 511: Trying affinity for 192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:22.961824 containerd[2012]: 2025-05-27 02:49:22.821 [INFO][4958] ipam/ipam.go 158: Attempting to load block cidr=192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:22.961824 containerd[2012]: 2025-05-27 02:49:22.834 [INFO][4958] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:22.962319 containerd[2012]: 2025-05-27 02:49:22.834 [INFO][4958] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.22.64/26 handle="k8s-pod-network.ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" host="ip-172-31-28-205" May 27 02:49:22.962319 containerd[2012]: 2025-05-27 02:49:22.839 [INFO][4958] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b May 27 02:49:22.962319 containerd[2012]: 2025-05-27 02:49:22.849 [INFO][4958] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.22.64/26 handle="k8s-pod-network.ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" host="ip-172-31-28-205" May 27 02:49:22.962319 containerd[2012]: 2025-05-27 02:49:22.870 [INFO][4958] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.22.67/26] block=192.168.22.64/26 handle="k8s-pod-network.ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" host="ip-172-31-28-205" May 27 02:49:22.962319 containerd[2012]: 2025-05-27 02:49:22.871 [INFO][4958] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.22.67/26] handle="k8s-pod-network.ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" host="ip-172-31-28-205" May 27 02:49:22.962319 containerd[2012]: 2025-05-27 02:49:22.872 [INFO][4958] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 02:49:22.962319 containerd[2012]: 2025-05-27 02:49:22.873 [INFO][4958] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.22.67/26] IPv6=[] ContainerID="ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" HandleID="k8s-pod-network.ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" Workload="ip--172--31--28--205-k8s-csi--node--driver--9dsg5-eth0" May 27 02:49:22.965733 containerd[2012]: 2025-05-27 02:49:22.885 [INFO][4901] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" Namespace="calico-system" Pod="csi-node-driver-9dsg5" WorkloadEndpoint="ip--172--31--28--205-k8s-csi--node--driver--9dsg5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--205-k8s-csi--node--driver--9dsg5-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"62399f24-73bb-46f6-a534-e7c43ecd7271", ResourceVersion:"671", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 49, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-205", ContainerID:"", Pod:"csi-node-driver-9dsg5", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.22.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali87992d1834a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:22.965931 containerd[2012]: 2025-05-27 02:49:22.886 [INFO][4901] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.22.67/32] ContainerID="ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" Namespace="calico-system" Pod="csi-node-driver-9dsg5" WorkloadEndpoint="ip--172--31--28--205-k8s-csi--node--driver--9dsg5-eth0" May 27 02:49:22.965931 containerd[2012]: 2025-05-27 02:49:22.887 [INFO][4901] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali87992d1834a ContainerID="ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" Namespace="calico-system" Pod="csi-node-driver-9dsg5" WorkloadEndpoint="ip--172--31--28--205-k8s-csi--node--driver--9dsg5-eth0" May 27 02:49:22.965931 containerd[2012]: 2025-05-27 02:49:22.911 [INFO][4901] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" Namespace="calico-system" Pod="csi-node-driver-9dsg5" WorkloadEndpoint="ip--172--31--28--205-k8s-csi--node--driver--9dsg5-eth0" May 27 02:49:22.966503 containerd[2012]: 2025-05-27 02:49:22.922 [INFO][4901] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" 
Namespace="calico-system" Pod="csi-node-driver-9dsg5" WorkloadEndpoint="ip--172--31--28--205-k8s-csi--node--driver--9dsg5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--205-k8s-csi--node--driver--9dsg5-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"62399f24-73bb-46f6-a534-e7c43ecd7271", ResourceVersion:"671", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 49, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-205", ContainerID:"ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b", Pod:"csi-node-driver-9dsg5", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.22.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali87992d1834a", MAC:"ae:77:93:91:d8:9f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:22.966636 containerd[2012]: 2025-05-27 02:49:22.952 [INFO][4901] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" Namespace="calico-system" Pod="csi-node-driver-9dsg5" WorkloadEndpoint="ip--172--31--28--205-k8s-csi--node--driver--9dsg5-eth0" May 27 02:49:23.004094 systemd[1]: Started cri-containerd-772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d.scope - libcontainer container 772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d. 
May 27 02:49:23.053521 systemd-networkd[1823]: calid062417242e: Link UP May 27 02:49:23.059553 systemd-networkd[1823]: calid062417242e: Gained carrier May 27 02:49:23.093803 containerd[2012]: time="2025-05-27T02:49:23.092358133Z" level=info msg="connecting to shim ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b" address="unix:///run/containerd/s/0e607d18011b71843e90d0d89f8ade1b5e99e0fa3829b7c5345474caab41cf68" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:23.121877 containerd[2012]: 2025-05-27 02:49:22.570 [INFO][4907] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--28--205-k8s-coredns--668d6bf9bc--98wcf-eth0 coredns-668d6bf9bc- kube-system 0f05cece-e259-4298-8a92-3ea1b39f2107 802 0 2025-05-27 02:48:39 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-28-205 coredns-668d6bf9bc-98wcf eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid062417242e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" Namespace="kube-system" Pod="coredns-668d6bf9bc-98wcf" WorkloadEndpoint="ip--172--31--28--205-k8s-coredns--668d6bf9bc--98wcf-" May 27 02:49:23.121877 containerd[2012]: 2025-05-27 02:49:22.570 [INFO][4907] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" Namespace="kube-system" Pod="coredns-668d6bf9bc-98wcf" WorkloadEndpoint="ip--172--31--28--205-k8s-coredns--668d6bf9bc--98wcf-eth0" May 27 02:49:23.121877 containerd[2012]: 2025-05-27 02:49:22.700 [INFO][4970] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" HandleID="k8s-pod-network.029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" Workload="ip--172--31--28--205-k8s-coredns--668d6bf9bc--98wcf-eth0" May 27 02:49:23.122606 containerd[2012]: 2025-05-27 02:49:22.701 [INFO][4970] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" HandleID="k8s-pod-network.029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" Workload="ip--172--31--28--205-k8s-coredns--668d6bf9bc--98wcf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000372c30), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-28-205", "pod":"coredns-668d6bf9bc-98wcf", "timestamp":"2025-05-27 02:49:22.700553751 +0000 UTC"}, Hostname:"ip-172-31-28-205", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 02:49:23.122606 containerd[2012]: 2025-05-27 02:49:22.701 [INFO][4970] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 02:49:23.122606 containerd[2012]: 2025-05-27 02:49:22.874 [INFO][4970] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 02:49:23.122606 containerd[2012]: 2025-05-27 02:49:22.874 [INFO][4970] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-28-205' May 27 02:49:23.122606 containerd[2012]: 2025-05-27 02:49:22.906 [INFO][4970] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" host="ip-172-31-28-205" May 27 02:49:23.122606 containerd[2012]: 2025-05-27 02:49:22.930 [INFO][4970] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-28-205" May 27 02:49:23.122606 containerd[2012]: 2025-05-27 02:49:22.950 [INFO][4970] ipam/ipam.go 511: Trying affinity for 192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:23.122606 containerd[2012]: 2025-05-27 02:49:22.959 [INFO][4970] ipam/ipam.go 158: Attempting to load block cidr=192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:23.122606 containerd[2012]: 2025-05-27 02:49:22.974 [INFO][4970] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:23.124130 containerd[2012]: 2025-05-27 02:49:22.974 [INFO][4970] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.22.64/26 handle="k8s-pod-network.029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" host="ip-172-31-28-205" May 27 02:49:23.124130 containerd[2012]: 2025-05-27 02:49:22.977 [INFO][4970] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29 May 27 02:49:23.124130 containerd[2012]: 2025-05-27 02:49:23.000 [INFO][4970] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.22.64/26 handle="k8s-pod-network.029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" host="ip-172-31-28-205" May 27 02:49:23.124130 containerd[2012]: 2025-05-27 02:49:23.016 [INFO][4970] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.22.68/26] block=192.168.22.64/26 handle="k8s-pod-network.029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" host="ip-172-31-28-205" May 27 02:49:23.124130 containerd[2012]: 2025-05-27 02:49:23.017 [INFO][4970] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.22.68/26] handle="k8s-pod-network.029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" host="ip-172-31-28-205" May 27 02:49:23.124130 containerd[2012]: 2025-05-27 02:49:23.017 [INFO][4970] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 02:49:23.124130 containerd[2012]: 2025-05-27 02:49:23.017 [INFO][4970] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.22.68/26] IPv6=[] ContainerID="029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" HandleID="k8s-pod-network.029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" Workload="ip--172--31--28--205-k8s-coredns--668d6bf9bc--98wcf-eth0" May 27 02:49:23.124458 containerd[2012]: 2025-05-27 02:49:23.028 [INFO][4907] cni-plugin/k8s.go 418: Populated endpoint ContainerID="029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" Namespace="kube-system" Pod="coredns-668d6bf9bc-98wcf" WorkloadEndpoint="ip--172--31--28--205-k8s-coredns--668d6bf9bc--98wcf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--205-k8s-coredns--668d6bf9bc--98wcf-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"0f05cece-e259-4298-8a92-3ea1b39f2107", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-205", ContainerID:"", Pod:"coredns-668d6bf9bc-98wcf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.22.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid062417242e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:23.124458 containerd[2012]: 2025-05-27 02:49:23.028 [INFO][4907] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.22.68/32] ContainerID="029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" Namespace="kube-system" Pod="coredns-668d6bf9bc-98wcf" WorkloadEndpoint="ip--172--31--28--205-k8s-coredns--668d6bf9bc--98wcf-eth0" May 27 02:49:23.124458 containerd[2012]: 2025-05-27 02:49:23.029 [INFO][4907] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid062417242e ContainerID="029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" Namespace="kube-system" Pod="coredns-668d6bf9bc-98wcf" WorkloadEndpoint="ip--172--31--28--205-k8s-coredns--668d6bf9bc--98wcf-eth0" May 27 02:49:23.124458 containerd[2012]: 2025-05-27 02:49:23.067 [INFO][4907] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" Namespace="kube-system" Pod="coredns-668d6bf9bc-98wcf" 
WorkloadEndpoint="ip--172--31--28--205-k8s-coredns--668d6bf9bc--98wcf-eth0" May 27 02:49:23.124458 containerd[2012]: 2025-05-27 02:49:23.071 [INFO][4907] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" Namespace="kube-system" Pod="coredns-668d6bf9bc-98wcf" WorkloadEndpoint="ip--172--31--28--205-k8s-coredns--668d6bf9bc--98wcf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--205-k8s-coredns--668d6bf9bc--98wcf-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"0f05cece-e259-4298-8a92-3ea1b39f2107", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-205", ContainerID:"029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29", Pod:"coredns-668d6bf9bc-98wcf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.22.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid062417242e", MAC:"62:12:4b:93:82:08", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:23.124458 containerd[2012]: 2025-05-27 02:49:23.112 [INFO][4907] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" Namespace="kube-system" Pod="coredns-668d6bf9bc-98wcf" WorkloadEndpoint="ip--172--31--28--205-k8s-coredns--668d6bf9bc--98wcf-eth0" May 27 02:49:23.207092 systemd[1]: Started cri-containerd-ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b.scope - libcontainer container ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b. 
May 27 02:49:23.208472 systemd-networkd[1823]: cali7eadd79106a: Link UP May 27 02:49:23.242794 systemd-networkd[1823]: cali7eadd79106a: Gained carrier May 27 02:49:23.263866 containerd[2012]: time="2025-05-27T02:49:23.263752717Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-hkfqv,Uid:01f59b83-1b36-4034-ac04-b88b89b8506a,Namespace:calico-system,Attempt:0,}" May 27 02:49:23.268721 containerd[2012]: time="2025-05-27T02:49:23.268666153Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-jxm2b,Uid:058da1c6-0f5e-41a7-b744-7dec794621a0,Namespace:kube-system,Attempt:0,}" May 27 02:49:23.355373 containerd[2012]: time="2025-05-27T02:49:23.354946898Z" level=info msg="connecting to shim 029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29" address="unix:///run/containerd/s/b26d11f077b9453cd1141116402c38d0ce8a06b30e31086a11eb617644bfbd66" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:22.584 [INFO][4897] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rqgjr-eth0 calico-apiserver-65948d5f- calico-apiserver f5c6704c-4387-4f61-8657-91f8646bee6a 811 0 2025-05-27 02:48:53 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:65948d5f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-28-205 calico-apiserver-65948d5f-rqgjr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali7eadd79106a [] [] }} ContainerID="7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" Namespace="calico-apiserver" Pod="calico-apiserver-65948d5f-rqgjr" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rqgjr-" May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:22.587 [INFO][4897] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" Namespace="calico-apiserver" Pod="calico-apiserver-65948d5f-rqgjr" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rqgjr-eth0" May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:22.725 [INFO][4973] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" HandleID="k8s-pod-network.7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" Workload="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rqgjr-eth0" May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:22.725 [INFO][4973] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" HandleID="k8s-pod-network.7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" Workload="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rqgjr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003998c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-28-205", "pod":"calico-apiserver-65948d5f-rqgjr", "timestamp":"2025-05-27 02:49:22.725230755 +0000 UTC"}, Hostname:"ip-172-31-28-205", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:22.726 [INFO][4973] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:23.017 [INFO][4973] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:23.018 [INFO][4973] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-28-205' May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:23.073 [INFO][4973] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" host="ip-172-31-28-205" May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:23.088 [INFO][4973] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-28-205" May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:23.114 [INFO][4973] ipam/ipam.go 511: Trying affinity for 192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:23.122 [INFO][4973] ipam/ipam.go 158: Attempting to load block cidr=192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:23.131 [INFO][4973] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:23.131 [INFO][4973] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.22.64/26 handle="k8s-pod-network.7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" host="ip-172-31-28-205" May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:23.136 [INFO][4973] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0 May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:23.145 [INFO][4973] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.22.64/26 handle="k8s-pod-network.7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" host="ip-172-31-28-205" May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:23.173 [INFO][4973] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.22.69/26] block=192.168.22.64/26 handle="k8s-pod-network.7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" host="ip-172-31-28-205" May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:23.176 [INFO][4973] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.22.69/26] handle="k8s-pod-network.7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" host="ip-172-31-28-205" May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:23.176 [INFO][4973] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 02:49:23.363276 containerd[2012]: 2025-05-27 02:49:23.176 [INFO][4973] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.22.69/26] IPv6=[] ContainerID="7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" HandleID="k8s-pod-network.7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" Workload="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rqgjr-eth0" May 27 02:49:23.364295 containerd[2012]: 2025-05-27 02:49:23.193 [INFO][4897] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" Namespace="calico-apiserver" Pod="calico-apiserver-65948d5f-rqgjr" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rqgjr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rqgjr-eth0", GenerateName:"calico-apiserver-65948d5f-", Namespace:"calico-apiserver", SelfLink:"", UID:"f5c6704c-4387-4f61-8657-91f8646bee6a", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"65948d5f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-205", ContainerID:"", Pod:"calico-apiserver-65948d5f-rqgjr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.22.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7eadd79106a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:23.364295 containerd[2012]: 2025-05-27 02:49:23.193 [INFO][4897] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.22.69/32] ContainerID="7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" Namespace="calico-apiserver" Pod="calico-apiserver-65948d5f-rqgjr" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rqgjr-eth0" May 27 02:49:23.364295 containerd[2012]: 2025-05-27 02:49:23.195 [INFO][4897] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7eadd79106a ContainerID="7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" Namespace="calico-apiserver" Pod="calico-apiserver-65948d5f-rqgjr" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rqgjr-eth0" May 27 02:49:23.364295 containerd[2012]: 2025-05-27 02:49:23.251 [INFO][4897] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" Namespace="calico-apiserver" Pod="calico-apiserver-65948d5f-rqgjr" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rqgjr-eth0" May 27 02:49:23.364295 containerd[2012]: 2025-05-27 02:49:23.256 [INFO][4897] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" Namespace="calico-apiserver" Pod="calico-apiserver-65948d5f-rqgjr" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rqgjr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rqgjr-eth0", GenerateName:"calico-apiserver-65948d5f-", Namespace:"calico-apiserver", SelfLink:"", UID:"f5c6704c-4387-4f61-8657-91f8646bee6a", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"65948d5f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-205", ContainerID:"7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0", Pod:"calico-apiserver-65948d5f-rqgjr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.22.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7eadd79106a", MAC:"3a:a9:27:4d:90:cb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:23.364295 containerd[2012]: 2025-05-27 02:49:23.331 [INFO][4897] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" Namespace="calico-apiserver" Pod="calico-apiserver-65948d5f-rqgjr" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--apiserver--65948d5f--rqgjr-eth0" May 27 02:49:23.481824 systemd-networkd[1823]: calib94ec309c9e: Link UP May 27 02:49:23.499053 systemd-networkd[1823]: calib94ec309c9e: Gained carrier May 27 02:49:23.582400 systemd[1]: Started cri-containerd-029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29.scope - libcontainer container 029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29. 
May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:22.580 [INFO][4931] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--28--205-k8s-calico--kube--controllers--6f5d8dffc5--jknpx-eth0 calico-kube-controllers-6f5d8dffc5- calico-system 24c42b60-4e50-4df2-b057-56dcd0d2d906 809 0 2025-05-27 02:49:00 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6f5d8dffc5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ip-172-31-28-205 calico-kube-controllers-6f5d8dffc5-jknpx eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calib94ec309c9e [] [] }} ContainerID="33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" Namespace="calico-system" Pod="calico-kube-controllers-6f5d8dffc5-jknpx" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--kube--controllers--6f5d8dffc5--jknpx-" May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:22.583 [INFO][4931] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" Namespace="calico-system" Pod="calico-kube-controllers-6f5d8dffc5-jknpx" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--kube--controllers--6f5d8dffc5--jknpx-eth0" May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:22.734 [INFO][4978] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" HandleID="k8s-pod-network.33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" Workload="ip--172--31--28--205-k8s-calico--kube--controllers--6f5d8dffc5--jknpx-eth0" May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:22.735 [INFO][4978] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" HandleID="k8s-pod-network.33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" Workload="ip--172--31--28--205-k8s-calico--kube--controllers--6f5d8dffc5--jknpx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400036af50), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-28-205", "pod":"calico-kube-controllers-6f5d8dffc5-jknpx", "timestamp":"2025-05-27 02:49:22.734915667 +0000 UTC"}, Hostname:"ip-172-31-28-205", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:22.735 [INFO][4978] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:23.176 [INFO][4978] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:23.177 [INFO][4978] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-28-205' May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:23.213 [INFO][4978] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" host="ip-172-31-28-205" May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:23.248 [INFO][4978] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-28-205" May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:23.308 [INFO][4978] ipam/ipam.go 511: Trying affinity for 192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:23.324 [INFO][4978] ipam/ipam.go 158: Attempting to load block cidr=192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:23.339 [INFO][4978] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:23.339 [INFO][4978] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.22.64/26 handle="k8s-pod-network.33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" host="ip-172-31-28-205" May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:23.349 [INFO][4978] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82 May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:23.383 [INFO][4978] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.22.64/26 handle="k8s-pod-network.33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" host="ip-172-31-28-205" May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:23.404 [INFO][4978] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.22.70/26] block=192.168.22.64/26 handle="k8s-pod-network.33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" host="ip-172-31-28-205" May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:23.404 [INFO][4978] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.22.70/26] handle="k8s-pod-network.33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" host="ip-172-31-28-205" May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:23.404 [INFO][4978] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 02:49:23.593670 containerd[2012]: 2025-05-27 02:49:23.404 [INFO][4978] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.22.70/26] IPv6=[] ContainerID="33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" HandleID="k8s-pod-network.33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" Workload="ip--172--31--28--205-k8s-calico--kube--controllers--6f5d8dffc5--jknpx-eth0" May 27 02:49:23.596670 containerd[2012]: 2025-05-27 02:49:23.447 [INFO][4931] cni-plugin/k8s.go 418: Populated endpoint ContainerID="33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" Namespace="calico-system" Pod="calico-kube-controllers-6f5d8dffc5-jknpx" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--kube--controllers--6f5d8dffc5--jknpx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--205-k8s-calico--kube--controllers--6f5d8dffc5--jknpx-eth0", GenerateName:"calico-kube-controllers-6f5d8dffc5-", Namespace:"calico-system", SelfLink:"", UID:"24c42b60-4e50-4df2-b057-56dcd0d2d906", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 49, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6f5d8dffc5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-205", ContainerID:"", Pod:"calico-kube-controllers-6f5d8dffc5-jknpx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.22.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib94ec309c9e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:23.596670 containerd[2012]: 2025-05-27 02:49:23.447 [INFO][4931] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.22.70/32] ContainerID="33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" Namespace="calico-system" Pod="calico-kube-controllers-6f5d8dffc5-jknpx" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--kube--controllers--6f5d8dffc5--jknpx-eth0" May 27 02:49:23.596670 containerd[2012]: 2025-05-27 02:49:23.453 [INFO][4931] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib94ec309c9e ContainerID="33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" Namespace="calico-system" Pod="calico-kube-controllers-6f5d8dffc5-jknpx" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--kube--controllers--6f5d8dffc5--jknpx-eth0" May 27 02:49:23.596670 containerd[2012]: 2025-05-27 02:49:23.501 [INFO][4931] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" Namespace="calico-system" Pod="calico-kube-controllers-6f5d8dffc5-jknpx" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--kube--controllers--6f5d8dffc5--jknpx-eth0" May 27 02:49:23.596670 containerd[2012]: 
2025-05-27 02:49:23.519 [INFO][4931] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" Namespace="calico-system" Pod="calico-kube-controllers-6f5d8dffc5-jknpx" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--kube--controllers--6f5d8dffc5--jknpx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--205-k8s-calico--kube--controllers--6f5d8dffc5--jknpx-eth0", GenerateName:"calico-kube-controllers-6f5d8dffc5-", Namespace:"calico-system", SelfLink:"", UID:"24c42b60-4e50-4df2-b057-56dcd0d2d906", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 49, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6f5d8dffc5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-205", ContainerID:"33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82", Pod:"calico-kube-controllers-6f5d8dffc5-jknpx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.22.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib94ec309c9e", MAC:"96:cc:a0:41:db:68", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:23.596670 containerd[2012]: 2025-05-27 02:49:23.570 [INFO][4931] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" Namespace="calico-system" Pod="calico-kube-controllers-6f5d8dffc5-jknpx" WorkloadEndpoint="ip--172--31--28--205-k8s-calico--kube--controllers--6f5d8dffc5--jknpx-eth0" May 27 02:49:23.634363 containerd[2012]: time="2025-05-27T02:49:23.634216647Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-65948d5f-rkrb7,Uid:832199df-a524-40b6-bb28-904e7566a390,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d\"" May 27 02:49:23.649728 containerd[2012]: time="2025-05-27T02:49:23.649675779Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 02:49:23.701135 containerd[2012]: time="2025-05-27T02:49:23.700916968Z" level=info msg="connecting to shim 7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0" address="unix:///run/containerd/s/ff0d89b4342d12bfc23754077b105e0fe668d60a8833e53276d0d94d2b00c159" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:23.726096 containerd[2012]: time="2025-05-27T02:49:23.725974528Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9dsg5,Uid:62399f24-73bb-46f6-a534-e7c43ecd7271,Namespace:calico-system,Attempt:0,} returns sandbox id \"ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b\"" May 27 02:49:23.767511 containerd[2012]: 
time="2025-05-27T02:49:23.767342776Z" level=info msg="connecting to shim 33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82" address="unix:///run/containerd/s/ba7d9cf16a6f2d83b5d8b7083cb13dbac3725b06ea497ded6ea471902d163260" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:23.860882 containerd[2012]: time="2025-05-27T02:49:23.860675656Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-98wcf,Uid:0f05cece-e259-4298-8a92-3ea1b39f2107,Namespace:kube-system,Attempt:0,} returns sandbox id \"029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29\"" May 27 02:49:23.872651 containerd[2012]: time="2025-05-27T02:49:23.872517076Z" level=info msg="CreateContainer within sandbox \"029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 02:49:23.926981 containerd[2012]: time="2025-05-27T02:49:23.926560637Z" level=info msg="Container cdde7518a4c2ea1809476fbb124d2fd11a2146cb7bbd9f9c0b6cf41b9e7fd7ab: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:23.941820 systemd[1]: Started cri-containerd-7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0.scope - libcontainer container 7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0. May 27 02:49:23.955794 containerd[2012]: time="2025-05-27T02:49:23.955228169Z" level=info msg="CreateContainer within sandbox \"029d2402d215f5ffe1f674b0d19d6995222a8dc20946091c616afce4affccc29\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"cdde7518a4c2ea1809476fbb124d2fd11a2146cb7bbd9f9c0b6cf41b9e7fd7ab\"" May 27 02:49:23.965132 containerd[2012]: time="2025-05-27T02:49:23.964656677Z" level=info msg="StartContainer for \"cdde7518a4c2ea1809476fbb124d2fd11a2146cb7bbd9f9c0b6cf41b9e7fd7ab\"" May 27 02:49:23.989062 containerd[2012]: time="2025-05-27T02:49:23.987914165Z" level=info msg="connecting to shim cdde7518a4c2ea1809476fbb124d2fd11a2146cb7bbd9f9c0b6cf41b9e7fd7ab" address="unix:///run/containerd/s/b26d11f077b9453cd1141116402c38d0ce8a06b30e31086a11eb617644bfbd66" protocol=ttrpc version=3 May 27 02:49:24.007025 systemd[1]: Started cri-containerd-33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82.scope - libcontainer container 33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82. May 27 02:49:24.104624 systemd[1]: Started cri-containerd-cdde7518a4c2ea1809476fbb124d2fd11a2146cb7bbd9f9c0b6cf41b9e7fd7ab.scope - libcontainer container cdde7518a4c2ea1809476fbb124d2fd11a2146cb7bbd9f9c0b6cf41b9e7fd7ab. 
May 27 02:49:24.166023 systemd-networkd[1823]: cali3ede44ad77c: Link UP May 27 02:49:24.175278 systemd-networkd[1823]: cali3ede44ad77c: Gained carrier May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:23.669 [INFO][5111] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--28--205-k8s-coredns--668d6bf9bc--jxm2b-eth0 coredns-668d6bf9bc- kube-system 058da1c6-0f5e-41a7-b744-7dec794621a0 808 0 2025-05-27 02:48:39 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-28-205 coredns-668d6bf9bc-jxm2b eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3ede44ad77c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" Namespace="kube-system" Pod="coredns-668d6bf9bc-jxm2b" WorkloadEndpoint="ip--172--31--28--205-k8s-coredns--668d6bf9bc--jxm2b-" May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:23.676 [INFO][5111] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" Namespace="kube-system" Pod="coredns-668d6bf9bc-jxm2b" WorkloadEndpoint="ip--172--31--28--205-k8s-coredns--668d6bf9bc--jxm2b-eth0" May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:23.892 [INFO][5205] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" HandleID="k8s-pod-network.534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" Workload="ip--172--31--28--205-k8s-coredns--668d6bf9bc--jxm2b-eth0" May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:23.892 [INFO][5205] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" HandleID="k8s-pod-network.534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" Workload="ip--172--31--28--205-k8s-coredns--668d6bf9bc--jxm2b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002f3e90), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-28-205", "pod":"coredns-668d6bf9bc-jxm2b", "timestamp":"2025-05-27 02:49:23.892046044 +0000 UTC"}, Hostname:"ip-172-31-28-205", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:23.892 [INFO][5205] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:23.892 [INFO][5205] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:23.892 [INFO][5205] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-28-205' May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:23.938 [INFO][5205] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" host="ip-172-31-28-205" May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:23.954 [INFO][5205] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-28-205" May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:23.982 [INFO][5205] ipam/ipam.go 511: Trying affinity for 192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:23.990 [INFO][5205] ipam/ipam.go 158: Attempting to load block cidr=192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:24.004 [INFO][5205] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:24.004 [INFO][5205] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.22.64/26 handle="k8s-pod-network.534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" host="ip-172-31-28-205" May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:24.017 [INFO][5205] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:24.051 [INFO][5205] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.22.64/26 handle="k8s-pod-network.534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" host="ip-172-31-28-205" May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:24.076 [INFO][5205] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.22.71/26] block=192.168.22.64/26 handle="k8s-pod-network.534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" host="ip-172-31-28-205" May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:24.078 [INFO][5205] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.22.71/26] handle="k8s-pod-network.534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" host="ip-172-31-28-205" May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:24.079 [INFO][5205] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 02:49:24.223353 containerd[2012]: 2025-05-27 02:49:24.079 [INFO][5205] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.22.71/26] IPv6=[] ContainerID="534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" HandleID="k8s-pod-network.534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" Workload="ip--172--31--28--205-k8s-coredns--668d6bf9bc--jxm2b-eth0" May 27 02:49:24.224467 containerd[2012]: 2025-05-27 02:49:24.091 [INFO][5111] cni-plugin/k8s.go 418: Populated endpoint ContainerID="534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" Namespace="kube-system" Pod="coredns-668d6bf9bc-jxm2b" WorkloadEndpoint="ip--172--31--28--205-k8s-coredns--668d6bf9bc--jxm2b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--205-k8s-coredns--668d6bf9bc--jxm2b-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"058da1c6-0f5e-41a7-b744-7dec794621a0", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-205", ContainerID:"", Pod:"coredns-668d6bf9bc-jxm2b", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.22.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3ede44ad77c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:24.224467 containerd[2012]: 2025-05-27 02:49:24.093 [INFO][5111] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.22.71/32] ContainerID="534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" Namespace="kube-system" Pod="coredns-668d6bf9bc-jxm2b" WorkloadEndpoint="ip--172--31--28--205-k8s-coredns--668d6bf9bc--jxm2b-eth0" May 27 02:49:24.224467 containerd[2012]: 2025-05-27 02:49:24.093 [INFO][5111] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3ede44ad77c ContainerID="534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" Namespace="kube-system" Pod="coredns-668d6bf9bc-jxm2b" WorkloadEndpoint="ip--172--31--28--205-k8s-coredns--668d6bf9bc--jxm2b-eth0" May 27 02:49:24.224467 containerd[2012]: 2025-05-27 02:49:24.179 [INFO][5111] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" Namespace="kube-system" Pod="coredns-668d6bf9bc-jxm2b" 
WorkloadEndpoint="ip--172--31--28--205-k8s-coredns--668d6bf9bc--jxm2b-eth0" May 27 02:49:24.224467 containerd[2012]: 2025-05-27 02:49:24.181 [INFO][5111] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" Namespace="kube-system" Pod="coredns-668d6bf9bc-jxm2b" WorkloadEndpoint="ip--172--31--28--205-k8s-coredns--668d6bf9bc--jxm2b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--205-k8s-coredns--668d6bf9bc--jxm2b-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"058da1c6-0f5e-41a7-b744-7dec794621a0", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-205", ContainerID:"534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba", Pod:"coredns-668d6bf9bc-jxm2b", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.22.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3ede44ad77c", MAC:"66:56:43:5f:f5:e6", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:24.224467 containerd[2012]: 2025-05-27 02:49:24.214 [INFO][5111] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" Namespace="kube-system" Pod="coredns-668d6bf9bc-jxm2b" WorkloadEndpoint="ip--172--31--28--205-k8s-coredns--668d6bf9bc--jxm2b-eth0" May 27 02:49:24.255847 systemd-networkd[1823]: cali9c06e7da54c: Gained IPv6LL May 27 02:49:24.425254 containerd[2012]: time="2025-05-27T02:49:24.422648679Z" level=info msg="connecting to shim 534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba" address="unix:///run/containerd/s/cd9d2f1f9434f7948ef79d78eda02dfaff0a32a2252f7cc460dc078be8157597" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:24.442230 containerd[2012]: time="2025-05-27T02:49:24.442088631Z" level=info msg="StartContainer for \"cdde7518a4c2ea1809476fbb124d2fd11a2146cb7bbd9f9c0b6cf41b9e7fd7ab\" returns successfully" May 27 02:49:24.448986 systemd-networkd[1823]: cali7cbdc21eeb2: Link UP May 27 02:49:24.449425 systemd-networkd[1823]: cali7cbdc21eeb2: Gained carrier May 27 02:49:24.496142 containerd[2012]: time="2025-05-27T02:49:24.496049919Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-65948d5f-rqgjr,Uid:f5c6704c-4387-4f61-8657-91f8646bee6a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0\"" May 27 02:49:24.511132 systemd-networkd[1823]: calid062417242e: Gained IPv6LL May 27 02:49:24.545910 containerd[2012]: time="2025-05-27T02:49:24.542316220Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6f5d8dffc5-jknpx,Uid:24c42b60-4e50-4df2-b057-56dcd0d2d906,Namespace:calico-system,Attempt:0,} returns sandbox id \"33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82\"" May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:23.822 [INFO][5109] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--28--205-k8s-goldmane--78d55f7ddc--hkfqv-eth0 goldmane-78d55f7ddc- calico-system 01f59b83-1b36-4034-ac04-b88b89b8506a 807 0 2025-05-27 02:49:00 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:78d55f7ddc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ip-172-31-28-205 goldmane-78d55f7ddc-hkfqv eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali7cbdc21eeb2 [] [] }} ContainerID="e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" Namespace="calico-system" Pod="goldmane-78d55f7ddc-hkfqv" WorkloadEndpoint="ip--172--31--28--205-k8s-goldmane--78d55f7ddc--hkfqv-" May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:23.825 [INFO][5109] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" Namespace="calico-system" Pod="goldmane-78d55f7ddc-hkfqv" WorkloadEndpoint="ip--172--31--28--205-k8s-goldmane--78d55f7ddc--hkfqv-eth0" May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.106 [INFO][5246] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" HandleID="k8s-pod-network.e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" Workload="ip--172--31--28--205-k8s-goldmane--78d55f7ddc--hkfqv-eth0" May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.107 [INFO][5246] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" HandleID="k8s-pod-network.e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" Workload="ip--172--31--28--205-k8s-goldmane--78d55f7ddc--hkfqv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400065d030), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-28-205", "pod":"goldmane-78d55f7ddc-hkfqv", "timestamp":"2025-05-27 02:49:24.106966706 +0000 UTC"}, Hostname:"ip-172-31-28-205", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.107 [INFO][5246] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.108 [INFO][5246] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.108 [INFO][5246] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-28-205' May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.170 [INFO][5246] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" host="ip-172-31-28-205" May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.196 [INFO][5246] ipam/ipam.go 394: Looking up existing affinities for host host="ip-172-31-28-205" May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.207 [INFO][5246] ipam/ipam.go 511: Trying affinity for 192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.219 [INFO][5246] ipam/ipam.go 158: Attempting to load block cidr=192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.233 [INFO][5246] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.22.64/26 host="ip-172-31-28-205" May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.234 [INFO][5246] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.22.64/26 handle="k8s-pod-network.e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" host="ip-172-31-28-205" May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.239 [INFO][5246] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08 May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.261 [INFO][5246] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.22.64/26 handle="k8s-pod-network.e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" host="ip-172-31-28-205" May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.321 [INFO][5246] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.22.72/26] block=192.168.22.64/26 handle="k8s-pod-network.e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" host="ip-172-31-28-205" May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.321 [INFO][5246] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.22.72/26] handle="k8s-pod-network.e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" host="ip-172-31-28-205" May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.321 [INFO][5246] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 02:49:24.553780 containerd[2012]: 2025-05-27 02:49:24.321 [INFO][5246] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.22.72/26] IPv6=[] ContainerID="e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" HandleID="k8s-pod-network.e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" Workload="ip--172--31--28--205-k8s-goldmane--78d55f7ddc--hkfqv-eth0" May 27 02:49:24.554903 containerd[2012]: 2025-05-27 02:49:24.390 [INFO][5109] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" Namespace="calico-system" Pod="goldmane-78d55f7ddc-hkfqv" WorkloadEndpoint="ip--172--31--28--205-k8s-goldmane--78d55f7ddc--hkfqv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--205-k8s-goldmane--78d55f7ddc--hkfqv-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"01f59b83-1b36-4034-ac04-b88b89b8506a", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 49, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-205", ContainerID:"", Pod:"goldmane-78d55f7ddc-hkfqv", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.22.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7cbdc21eeb2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:24.554903 containerd[2012]: 2025-05-27 02:49:24.396 [INFO][5109] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.22.72/32] ContainerID="e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" Namespace="calico-system" Pod="goldmane-78d55f7ddc-hkfqv" WorkloadEndpoint="ip--172--31--28--205-k8s-goldmane--78d55f7ddc--hkfqv-eth0" May 27 02:49:24.554903 containerd[2012]: 2025-05-27 02:49:24.405 [INFO][5109] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7cbdc21eeb2 ContainerID="e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" Namespace="calico-system" Pod="goldmane-78d55f7ddc-hkfqv" WorkloadEndpoint="ip--172--31--28--205-k8s-goldmane--78d55f7ddc--hkfqv-eth0" May 27 02:49:24.554903 containerd[2012]: 2025-05-27 02:49:24.466 [INFO][5109] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" Namespace="calico-system" Pod="goldmane-78d55f7ddc-hkfqv" WorkloadEndpoint="ip--172--31--28--205-k8s-goldmane--78d55f7ddc--hkfqv-eth0" May 27 02:49:24.554903 containerd[2012]: 2025-05-27 02:49:24.471 [INFO][5109] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" Namespace="calico-system" Pod="goldmane-78d55f7ddc-hkfqv" 
WorkloadEndpoint="ip--172--31--28--205-k8s-goldmane--78d55f7ddc--hkfqv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--205-k8s-goldmane--78d55f7ddc--hkfqv-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"01f59b83-1b36-4034-ac04-b88b89b8506a", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 49, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-205", ContainerID:"e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08", Pod:"goldmane-78d55f7ddc-hkfqv", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.22.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7cbdc21eeb2", MAC:"ba:f7:42:dc:5a:b0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:24.554903 containerd[2012]: 2025-05-27 02:49:24.508 [INFO][5109] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" Namespace="calico-system" Pod="goldmane-78d55f7ddc-hkfqv" WorkloadEndpoint="ip--172--31--28--205-k8s-goldmane--78d55f7ddc--hkfqv-eth0" May 27 02:49:24.640893 systemd[1]: Started cri-containerd-534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba.scope - libcontainer container 534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba. May 27 02:49:24.686328 containerd[2012]: time="2025-05-27T02:49:24.686086192Z" level=info msg="connecting to shim e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08" address="unix:///run/containerd/s/68f9ceabe032a10cdec21f026984bfd20b0a304d233493ffe12795ad0db608a1" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:24.703994 systemd-networkd[1823]: calib94ec309c9e: Gained IPv6LL May 27 02:49:24.767913 systemd-networkd[1823]: cali87992d1834a: Gained IPv6LL May 27 02:49:24.782316 kubelet[3281]: I0527 02:49:24.781145 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-98wcf" podStartSLOduration=45.781114781 podStartE2EDuration="45.781114781s" podCreationTimestamp="2025-05-27 02:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 02:49:24.738433889 +0000 UTC m=+49.803306153" watchObservedRunningTime="2025-05-27 02:49:24.781114781 +0000 UTC m=+49.845987045" May 27 02:49:24.815975 systemd[1]: Started cri-containerd-e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08.scope - libcontainer container e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08. 
May 27 02:49:24.887683 containerd[2012]: time="2025-05-27T02:49:24.887617769Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-jxm2b,Uid:058da1c6-0f5e-41a7-b744-7dec794621a0,Namespace:kube-system,Attempt:0,} returns sandbox id \"534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba\"" May 27 02:49:24.899659 containerd[2012]: time="2025-05-27T02:49:24.899526497Z" level=info msg="CreateContainer within sandbox \"534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 02:49:24.923098 containerd[2012]: time="2025-05-27T02:49:24.922482018Z" level=info msg="Container 82d8e1006ba35d17e858132ab70f16fd3dcee4481b7edc8ed01aab3c153062a7: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:24.948439 containerd[2012]: time="2025-05-27T02:49:24.947508438Z" level=info msg="CreateContainer within sandbox \"534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"82d8e1006ba35d17e858132ab70f16fd3dcee4481b7edc8ed01aab3c153062a7\"" May 27 02:49:24.951684 containerd[2012]: time="2025-05-27T02:49:24.951601422Z" level=info msg="StartContainer for \"82d8e1006ba35d17e858132ab70f16fd3dcee4481b7edc8ed01aab3c153062a7\"" May 27 02:49:24.958622 containerd[2012]: time="2025-05-27T02:49:24.958538334Z" level=info msg="connecting to shim 82d8e1006ba35d17e858132ab70f16fd3dcee4481b7edc8ed01aab3c153062a7" address="unix:///run/containerd/s/cd9d2f1f9434f7948ef79d78eda02dfaff0a32a2252f7cc460dc078be8157597" protocol=ttrpc version=3 May 27 02:49:25.033093 systemd[1]: Started cri-containerd-82d8e1006ba35d17e858132ab70f16fd3dcee4481b7edc8ed01aab3c153062a7.scope - libcontainer container 82d8e1006ba35d17e858132ab70f16fd3dcee4481b7edc8ed01aab3c153062a7. 
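Around the IPAM and endpoint records, the containerd lines trace the standard CRI sequence the kubelet drives for the coredns pod: RunPodSandbox returns a sandbox ID, CreateContainer registers the coredns container inside that sandbox, StartContainer launches it, and systemd places the shim into a cri-containerd-<id>.scope unit. As a rough client-side sketch of that flow (not the kubelet's code), assuming containerd's default CRI socket and a hypothetical coredns image tag:

package main

import (
	"context"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Assumption: containerd's CRI endpoint at its default path.
	conn, err := grpc.NewClient("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	rt := runtimeapi.NewRuntimeServiceClient(conn)

	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()

	// Sandbox ID as returned by RunPodSandbox; the value below is the one from the log.
	sandboxID := "534c76032689a57e5ec33604c0e193605ee9f9432bf1266261a21f0eb05695ba"
	sandboxCfg := &runtimeapi.PodSandboxConfig{
		Metadata: &runtimeapi.PodSandboxMetadata{
			Name:      "coredns-668d6bf9bc-jxm2b",
			Namespace: "kube-system",
			Uid:       "058da1c6-0f5e-41a7-b744-7dec794621a0",
		},
	}
	created, err := rt.CreateContainer(ctx, &runtimeapi.CreateContainerRequest{
		PodSandboxId:  sandboxID,
		SandboxConfig: sandboxCfg,
		Config: &runtimeapi.ContainerConfig{
			Metadata: &runtimeapi.ContainerMetadata{Name: "coredns"},
			// Hypothetical image reference; the log does not show the exact tag.
			Image: &runtimeapi.ImageSpec{Image: "registry.k8s.io/coredns/coredns:v1.11.3"},
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	if _, err := rt.StartContainer(ctx, &runtimeapi.StartContainerRequest{ContainerId: created.ContainerId}); err != nil {
		log.Fatal(err)
	}
	log.Println("started container", created.ContainerId)
}

crictl drives these same RPCs, which is why container and sandbox IDs seen there line up with the cri-containerd-*.scope units systemd reports in this log.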
May 27 02:49:25.087490 systemd-networkd[1823]: cali7eadd79106a: Gained IPv6LL May 27 02:49:25.285526 containerd[2012]: time="2025-05-27T02:49:25.285359427Z" level=info msg="StartContainer for \"82d8e1006ba35d17e858132ab70f16fd3dcee4481b7edc8ed01aab3c153062a7\" returns successfully" May 27 02:49:25.312030 containerd[2012]: time="2025-05-27T02:49:25.311953984Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-hkfqv,Uid:01f59b83-1b36-4034-ac04-b88b89b8506a,Namespace:calico-system,Attempt:0,} returns sandbox id \"e901fd844a073e0c46d7e5a4f4ae2fad6bbd436cfecdefec8e7ffae654ed5c08\"" May 27 02:49:25.345910 systemd-networkd[1823]: cali3ede44ad77c: Gained IPv6LL May 27 02:49:26.175603 systemd-networkd[1823]: cali7cbdc21eeb2: Gained IPv6LL May 27 02:49:26.754804 kubelet[3281]: I0527 02:49:26.754680 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-jxm2b" podStartSLOduration=47.754655251 podStartE2EDuration="47.754655251s" podCreationTimestamp="2025-05-27 02:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 02:49:25.747851646 +0000 UTC m=+50.812723994" watchObservedRunningTime="2025-05-27 02:49:26.754655251 +0000 UTC m=+51.819527503" May 27 02:49:26.973954 containerd[2012]: time="2025-05-27T02:49:26.973876964Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:26.978049 containerd[2012]: time="2025-05-27T02:49:26.977943536Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=44453213" May 27 02:49:26.981547 containerd[2012]: time="2025-05-27T02:49:26.981481340Z" level=info msg="ImageCreate event name:\"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:26.990623 containerd[2012]: time="2025-05-27T02:49:26.990541580Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:26.993641 containerd[2012]: time="2025-05-27T02:49:26.993549068Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"45822470\" in 3.337532405s" May 27 02:49:26.993641 containerd[2012]: time="2025-05-27T02:49:26.993640268Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\"" May 27 02:49:26.999012 containerd[2012]: time="2025-05-27T02:49:26.998816984Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\"" May 27 02:49:27.002510 containerd[2012]: time="2025-05-27T02:49:27.002442628Z" level=info msg="CreateContainer within sandbox \"772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 02:49:27.027609 containerd[2012]: time="2025-05-27T02:49:27.023897752Z" level=info msg="Container 
885a3f4a2d108b5e9e1fc9fe15a775e39d44d754387f41dd7848dff5eec25bf8: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:27.044133 containerd[2012]: time="2025-05-27T02:49:27.043994464Z" level=info msg="CreateContainer within sandbox \"772eea205d6bf004b09393626f397c5e3571a4b35820f9be549e89eb0d59641d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"885a3f4a2d108b5e9e1fc9fe15a775e39d44d754387f41dd7848dff5eec25bf8\"" May 27 02:49:27.046842 containerd[2012]: time="2025-05-27T02:49:27.045153172Z" level=info msg="StartContainer for \"885a3f4a2d108b5e9e1fc9fe15a775e39d44d754387f41dd7848dff5eec25bf8\"" May 27 02:49:27.049540 containerd[2012]: time="2025-05-27T02:49:27.049480360Z" level=info msg="connecting to shim 885a3f4a2d108b5e9e1fc9fe15a775e39d44d754387f41dd7848dff5eec25bf8" address="unix:///run/containerd/s/b71a008631b1bf1423467c36227506db1e7e2e68ca78ac6818d78aa06194170a" protocol=ttrpc version=3 May 27 02:49:27.104111 systemd[1]: Started cri-containerd-885a3f4a2d108b5e9e1fc9fe15a775e39d44d754387f41dd7848dff5eec25bf8.scope - libcontainer container 885a3f4a2d108b5e9e1fc9fe15a775e39d44d754387f41dd7848dff5eec25bf8. May 27 02:49:27.215671 containerd[2012]: time="2025-05-27T02:49:27.215577989Z" level=info msg="StartContainer for \"885a3f4a2d108b5e9e1fc9fe15a775e39d44d754387f41dd7848dff5eec25bf8\" returns successfully" May 27 02:49:27.747837 kubelet[3281]: I0527 02:49:27.747206 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-65948d5f-rkrb7" podStartSLOduration=31.398730351 podStartE2EDuration="34.74718194s" podCreationTimestamp="2025-05-27 02:48:53 +0000 UTC" firstStartedPulling="2025-05-27 02:49:23.648360555 +0000 UTC m=+48.713232831" lastFinishedPulling="2025-05-27 02:49:26.996812156 +0000 UTC m=+52.061684420" observedRunningTime="2025-05-27 02:49:27.746152352 +0000 UTC m=+52.811024628" watchObservedRunningTime="2025-05-27 02:49:27.74718194 +0000 UTC m=+52.812054204" May 27 02:49:28.406966 containerd[2012]: time="2025-05-27T02:49:28.405845227Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:28.409995 containerd[2012]: time="2025-05-27T02:49:28.409947703Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.0: active requests=0, bytes read=8226240" May 27 02:49:28.412253 containerd[2012]: time="2025-05-27T02:49:28.412203175Z" level=info msg="ImageCreate event name:\"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:28.418436 containerd[2012]: time="2025-05-27T02:49:28.418343455Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:28.421174 containerd[2012]: time="2025-05-27T02:49:28.421127671Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.0\" with image id \"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\", size \"9595481\" in 1.422115315s" May 27 02:49:28.421474 containerd[2012]: time="2025-05-27T02:49:28.421339171Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\" returns image reference 
\"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\"" May 27 02:49:28.423584 containerd[2012]: time="2025-05-27T02:49:28.423224059Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 02:49:28.429360 containerd[2012]: time="2025-05-27T02:49:28.429155323Z" level=info msg="CreateContainer within sandbox \"ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 27 02:49:28.455811 containerd[2012]: time="2025-05-27T02:49:28.455044411Z" level=info msg="Container 1ddac3029b86c4f0c8795ece94a9b830ffa55eddbff9225314ba154aeab69082: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:28.470546 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2310569103.mount: Deactivated successfully. May 27 02:49:28.489432 containerd[2012]: time="2025-05-27T02:49:28.489345271Z" level=info msg="CreateContainer within sandbox \"ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"1ddac3029b86c4f0c8795ece94a9b830ffa55eddbff9225314ba154aeab69082\"" May 27 02:49:28.492869 containerd[2012]: time="2025-05-27T02:49:28.490225603Z" level=info msg="StartContainer for \"1ddac3029b86c4f0c8795ece94a9b830ffa55eddbff9225314ba154aeab69082\"" May 27 02:49:28.496640 containerd[2012]: time="2025-05-27T02:49:28.496527307Z" level=info msg="connecting to shim 1ddac3029b86c4f0c8795ece94a9b830ffa55eddbff9225314ba154aeab69082" address="unix:///run/containerd/s/0e607d18011b71843e90d0d89f8ade1b5e99e0fa3829b7c5345474caab41cf68" protocol=ttrpc version=3 May 27 02:49:28.558465 systemd[1]: Started cri-containerd-1ddac3029b86c4f0c8795ece94a9b830ffa55eddbff9225314ba154aeab69082.scope - libcontainer container 1ddac3029b86c4f0c8795ece94a9b830ffa55eddbff9225314ba154aeab69082. 
May 27 02:49:28.579086 ntpd[1971]: Listen normally on 7 vxlan.calico 192.168.22.64:123 May 27 02:49:28.580712 ntpd[1971]: Listen normally on 8 calibf10e1075d1 [fe80::ecee:eeff:feee:eeee%4]:123 May 27 02:49:28.580833 ntpd[1971]: Listen normally on 9 vxlan.calico [fe80::6442:aaff:fee7:c135%5]:123 May 27 02:49:28.580928 ntpd[1971]: Listen normally on 10 cali9c06e7da54c [fe80::ecee:eeff:feee:eeee%8]:123 May 27 02:49:28.581005 ntpd[1971]: Listen normally on 11 cali87992d1834a [fe80::ecee:eeff:feee:eeee%9]:123 May 27 02:49:28.581072 ntpd[1971]: Listen normally on 12 calid062417242e [fe80::ecee:eeff:feee:eeee%10]:123 May 27 02:49:28.581138 ntpd[1971]: Listen normally on 13 cali7eadd79106a [fe80::ecee:eeff:feee:eeee%11]:123 May 27 02:49:28.581200 ntpd[1971]: Listen normally on 14 calib94ec309c9e [fe80::ecee:eeff:feee:eeee%12]:123 May 27 02:49:28.581261 ntpd[1971]: Listen normally on 15 cali3ede44ad77c [fe80::ecee:eeff:feee:eeee%13]:123 May 27 02:49:28.581325 ntpd[1971]: Listen normally on 16 cali7cbdc21eeb2 [fe80::ecee:eeff:feee:eeee%14]:123 May 27 02:49:28.777888 containerd[2012]: time="2025-05-27T02:49:28.777818181Z" level=info msg="StartContainer for \"1ddac3029b86c4f0c8795ece94a9b830ffa55eddbff9225314ba154aeab69082\" returns successfully" May 27 02:49:28.791693 containerd[2012]: time="2025-05-27T02:49:28.791560245Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:28.796382 containerd[2012]: time="2025-05-27T02:49:28.794928873Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=77" May 27 02:49:28.804036 containerd[2012]: time="2025-05-27T02:49:28.803962005Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"45822470\" in 380.666786ms" May 27 02:49:28.804036 containerd[2012]: time="2025-05-27T02:49:28.804027873Z" level=info
msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\"" May 27 02:49:28.812184 containerd[2012]: time="2025-05-27T02:49:28.812134677Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\"" May 27 02:49:28.814918 containerd[2012]: time="2025-05-27T02:49:28.814851585Z" level=info msg="CreateContainer within sandbox \"7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 02:49:28.852809 containerd[2012]: time="2025-05-27T02:49:28.852533745Z" level=info msg="Container 1d5394c7b02ef153382332b30087bf8054da97414244dcc6a59d7f99c3c6611b: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:28.862955 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3927369151.mount: Deactivated successfully. May 27 02:49:28.875955 containerd[2012]: time="2025-05-27T02:49:28.875485605Z" level=info msg="CreateContainer within sandbox \"7637b88fc6e40e9f35eb0211210683f457c277995f05587a82e1392099c41bb0\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"1d5394c7b02ef153382332b30087bf8054da97414244dcc6a59d7f99c3c6611b\"" May 27 02:49:28.878206 containerd[2012]: time="2025-05-27T02:49:28.878142789Z" level=info msg="StartContainer for \"1d5394c7b02ef153382332b30087bf8054da97414244dcc6a59d7f99c3c6611b\"" May 27 02:49:28.883313 containerd[2012]: time="2025-05-27T02:49:28.883227033Z" level=info msg="connecting to shim 1d5394c7b02ef153382332b30087bf8054da97414244dcc6a59d7f99c3c6611b" address="unix:///run/containerd/s/ff0d89b4342d12bfc23754077b105e0fe668d60a8833e53276d0d94d2b00c159" protocol=ttrpc version=3 May 27 02:49:28.930226 systemd[1]: Started cri-containerd-1d5394c7b02ef153382332b30087bf8054da97414244dcc6a59d7f99c3c6611b.scope - libcontainer container 1d5394c7b02ef153382332b30087bf8054da97414244dcc6a59d7f99c3c6611b. 
May 27 02:49:29.042518 containerd[2012]: time="2025-05-27T02:49:29.041311926Z" level=info msg="StartContainer for \"1d5394c7b02ef153382332b30087bf8054da97414244dcc6a59d7f99c3c6611b\" returns successfully" May 27 02:49:30.169717 kubelet[3281]: I0527 02:49:30.169615 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-65948d5f-rqgjr" podStartSLOduration=32.868561027 podStartE2EDuration="37.169592072s" podCreationTimestamp="2025-05-27 02:48:53 +0000 UTC" firstStartedPulling="2025-05-27 02:49:24.506301844 +0000 UTC m=+49.571174108" lastFinishedPulling="2025-05-27 02:49:28.807332817 +0000 UTC m=+53.872205153" observedRunningTime="2025-05-27 02:49:29.796117486 +0000 UTC m=+54.860989786" watchObservedRunningTime="2025-05-27 02:49:30.169592072 +0000 UTC m=+55.234464348" May 27 02:49:30.774737 kubelet[3281]: I0527 02:49:30.774466 3281 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 02:49:34.467840 containerd[2012]: time="2025-05-27T02:49:34.467246509Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:34.469971 containerd[2012]: time="2025-05-27T02:49:34.469912117Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.0: active requests=0, bytes read=48045219" May 27 02:49:34.477880 containerd[2012]: time="2025-05-27T02:49:34.476069353Z" level=info msg="ImageCreate event name:\"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:34.480188 containerd[2012]: time="2025-05-27T02:49:34.480007009Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:34.482512 containerd[2012]: time="2025-05-27T02:49:34.481606117Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" with image id \"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\", size \"49414428\" in 5.668093792s" May 27 02:49:34.482512 containerd[2012]: time="2025-05-27T02:49:34.481654405Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" returns image reference \"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\"" May 27 02:49:34.486143 containerd[2012]: time="2025-05-27T02:49:34.486061405Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 02:49:34.514311 containerd[2012]: time="2025-05-27T02:49:34.514251433Z" level=info msg="CreateContainer within sandbox \"33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 27 02:49:34.569615 containerd[2012]: time="2025-05-27T02:49:34.569550506Z" level=info msg="Container 7151b369ace0ba5c04db2308ce2ddfe90bd1f246ddd478eaa064be58a6c28437: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:34.583501 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1940836904.mount: Deactivated successfully. 
May 27 02:49:34.623296 containerd[2012]: time="2025-05-27T02:49:34.623230694Z" level=info msg="CreateContainer within sandbox \"33ba18f84a1f2560f0e2d50a650b835d149c9eb88873c78a388f036a1eb86d82\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"7151b369ace0ba5c04db2308ce2ddfe90bd1f246ddd478eaa064be58a6c28437\"" May 27 02:49:34.624608 containerd[2012]: time="2025-05-27T02:49:34.624518042Z" level=info msg="StartContainer for \"7151b369ace0ba5c04db2308ce2ddfe90bd1f246ddd478eaa064be58a6c28437\"" May 27 02:49:34.627079 containerd[2012]: time="2025-05-27T02:49:34.626959658Z" level=info msg="connecting to shim 7151b369ace0ba5c04db2308ce2ddfe90bd1f246ddd478eaa064be58a6c28437" address="unix:///run/containerd/s/ba7d9cf16a6f2d83b5d8b7083cb13dbac3725b06ea497ded6ea471902d163260" protocol=ttrpc version=3 May 27 02:49:34.671103 systemd[1]: Started cri-containerd-7151b369ace0ba5c04db2308ce2ddfe90bd1f246ddd478eaa064be58a6c28437.scope - libcontainer container 7151b369ace0ba5c04db2308ce2ddfe90bd1f246ddd478eaa064be58a6c28437. May 27 02:49:34.758121 containerd[2012]: time="2025-05-27T02:49:34.757924502Z" level=info msg="StartContainer for \"7151b369ace0ba5c04db2308ce2ddfe90bd1f246ddd478eaa064be58a6c28437\" returns successfully" May 27 02:49:34.792952 containerd[2012]: time="2025-05-27T02:49:34.792881727Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:34.795812 containerd[2012]: time="2025-05-27T02:49:34.795536415Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:34.795812 containerd[2012]: time="2025-05-27T02:49:34.795718983Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 02:49:34.797113 kubelet[3281]: E0527 02:49:34.796308 3281 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 02:49:34.797113 kubelet[3281]: E0527 02:49:34.796379 3281 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 02:49:34.798705 containerd[2012]: time="2025-05-27T02:49:34.798324567Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\"" May 27 02:49:34.802756 kubelet[3281]: E0527 02:49:34.801255 3281 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c5drb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-hkfqv_calico-system(01f59b83-1b36-4034-ac04-b88b89b8506a): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:34.805818 kubelet[3281]: E0527 02:49:34.804707 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to 
resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hkfqv" podUID="01f59b83-1b36-4034-ac04-b88b89b8506a" May 27 02:49:34.945076 containerd[2012]: time="2025-05-27T02:49:34.944619579Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7151b369ace0ba5c04db2308ce2ddfe90bd1f246ddd478eaa064be58a6c28437\" id:\"bfa8bef82b2748b918b1a84aa53a91e5f1ac35d70aa75e3b64789a7c9cfa4e28\" pid:5682 exited_at:{seconds:1748314174 nanos:941190483}" May 27 02:49:34.977307 kubelet[3281]: I0527 02:49:34.977215 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6f5d8dffc5-jknpx" podStartSLOduration=25.050166531 podStartE2EDuration="34.977194456s" podCreationTimestamp="2025-05-27 02:49:00 +0000 UTC" firstStartedPulling="2025-05-27 02:49:24.557295736 +0000 UTC m=+49.622168000" lastFinishedPulling="2025-05-27 02:49:34.484323661 +0000 UTC m=+59.549195925" observedRunningTime="2025-05-27 02:49:34.843490827 +0000 UTC m=+59.908363115" watchObservedRunningTime="2025-05-27 02:49:34.977194456 +0000 UTC m=+60.042066720" May 27 02:49:35.811176 kubelet[3281]: E0527 02:49:35.810671 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hkfqv" podUID="01f59b83-1b36-4034-ac04-b88b89b8506a" May 27 02:49:36.430809 containerd[2012]: time="2025-05-27T02:49:36.430342275Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:36.433725 containerd[2012]: time="2025-05-27T02:49:36.433208487Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0: active requests=0, bytes read=13749925" May 27 02:49:36.435435 containerd[2012]: time="2025-05-27T02:49:36.435339759Z" level=info msg="ImageCreate event name:\"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:36.449254 containerd[2012]: time="2025-05-27T02:49:36.449177631Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:36.452866 containerd[2012]: time="2025-05-27T02:49:36.452709459Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" with image id \"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\", size \"15119118\" in 1.654332308s" 
May 27 02:49:36.453976 containerd[2012]: time="2025-05-27T02:49:36.452944239Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" returns image reference \"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\"" May 27 02:49:36.456708 containerd[2012]: time="2025-05-27T02:49:36.456083559Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 02:49:36.460515 containerd[2012]: time="2025-05-27T02:49:36.460449375Z" level=info msg="CreateContainer within sandbox \"ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 27 02:49:36.475083 containerd[2012]: time="2025-05-27T02:49:36.475012287Z" level=info msg="Container 86703034f71c2a8baadb6709d5318dd572afa5239e2c881993ae21d0880f837e: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:36.492592 containerd[2012]: time="2025-05-27T02:49:36.492524439Z" level=info msg="CreateContainer within sandbox \"ca3ac5551ee37447bc9a8d562f30faf3f8af4f10027c4a11d389f3cda35a865b\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"86703034f71c2a8baadb6709d5318dd572afa5239e2c881993ae21d0880f837e\"" May 27 02:49:36.494535 containerd[2012]: time="2025-05-27T02:49:36.494304939Z" level=info msg="StartContainer for \"86703034f71c2a8baadb6709d5318dd572afa5239e2c881993ae21d0880f837e\"" May 27 02:49:36.502804 containerd[2012]: time="2025-05-27T02:49:36.502212099Z" level=info msg="connecting to shim 86703034f71c2a8baadb6709d5318dd572afa5239e2c881993ae21d0880f837e" address="unix:///run/containerd/s/0e607d18011b71843e90d0d89f8ade1b5e99e0fa3829b7c5345474caab41cf68" protocol=ttrpc version=3 May 27 02:49:36.545090 systemd[1]: Started cri-containerd-86703034f71c2a8baadb6709d5318dd572afa5239e2c881993ae21d0880f837e.scope - libcontainer container 86703034f71c2a8baadb6709d5318dd572afa5239e2c881993ae21d0880f837e. 
May 27 02:49:36.628616 containerd[2012]: time="2025-05-27T02:49:36.628549876Z" level=info msg="StartContainer for \"86703034f71c2a8baadb6709d5318dd572afa5239e2c881993ae21d0880f837e\" returns successfully" May 27 02:49:36.660579 containerd[2012]: time="2025-05-27T02:49:36.660511216Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:36.661568 containerd[2012]: time="2025-05-27T02:49:36.661500904Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:36.661820 containerd[2012]: time="2025-05-27T02:49:36.661541596Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 02:49:36.662499 kubelet[3281]: E0527 02:49:36.662010 3281 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:49:36.662924 kubelet[3281]: E0527 02:49:36.662723 3281 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:49:36.664326 kubelet[3281]: E0527 02:49:36.663883 3281 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1327877524034c62a2710c5fbd6feb5f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4v5h6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6888687c66-w5ggn_calico-system(e1a2f19c-cbfe-493b-984d-c722792ee820): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:36.668838 containerd[2012]: time="2025-05-27T02:49:36.668486512Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 02:49:36.842385 kubelet[3281]: I0527 02:49:36.842235 3281 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-9dsg5" podStartSLOduration=24.120847034 podStartE2EDuration="36.842059349s" podCreationTimestamp="2025-05-27 02:49:00 +0000 UTC" firstStartedPulling="2025-05-27 02:49:23.734152048 +0000 UTC m=+48.799024300" lastFinishedPulling="2025-05-27 02:49:36.455364351 +0000 UTC m=+61.520236615" observedRunningTime="2025-05-27 02:49:36.839391653 +0000 UTC m=+61.904263929" watchObservedRunningTime="2025-05-27 02:49:36.842059349 +0000 UTC m=+61.906931697" May 27 02:49:36.853521 containerd[2012]: time="2025-05-27T02:49:36.852955949Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:36.855804 containerd[2012]: time="2025-05-27T02:49:36.854893013Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:36.855804 containerd[2012]: time="2025-05-27T02:49:36.854950157Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 02:49:36.856101 kubelet[3281]: E0527 02:49:36.855292 3281 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 02:49:36.856101 kubelet[3281]: E0527 02:49:36.855362 3281 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 02:49:36.856101 kubelet[3281]: E0527 02:49:36.855508 3281 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4v5h6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6888687c66-w5ggn_calico-system(e1a2f19c-cbfe-493b-984d-c722792ee820): ErrImagePull: failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:36.857286 kubelet[3281]: E0527 02:49:36.857189 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-6888687c66-w5ggn" podUID="e1a2f19c-cbfe-493b-984d-c722792ee820" May 27 02:49:37.425261 kubelet[3281]: I0527 02:49:37.425209 3281 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 27 02:49:37.425261 kubelet[3281]: I0527 02:49:37.425264 3281 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 27 02:49:39.553313 systemd[1]: Started sshd@7-172.31.28.205:22-139.178.68.195:40742.service - OpenSSH per-connection server daemon (139.178.68.195:40742). May 27 02:49:39.769986 sshd[5743]: Accepted publickey for core from 139.178.68.195 port 40742 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:49:39.773265 sshd-session[5743]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:49:39.783856 systemd-logind[1978]: New session 8 of user core. May 27 02:49:39.789150 systemd[1]: Started session-8.scope - Session 8 of User core. May 27 02:49:40.209535 sshd[5745]: Connection closed by 139.178.68.195 port 40742 May 27 02:49:40.209405 sshd-session[5743]: pam_unix(sshd:session): session closed for user core May 27 02:49:40.222300 systemd[1]: sshd@7-172.31.28.205:22-139.178.68.195:40742.service: Deactivated successfully. May 27 02:49:40.229941 systemd[1]: session-8.scope: Deactivated successfully. May 27 02:49:40.234312 systemd-logind[1978]: Session 8 logged out. Waiting for processes to exit. May 27 02:49:40.238556 systemd-logind[1978]: Removed session 8. May 27 02:49:45.250725 systemd[1]: Started sshd@8-172.31.28.205:22-139.178.68.195:33548.service - OpenSSH per-connection server daemon (139.178.68.195:33548). May 27 02:49:45.458537 sshd[5775]: Accepted publickey for core from 139.178.68.195 port 33548 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:49:45.461389 sshd-session[5775]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:49:45.470871 systemd-logind[1978]: New session 9 of user core. 
May 27 02:49:45.474297 systemd[1]: Started session-9.scope - Session 9 of User core. May 27 02:49:45.730261 sshd[5777]: Connection closed by 139.178.68.195 port 33548 May 27 02:49:45.730037 sshd-session[5775]: pam_unix(sshd:session): session closed for user core May 27 02:49:45.736598 systemd[1]: sshd@8-172.31.28.205:22-139.178.68.195:33548.service: Deactivated successfully. May 27 02:49:45.742758 systemd[1]: session-9.scope: Deactivated successfully. May 27 02:49:45.748560 systemd-logind[1978]: Session 9 logged out. Waiting for processes to exit. May 27 02:49:45.753127 systemd-logind[1978]: Removed session 9. May 27 02:49:47.260790 kubelet[3281]: E0527 02:49:47.260551 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-6888687c66-w5ggn" podUID="e1a2f19c-cbfe-493b-984d-c722792ee820" May 27 02:49:49.259162 containerd[2012]: time="2025-05-27T02:49:49.259092074Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 02:49:49.456491 containerd[2012]: time="2025-05-27T02:49:49.456382791Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:49.459305 containerd[2012]: time="2025-05-27T02:49:49.459142095Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:49.459476 containerd[2012]: time="2025-05-27T02:49:49.459142419Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 02:49:49.460343 kubelet[3281]: E0527 02:49:49.459562 3281 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 02:49:49.460343 kubelet[3281]: E0527 02:49:49.459632 3281 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 02:49:49.460343 kubelet[3281]: E0527 02:49:49.459918 3281 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c5drb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-hkfqv_calico-system(01f59b83-1b36-4034-ac04-b88b89b8506a): ErrImagePull: failed to pull and unpack image 
\"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:49.462308 kubelet[3281]: E0527 02:49:49.462040 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hkfqv" podUID="01f59b83-1b36-4034-ac04-b88b89b8506a" May 27 02:49:49.757663 containerd[2012]: time="2025-05-27T02:49:49.757581077Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f8837e98ab5de6363ffc6e5280f6c56c8d6cb6734edc7b51ac534e6206b8fc0e\" id:\"743a07d096956579bc61ec07a00bb0b84dca684dfc2b3773f6b2ac07d776cb31\" pid:5803 exited_at:{seconds:1748314189 nanos:756615365}" May 27 02:49:50.768169 systemd[1]: Started sshd@9-172.31.28.205:22-139.178.68.195:33550.service - OpenSSH per-connection server daemon (139.178.68.195:33550). May 27 02:49:50.974983 sshd[5816]: Accepted publickey for core from 139.178.68.195 port 33550 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:49:50.978189 sshd-session[5816]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:49:50.988666 systemd-logind[1978]: New session 10 of user core. May 27 02:49:50.993098 systemd[1]: Started session-10.scope - Session 10 of User core. May 27 02:49:51.266835 sshd[5818]: Connection closed by 139.178.68.195 port 33550 May 27 02:49:51.267830 sshd-session[5816]: pam_unix(sshd:session): session closed for user core May 27 02:49:51.275236 systemd[1]: sshd@9-172.31.28.205:22-139.178.68.195:33550.service: Deactivated successfully. May 27 02:49:51.278716 systemd[1]: session-10.scope: Deactivated successfully. May 27 02:49:51.280944 systemd-logind[1978]: Session 10 logged out. Waiting for processes to exit. May 27 02:49:51.283650 systemd-logind[1978]: Removed session 10. May 27 02:49:51.305964 systemd[1]: Started sshd@10-172.31.28.205:22-139.178.68.195:33560.service - OpenSSH per-connection server daemon (139.178.68.195:33560). May 27 02:49:51.501540 sshd[5831]: Accepted publickey for core from 139.178.68.195 port 33560 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:49:51.504716 sshd-session[5831]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:49:51.512902 systemd-logind[1978]: New session 11 of user core. May 27 02:49:51.520245 systemd[1]: Started session-11.scope - Session 11 of User core. May 27 02:49:51.857887 sshd[5833]: Connection closed by 139.178.68.195 port 33560 May 27 02:49:51.861085 sshd-session[5831]: pam_unix(sshd:session): session closed for user core May 27 02:49:51.873582 systemd[1]: sshd@10-172.31.28.205:22-139.178.68.195:33560.service: Deactivated successfully. May 27 02:49:51.883097 systemd[1]: session-11.scope: Deactivated successfully. May 27 02:49:51.886550 systemd-logind[1978]: Session 11 logged out. Waiting for processes to exit. 
May 27 02:49:51.914715 systemd[1]: Started sshd@11-172.31.28.205:22-139.178.68.195:33574.service - OpenSSH per-connection server daemon (139.178.68.195:33574). May 27 02:49:51.920964 systemd-logind[1978]: Removed session 11. May 27 02:49:52.122187 sshd[5843]: Accepted publickey for core from 139.178.68.195 port 33574 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:49:52.125173 sshd-session[5843]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:49:52.133401 systemd-logind[1978]: New session 12 of user core. May 27 02:49:52.138034 systemd[1]: Started session-12.scope - Session 12 of User core. May 27 02:49:52.395148 sshd[5845]: Connection closed by 139.178.68.195 port 33574 May 27 02:49:52.396209 sshd-session[5843]: pam_unix(sshd:session): session closed for user core May 27 02:49:52.403523 systemd[1]: sshd@11-172.31.28.205:22-139.178.68.195:33574.service: Deactivated successfully. May 27 02:49:52.407985 systemd[1]: session-12.scope: Deactivated successfully. May 27 02:49:52.411233 systemd-logind[1978]: Session 12 logged out. Waiting for processes to exit. May 27 02:49:52.414753 systemd-logind[1978]: Removed session 12. May 27 02:49:57.442346 systemd[1]: Started sshd@12-172.31.28.205:22-139.178.68.195:54390.service - OpenSSH per-connection server daemon (139.178.68.195:54390). May 27 02:49:57.641299 sshd[5867]: Accepted publickey for core from 139.178.68.195 port 54390 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:49:57.643936 sshd-session[5867]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:49:57.652338 systemd-logind[1978]: New session 13 of user core. May 27 02:49:57.660089 systemd[1]: Started session-13.scope - Session 13 of User core. May 27 02:49:57.925089 sshd[5869]: Connection closed by 139.178.68.195 port 54390 May 27 02:49:57.925849 sshd-session[5867]: pam_unix(sshd:session): session closed for user core May 27 02:49:57.932954 systemd-logind[1978]: Session 13 logged out. Waiting for processes to exit. May 27 02:49:57.933642 systemd[1]: sshd@12-172.31.28.205:22-139.178.68.195:54390.service: Deactivated successfully. May 27 02:49:57.937135 systemd[1]: session-13.scope: Deactivated successfully. May 27 02:49:57.943846 systemd-logind[1978]: Removed session 13. 
May 27 02:49:58.257545 containerd[2012]: time="2025-05-27T02:49:58.257493551Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 02:49:58.449193 containerd[2012]: time="2025-05-27T02:49:58.449092308Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:58.451584 containerd[2012]: time="2025-05-27T02:49:58.451390344Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:58.451584 containerd[2012]: time="2025-05-27T02:49:58.451450284Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 02:49:58.451802 kubelet[3281]: E0527 02:49:58.451700 3281 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:49:58.452398 kubelet[3281]: E0527 02:49:58.451794 3281 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:49:58.452398 kubelet[3281]: E0527 02:49:58.451938 3281 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1327877524034c62a2710c5fbd6feb5f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4v5h6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6888687c66-w5ggn_calico-system(e1a2f19c-cbfe-493b-984d-c722792ee820): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:58.457294 containerd[2012]: time="2025-05-27T02:49:58.457227012Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 02:49:58.662071 containerd[2012]: time="2025-05-27T02:49:58.661896949Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:58.664297 containerd[2012]: time="2025-05-27T02:49:58.664076665Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:58.664297 containerd[2012]: time="2025-05-27T02:49:58.664093477Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 02:49:58.664842 kubelet[3281]: E0527 02:49:58.664748 3281 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 02:49:58.665000 kubelet[3281]: E0527 02:49:58.664855 3281 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 02:49:58.665093 kubelet[3281]: E0527 02:49:58.665015 3281 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4v5h6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6888687c66-w5ggn_calico-system(e1a2f19c-cbfe-493b-984d-c722792ee820): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:58.666747 kubelet[3281]: E0527 02:49:58.666649 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-6888687c66-w5ggn" podUID="e1a2f19c-cbfe-493b-984d-c722792ee820" May 27 02:50:00.256212 kubelet[3281]: E0527 02:50:00.256125 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hkfqv" podUID="01f59b83-1b36-4034-ac04-b88b89b8506a" May 27 02:50:02.733814 kubelet[3281]: I0527 02:50:02.732387 3281 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 02:50:02.972208 systemd[1]: Started sshd@13-172.31.28.205:22-139.178.68.195:54402.service - OpenSSH per-connection server daemon (139.178.68.195:54402). May 27 02:50:03.195841 sshd[5889]: Accepted publickey for core from 139.178.68.195 port 54402 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:50:03.200970 sshd-session[5889]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:50:03.212704 systemd-logind[1978]: New session 14 of user core. May 27 02:50:03.222340 systemd[1]: Started session-14.scope - Session 14 of User core. May 27 02:50:03.507844 sshd[5891]: Connection closed by 139.178.68.195 port 54402 May 27 02:50:03.508684 sshd-session[5889]: pam_unix(sshd:session): session closed for user core May 27 02:50:03.515549 systemd[1]: sshd@13-172.31.28.205:22-139.178.68.195:54402.service: Deactivated successfully. May 27 02:50:03.521360 systemd[1]: session-14.scope: Deactivated successfully. May 27 02:50:03.526192 systemd-logind[1978]: Session 14 logged out. Waiting for processes to exit. May 27 02:50:03.528440 systemd-logind[1978]: Removed session 14. May 27 02:50:04.877705 containerd[2012]: time="2025-05-27T02:50:04.877508756Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7151b369ace0ba5c04db2308ce2ddfe90bd1f246ddd478eaa064be58a6c28437\" id:\"03f89765ed880e2b89ae6824cdb5739aed39a750e2afd61acde461f26117ad1a\" pid:5916 exited_at:{seconds:1748314204 nanos:877062980}" May 27 02:50:08.546401 systemd[1]: Started sshd@14-172.31.28.205:22-139.178.68.195:40700.service - OpenSSH per-connection server daemon (139.178.68.195:40700). 
May 27 02:50:08.747374 sshd[5928]: Accepted publickey for core from 139.178.68.195 port 40700 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:50:08.750148 sshd-session[5928]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:50:08.759612 systemd-logind[1978]: New session 15 of user core. May 27 02:50:08.766045 systemd[1]: Started session-15.scope - Session 15 of User core. May 27 02:50:09.032709 sshd[5930]: Connection closed by 139.178.68.195 port 40700 May 27 02:50:09.034006 sshd-session[5928]: pam_unix(sshd:session): session closed for user core May 27 02:50:09.041679 systemd-logind[1978]: Session 15 logged out. Waiting for processes to exit. May 27 02:50:09.043348 systemd[1]: sshd@14-172.31.28.205:22-139.178.68.195:40700.service: Deactivated successfully. May 27 02:50:09.047493 systemd[1]: session-15.scope: Deactivated successfully. May 27 02:50:09.050997 systemd-logind[1978]: Removed session 15. May 27 02:50:10.259633 kubelet[3281]: E0527 02:50:10.259424 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-6888687c66-w5ggn" podUID="e1a2f19c-cbfe-493b-984d-c722792ee820" May 27 02:50:13.265398 containerd[2012]: time="2025-05-27T02:50:13.265066346Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 02:50:13.448985 containerd[2012]: time="2025-05-27T02:50:13.448910835Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:50:13.451472 containerd[2012]: time="2025-05-27T02:50:13.451374003Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:50:13.451693 containerd[2012]: time="2025-05-27T02:50:13.451537119Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 02:50:13.453424 kubelet[3281]: E0527 02:50:13.451911 3281 log.go:32] "PullImage 
from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 02:50:13.453424 kubelet[3281]: E0527 02:50:13.451978 3281 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 02:50:13.453424 kubelet[3281]: E0527 02:50:13.452158 3281 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c5drb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-hkfqv_calico-system(01f59b83-1b36-4034-ac04-b88b89b8506a): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:50:13.455160 kubelet[3281]: E0527 02:50:13.453649 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hkfqv" podUID="01f59b83-1b36-4034-ac04-b88b89b8506a" May 27 02:50:14.070245 systemd[1]: Started sshd@15-172.31.28.205:22-139.178.68.195:51852.service - OpenSSH per-connection server daemon (139.178.68.195:51852). May 27 02:50:14.283122 sshd[5944]: Accepted publickey for core from 139.178.68.195 port 51852 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:50:14.286355 sshd-session[5944]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:50:14.294564 systemd-logind[1978]: New session 16 of user core. May 27 02:50:14.304048 systemd[1]: Started session-16.scope - Session 16 of User core. May 27 02:50:14.561408 sshd[5946]: Connection closed by 139.178.68.195 port 51852 May 27 02:50:14.562105 sshd-session[5944]: pam_unix(sshd:session): session closed for user core May 27 02:50:14.568734 systemd[1]: sshd@15-172.31.28.205:22-139.178.68.195:51852.service: Deactivated successfully. May 27 02:50:14.572752 systemd[1]: session-16.scope: Deactivated successfully. May 27 02:50:14.574907 systemd-logind[1978]: Session 16 logged out. Waiting for processes to exit. May 27 02:50:14.578883 systemd-logind[1978]: Removed session 16. May 27 02:50:14.601833 systemd[1]: Started sshd@16-172.31.28.205:22-139.178.68.195:51864.service - OpenSSH per-connection server daemon (139.178.68.195:51864). 
May 27 02:50:14.800558 sshd[5958]: Accepted publickey for core from 139.178.68.195 port 51864 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:50:14.803453 sshd-session[5958]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:50:14.811692 systemd-logind[1978]: New session 17 of user core. May 27 02:50:14.824062 systemd[1]: Started session-17.scope - Session 17 of User core. May 27 02:50:15.360673 sshd[5960]: Connection closed by 139.178.68.195 port 51864 May 27 02:50:15.362503 sshd-session[5958]: pam_unix(sshd:session): session closed for user core May 27 02:50:15.369644 systemd[1]: sshd@16-172.31.28.205:22-139.178.68.195:51864.service: Deactivated successfully. May 27 02:50:15.375260 systemd[1]: session-17.scope: Deactivated successfully. May 27 02:50:15.381000 systemd-logind[1978]: Session 17 logged out. Waiting for processes to exit. May 27 02:50:15.395987 systemd-logind[1978]: Removed session 17. May 27 02:50:15.398217 systemd[1]: Started sshd@17-172.31.28.205:22-139.178.68.195:51872.service - OpenSSH per-connection server daemon (139.178.68.195:51872). May 27 02:50:15.621146 sshd[5969]: Accepted publickey for core from 139.178.68.195 port 51872 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:50:15.624007 sshd-session[5969]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:50:15.637993 systemd-logind[1978]: New session 18 of user core. May 27 02:50:15.645392 systemd[1]: Started session-18.scope - Session 18 of User core. May 27 02:50:17.057800 sshd[5971]: Connection closed by 139.178.68.195 port 51872 May 27 02:50:17.058861 sshd-session[5969]: pam_unix(sshd:session): session closed for user core May 27 02:50:17.068746 systemd[1]: sshd@17-172.31.28.205:22-139.178.68.195:51872.service: Deactivated successfully. May 27 02:50:17.075546 systemd[1]: session-18.scope: Deactivated successfully. May 27 02:50:17.079567 systemd-logind[1978]: Session 18 logged out. Waiting for processes to exit. May 27 02:50:17.100342 systemd[1]: Started sshd@18-172.31.28.205:22-139.178.68.195:51880.service - OpenSSH per-connection server daemon (139.178.68.195:51880). May 27 02:50:17.103692 systemd-logind[1978]: Removed session 18. May 27 02:50:17.303109 sshd[5987]: Accepted publickey for core from 139.178.68.195 port 51880 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:50:17.306327 sshd-session[5987]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:50:17.316047 systemd-logind[1978]: New session 19 of user core. May 27 02:50:17.323055 systemd[1]: Started session-19.scope - Session 19 of User core. May 27 02:50:17.868675 sshd[5990]: Connection closed by 139.178.68.195 port 51880 May 27 02:50:17.869367 sshd-session[5987]: pam_unix(sshd:session): session closed for user core May 27 02:50:17.879172 systemd[1]: sshd@18-172.31.28.205:22-139.178.68.195:51880.service: Deactivated successfully. May 27 02:50:17.886152 systemd[1]: session-19.scope: Deactivated successfully. May 27 02:50:17.889205 systemd-logind[1978]: Session 19 logged out. Waiting for processes to exit. May 27 02:50:17.908236 systemd[1]: Started sshd@19-172.31.28.205:22-139.178.68.195:51884.service - OpenSSH per-connection server daemon (139.178.68.195:51884). May 27 02:50:17.910441 systemd-logind[1978]: Removed session 19. 
May 27 02:50:18.112522 sshd[6000]: Accepted publickey for core from 139.178.68.195 port 51884 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:50:18.115155 sshd-session[6000]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:50:18.123463 systemd-logind[1978]: New session 20 of user core. May 27 02:50:18.133064 systemd[1]: Started session-20.scope - Session 20 of User core. May 27 02:50:18.393213 sshd[6002]: Connection closed by 139.178.68.195 port 51884 May 27 02:50:18.394021 sshd-session[6000]: pam_unix(sshd:session): session closed for user core May 27 02:50:18.401438 systemd[1]: sshd@19-172.31.28.205:22-139.178.68.195:51884.service: Deactivated successfully. May 27 02:50:18.404744 systemd[1]: session-20.scope: Deactivated successfully. May 27 02:50:18.409156 systemd-logind[1978]: Session 20 logged out. Waiting for processes to exit. May 27 02:50:18.413496 systemd-logind[1978]: Removed session 20. May 27 02:50:19.735650 containerd[2012]: time="2025-05-27T02:50:19.735566482Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f8837e98ab5de6363ffc6e5280f6c56c8d6cb6734edc7b51ac534e6206b8fc0e\" id:\"4a3fa98c255e1c5d648b74b453c2dfaacecb47d6a687358b1680dce44d4fd557\" pid:6025 exited_at:{seconds:1748314219 nanos:735080218}" May 27 02:50:21.257926 kubelet[3281]: E0527 02:50:21.257761 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-6888687c66-w5ggn" podUID="e1a2f19c-cbfe-493b-984d-c722792ee820" May 27 02:50:23.439650 systemd[1]: Started sshd@20-172.31.28.205:22-139.178.68.195:51900.service - OpenSSH per-connection server daemon (139.178.68.195:51900). May 27 02:50:23.646263 sshd[6038]: Accepted publickey for core from 139.178.68.195 port 51900 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:50:23.649663 sshd-session[6038]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:50:23.658852 systemd-logind[1978]: New session 21 of user core. May 27 02:50:23.667042 systemd[1]: Started session-21.scope - Session 21 of User core. May 27 02:50:23.923816 sshd[6040]: Connection closed by 139.178.68.195 port 51900 May 27 02:50:23.924297 sshd-session[6038]: pam_unix(sshd:session): session closed for user core May 27 02:50:23.932454 systemd-logind[1978]: Session 21 logged out. Waiting for processes to exit. 
May 27 02:50:23.934404 systemd[1]: sshd@20-172.31.28.205:22-139.178.68.195:51900.service: Deactivated successfully. May 27 02:50:23.941365 systemd[1]: session-21.scope: Deactivated successfully. May 27 02:50:23.944391 systemd-logind[1978]: Removed session 21. May 27 02:50:24.105644 containerd[2012]: time="2025-05-27T02:50:24.105580980Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7151b369ace0ba5c04db2308ce2ddfe90bd1f246ddd478eaa064be58a6c28437\" id:\"777d5e5200f06ec196fbaefd9972b127d9b993aed2d211b1c80bcb6c592d3f96\" pid:6062 exited_at:{seconds:1748314224 nanos:104248104}" May 27 02:50:26.257171 kubelet[3281]: E0527 02:50:26.257083 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hkfqv" podUID="01f59b83-1b36-4034-ac04-b88b89b8506a" May 27 02:50:28.964117 systemd[1]: Started sshd@21-172.31.28.205:22-139.178.68.195:34026.service - OpenSSH per-connection server daemon (139.178.68.195:34026). May 27 02:50:29.177466 sshd[6074]: Accepted publickey for core from 139.178.68.195 port 34026 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:50:29.181739 sshd-session[6074]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:50:29.193492 systemd-logind[1978]: New session 22 of user core. May 27 02:50:29.202095 systemd[1]: Started session-22.scope - Session 22 of User core. May 27 02:50:29.542804 sshd[6076]: Connection closed by 139.178.68.195 port 34026 May 27 02:50:29.541228 sshd-session[6074]: pam_unix(sshd:session): session closed for user core May 27 02:50:29.552128 systemd[1]: sshd@21-172.31.28.205:22-139.178.68.195:34026.service: Deactivated successfully. May 27 02:50:29.558192 systemd[1]: session-22.scope: Deactivated successfully. May 27 02:50:29.561161 systemd-logind[1978]: Session 22 logged out. Waiting for processes to exit. May 27 02:50:29.567559 systemd-logind[1978]: Removed session 22. 
May 27 02:50:33.259392 kubelet[3281]: E0527 02:50:33.259183 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-6888687c66-w5ggn" podUID="e1a2f19c-cbfe-493b-984d-c722792ee820" May 27 02:50:34.584623 systemd[1]: Started sshd@22-172.31.28.205:22-139.178.68.195:42818.service - OpenSSH per-connection server daemon (139.178.68.195:42818). May 27 02:50:34.810410 sshd[6088]: Accepted publickey for core from 139.178.68.195 port 42818 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:50:34.817012 sshd-session[6088]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:50:34.838937 systemd-logind[1978]: New session 23 of user core. May 27 02:50:34.852046 systemd[1]: Started session-23.scope - Session 23 of User core. May 27 02:50:35.077433 containerd[2012]: time="2025-05-27T02:50:35.077366530Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7151b369ace0ba5c04db2308ce2ddfe90bd1f246ddd478eaa064be58a6c28437\" id:\"b8502aa032c8f8608471e487b1341cd5eb0243cabc1168d29810a5219eaa5c1a\" pid:6103 exited_at:{seconds:1748314235 nanos:75390478}" May 27 02:50:35.187326 sshd[6098]: Connection closed by 139.178.68.195 port 42818 May 27 02:50:35.188537 sshd-session[6088]: pam_unix(sshd:session): session closed for user core May 27 02:50:35.200648 systemd[1]: sshd@22-172.31.28.205:22-139.178.68.195:42818.service: Deactivated successfully. May 27 02:50:35.209372 systemd[1]: session-23.scope: Deactivated successfully. May 27 02:50:35.215445 systemd-logind[1978]: Session 23 logged out. Waiting for processes to exit. May 27 02:50:35.219413 systemd-logind[1978]: Removed session 23. 
May 27 02:50:39.258491 kubelet[3281]: E0527 02:50:39.257527 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hkfqv" podUID="01f59b83-1b36-4034-ac04-b88b89b8506a" May 27 02:50:40.233201 systemd[1]: Started sshd@23-172.31.28.205:22-139.178.68.195:42822.service - OpenSSH per-connection server daemon (139.178.68.195:42822). May 27 02:50:40.463871 sshd[6124]: Accepted publickey for core from 139.178.68.195 port 42822 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0 May 27 02:50:40.465841 sshd-session[6124]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:50:40.478673 systemd-logind[1978]: New session 24 of user core. May 27 02:50:40.487209 systemd[1]: Started session-24.scope - Session 24 of User core. May 27 02:50:40.832362 sshd[6132]: Connection closed by 139.178.68.195 port 42822 May 27 02:50:40.832226 sshd-session[6124]: pam_unix(sshd:session): session closed for user core May 27 02:50:40.840642 systemd[1]: sshd@23-172.31.28.205:22-139.178.68.195:42822.service: Deactivated successfully. May 27 02:50:40.846574 systemd[1]: session-24.scope: Deactivated successfully. May 27 02:50:40.853534 systemd-logind[1978]: Session 24 logged out. Waiting for processes to exit. May 27 02:50:40.859276 systemd-logind[1978]: Removed session 24. 
May 27 02:50:45.258342 containerd[2012]: time="2025-05-27T02:50:45.258275337Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 02:50:45.462055 containerd[2012]: time="2025-05-27T02:50:45.461932642Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:50:45.465061 containerd[2012]: time="2025-05-27T02:50:45.464544802Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:50:45.465061 containerd[2012]: time="2025-05-27T02:50:45.464686006Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 02:50:45.465262 kubelet[3281]: E0527 02:50:45.465028 3281 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:50:45.466350 kubelet[3281]: E0527 02:50:45.465638 3281 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:50:45.466418 kubelet[3281]: E0527 02:50:45.466354 3281 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1327877524034c62a2710c5fbd6feb5f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4v5h6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6888687c66-w5ggn_calico-system(e1a2f19c-cbfe-493b-984d-c722792ee820): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 02:50:45.469835 containerd[2012]: time="2025-05-27T02:50:45.469760842Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\""
May 27 02:50:45.661229 containerd[2012]: time="2025-05-27T02:50:45.660541511Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 02:50:45.662909 containerd[2012]: time="2025-05-27T02:50:45.662819411Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden"
May 27 02:50:45.663185 containerd[2012]: time="2025-05-27T02:50:45.662818379Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86"
May 27 02:50:45.663346 kubelet[3281]: E0527 02:50:45.663154 3281 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0"
May 27 02:50:45.663346 kubelet[3281]: E0527 02:50:45.663217 3281 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0"
May 27 02:50:45.664190 kubelet[3281]: E0527 02:50:45.663371 3281 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4v5h6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6888687c66-w5ggn_calico-system(e1a2f19c-cbfe-493b-984d-c722792ee820): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 02:50:45.665100 kubelet[3281]: E0527 02:50:45.664500 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-6888687c66-w5ggn" podUID="e1a2f19c-cbfe-493b-984d-c722792ee820"
May 27 02:50:45.874436 systemd[1]: Started sshd@24-172.31.28.205:22-139.178.68.195:52644.service - OpenSSH per-connection server daemon (139.178.68.195:52644).
May 27 02:50:46.078588 sshd[6148]: Accepted publickey for core from 139.178.68.195 port 52644 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0
May 27 02:50:46.083054 sshd-session[6148]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 02:50:46.096248 systemd-logind[1978]: New session 25 of user core.
May 27 02:50:46.104545 systemd[1]: Started session-25.scope - Session 25 of User core.
May 27 02:50:46.439989 sshd[6150]: Connection closed by 139.178.68.195 port 52644
May 27 02:50:46.440081 sshd-session[6148]: pam_unix(sshd:session): session closed for user core
May 27 02:50:46.454653 systemd[1]: sshd@24-172.31.28.205:22-139.178.68.195:52644.service: Deactivated successfully.
May 27 02:50:46.461255 systemd[1]: session-25.scope: Deactivated successfully.
May 27 02:50:46.465887 systemd-logind[1978]: Session 25 logged out. Waiting for processes to exit.
May 27 02:50:46.469924 systemd-logind[1978]: Removed session 25.
May 27 02:50:49.819250 containerd[2012]: time="2025-05-27T02:50:49.819181395Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f8837e98ab5de6363ffc6e5280f6c56c8d6cb6734edc7b51ac534e6206b8fc0e\" id:\"7ebbcf094ca754c92d3ffd1d389d860d01b890b5ccad6a24d611766479e3cbfb\" pid:6174 exited_at:{seconds:1748314249 nanos:818363163}"
May 27 02:50:51.487619 systemd[1]: Started sshd@25-172.31.28.205:22-139.178.68.195:52656.service - OpenSSH per-connection server daemon (139.178.68.195:52656).
May 27 02:50:51.716471 sshd[6186]: Accepted publickey for core from 139.178.68.195 port 52656 ssh2: RSA SHA256:wB7DXbDl54cvXXypqfSM11xNUGSlmUqWSyH8J9Yllv0
May 27 02:50:51.719909 sshd-session[6186]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 02:50:51.730402 systemd-logind[1978]: New session 26 of user core.
May 27 02:50:51.738115 systemd[1]: Started session-26.scope - Session 26 of User core.
May 27 02:50:52.049719 sshd[6188]: Connection closed by 139.178.68.195 port 52656
May 27 02:50:52.052216 sshd-session[6186]: pam_unix(sshd:session): session closed for user core
May 27 02:50:52.061485 systemd-logind[1978]: Session 26 logged out. Waiting for processes to exit.
May 27 02:50:52.062392 systemd[1]: sshd@25-172.31.28.205:22-139.178.68.195:52656.service: Deactivated successfully.
May 27 02:50:52.068438 systemd[1]: session-26.scope: Deactivated successfully.
May 27 02:50:52.072396 systemd-logind[1978]: Removed session 26.
May 27 02:50:53.257920 kubelet[3281]: E0527 02:50:53.257666 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hkfqv" podUID="01f59b83-1b36-4034-ac04-b88b89b8506a"
May 27 02:51:00.258324 kubelet[3281]: E0527 02:51:00.258121 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-6888687c66-w5ggn" podUID="e1a2f19c-cbfe-493b-984d-c722792ee820"
May 27 02:51:04.873423 containerd[2012]: time="2025-05-27T02:51:04.873338178Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7151b369ace0ba5c04db2308ce2ddfe90bd1f246ddd478eaa064be58a6c28437\" id:\"ea78d2ee87a03bc40be4b45beee836bcec823d10476e41d4c29709f3c004262d\" pid:6232 exited_at:{seconds:1748314264 nanos:872600442}"
May 27 02:51:05.257631 containerd[2012]: time="2025-05-27T02:51:05.257571844Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\""
May 27 02:51:05.464528 containerd[2012]: time="2025-05-27T02:51:05.464450045Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 02:51:05.466546 containerd[2012]: time="2025-05-27T02:51:05.466433285Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden"
May 27 02:51:05.466546 containerd[2012]: time="2025-05-27T02:51:05.466509257Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86"
May 27 02:51:05.466898 kubelet[3281]: E0527 02:51:05.466730 3281 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 02:51:05.466898 kubelet[3281]: E0527 02:51:05.466824 3281 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 02:51:05.467559 kubelet[3281]: E0527 02:51:05.467157 3281 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c5drb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-hkfqv_calico-system(01f59b83-1b36-4034-ac04-b88b89b8506a): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 02:51:05.468534 kubelet[3281]: E0527 02:51:05.468468 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hkfqv" podUID="01f59b83-1b36-4034-ac04-b88b89b8506a"
May 27 02:51:05.545366 systemd[1]: cri-containerd-148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac.scope: Deactivated successfully.
May 27 02:51:05.545945 systemd[1]: cri-containerd-148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac.scope: Consumed 18.710s CPU time, 94.2M memory peak.
May 27 02:51:05.553457 containerd[2012]: time="2025-05-27T02:51:05.553407773Z" level=info msg="received exit event container_id:\"148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac\" id:\"148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac\" pid:3604 exit_status:1 exited_at:{seconds:1748314265 nanos:553033673}"
May 27 02:51:05.554211 containerd[2012]: time="2025-05-27T02:51:05.554075345Z" level=info msg="TaskExit event in podsandbox handler container_id:\"148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac\" id:\"148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac\" pid:3604 exit_status:1 exited_at:{seconds:1748314265 nanos:553033673}"
May 27 02:51:05.596852 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac-rootfs.mount: Deactivated successfully.
May 27 02:51:06.129788 kubelet[3281]: I0527 02:51:06.129742 3281 scope.go:117] "RemoveContainer" containerID="148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac"
May 27 02:51:06.159337 containerd[2012]: time="2025-05-27T02:51:06.159259096Z" level=info msg="CreateContainer within sandbox \"9401e3567d9e274441b0c21a1ad5c87ad2d604bb170ac0ec0649dcb765ad19da\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
May 27 02:51:06.180794 containerd[2012]: time="2025-05-27T02:51:06.177712325Z" level=info msg="Container 8a46f3d0f194970b085b7c6d117dd89c4f4fc3c7de968036fcf3a5542fed34d2: CDI devices from CRI Config.CDIDevices: []"
May 27 02:51:06.193408 containerd[2012]: time="2025-05-27T02:51:06.193346501Z" level=info msg="CreateContainer within sandbox \"9401e3567d9e274441b0c21a1ad5c87ad2d604bb170ac0ec0649dcb765ad19da\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"8a46f3d0f194970b085b7c6d117dd89c4f4fc3c7de968036fcf3a5542fed34d2\""
May 27 02:51:06.200345 containerd[2012]: time="2025-05-27T02:51:06.200276405Z" level=info msg="StartContainer for \"8a46f3d0f194970b085b7c6d117dd89c4f4fc3c7de968036fcf3a5542fed34d2\""
May 27 02:51:06.201920 containerd[2012]: time="2025-05-27T02:51:06.201831101Z" level=info msg="connecting to shim 8a46f3d0f194970b085b7c6d117dd89c4f4fc3c7de968036fcf3a5542fed34d2" address="unix:///run/containerd/s/ce32d09926bbaaee38fb866e55ebeaf036e7cd4c414b369fe43cb25b036f6f6d" protocol=ttrpc version=3
May 27 02:51:06.245083 systemd[1]: Started cri-containerd-8a46f3d0f194970b085b7c6d117dd89c4f4fc3c7de968036fcf3a5542fed34d2.scope - libcontainer container 8a46f3d0f194970b085b7c6d117dd89c4f4fc3c7de968036fcf3a5542fed34d2.
May 27 02:51:06.305399 containerd[2012]: time="2025-05-27T02:51:06.305234453Z" level=info msg="StartContainer for \"8a46f3d0f194970b085b7c6d117dd89c4f4fc3c7de968036fcf3a5542fed34d2\" returns successfully"
May 27 02:51:06.964942 systemd[1]: cri-containerd-378c1b678befa560897250fcaeb60cd38293ced5dc93e1d61be601f642cdfadb.scope: Deactivated successfully.
May 27 02:51:06.968065 systemd[1]: cri-containerd-378c1b678befa560897250fcaeb60cd38293ced5dc93e1d61be601f642cdfadb.scope: Consumed 5.179s CPU time, 61.9M memory peak, 128K read from disk.
May 27 02:51:06.980485 containerd[2012]: time="2025-05-27T02:51:06.980420301Z" level=info msg="received exit event container_id:\"378c1b678befa560897250fcaeb60cd38293ced5dc93e1d61be601f642cdfadb\" id:\"378c1b678befa560897250fcaeb60cd38293ced5dc93e1d61be601f642cdfadb\" pid:3116 exit_status:1 exited_at:{seconds:1748314266 nanos:979347897}"
May 27 02:51:06.980861 containerd[2012]: time="2025-05-27T02:51:06.980806089Z" level=info msg="TaskExit event in podsandbox handler container_id:\"378c1b678befa560897250fcaeb60cd38293ced5dc93e1d61be601f642cdfadb\" id:\"378c1b678befa560897250fcaeb60cd38293ced5dc93e1d61be601f642cdfadb\" pid:3116 exit_status:1 exited_at:{seconds:1748314266 nanos:979347897}"
May 27 02:51:07.024708 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-378c1b678befa560897250fcaeb60cd38293ced5dc93e1d61be601f642cdfadb-rootfs.mount: Deactivated successfully.
May 27 02:51:07.142198 kubelet[3281]: I0527 02:51:07.142133 3281 scope.go:117] "RemoveContainer" containerID="378c1b678befa560897250fcaeb60cd38293ced5dc93e1d61be601f642cdfadb"
May 27 02:51:07.146356 containerd[2012]: time="2025-05-27T02:51:07.146269025Z" level=info msg="CreateContainer within sandbox \"9fcde17a0530e58e75bde8018707680f11012cdb1e30230cfe2c034aec0f74fb\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
May 27 02:51:07.167805 containerd[2012]: time="2025-05-27T02:51:07.165957365Z" level=info msg="Container bc71fa84c8e99b824ab356bb62e72ac10fa9f66a8c30136cc8eb402f9a054e76: CDI devices from CRI Config.CDIDevices: []"
May 27 02:51:07.186055 containerd[2012]: time="2025-05-27T02:51:07.186001938Z" level=info msg="CreateContainer within sandbox \"9fcde17a0530e58e75bde8018707680f11012cdb1e30230cfe2c034aec0f74fb\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"bc71fa84c8e99b824ab356bb62e72ac10fa9f66a8c30136cc8eb402f9a054e76\""
May 27 02:51:07.187182 containerd[2012]: time="2025-05-27T02:51:07.187119702Z" level=info msg="StartContainer for \"bc71fa84c8e99b824ab356bb62e72ac10fa9f66a8c30136cc8eb402f9a054e76\""
May 27 02:51:07.189226 containerd[2012]: time="2025-05-27T02:51:07.189155694Z" level=info msg="connecting to shim bc71fa84c8e99b824ab356bb62e72ac10fa9f66a8c30136cc8eb402f9a054e76" address="unix:///run/containerd/s/cb3a11e1cb7d7551724d9472b218c3ba88a0ee1a5a28aa0d1bf79e1b4bd84483" protocol=ttrpc version=3
May 27 02:51:07.232091 systemd[1]: Started cri-containerd-bc71fa84c8e99b824ab356bb62e72ac10fa9f66a8c30136cc8eb402f9a054e76.scope - libcontainer container bc71fa84c8e99b824ab356bb62e72ac10fa9f66a8c30136cc8eb402f9a054e76.
May 27 02:51:07.321391 containerd[2012]: time="2025-05-27T02:51:07.321321738Z" level=info msg="StartContainer for \"bc71fa84c8e99b824ab356bb62e72ac10fa9f66a8c30136cc8eb402f9a054e76\" returns successfully"
May 27 02:51:08.617417 kubelet[3281]: E0527 02:51:08.616913 3281 controller.go:195] "Failed to update lease" err="Put \"https://172.31.28.205:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-28-205?timeout=10s\": context deadline exceeded"
May 27 02:51:10.859069 systemd[1]: cri-containerd-184d38ab91666e98e3b76b3aa42ffb592fc29aa6320d2365b662fa7dd589adaf.scope: Deactivated successfully.
May 27 02:51:10.860248 systemd[1]: cri-containerd-184d38ab91666e98e3b76b3aa42ffb592fc29aa6320d2365b662fa7dd589adaf.scope: Consumed 4.620s CPU time, 21M memory peak, 136K read from disk.
May 27 02:51:10.866418 containerd[2012]: time="2025-05-27T02:51:10.866356416Z" level=info msg="TaskExit event in podsandbox handler container_id:\"184d38ab91666e98e3b76b3aa42ffb592fc29aa6320d2365b662fa7dd589adaf\" id:\"184d38ab91666e98e3b76b3aa42ffb592fc29aa6320d2365b662fa7dd589adaf\" pid:3125 exit_status:1 exited_at:{seconds:1748314270 nanos:865842540}"
May 27 02:51:10.867241 containerd[2012]: time="2025-05-27T02:51:10.866604888Z" level=info msg="received exit event container_id:\"184d38ab91666e98e3b76b3aa42ffb592fc29aa6320d2365b662fa7dd589adaf\" id:\"184d38ab91666e98e3b76b3aa42ffb592fc29aa6320d2365b662fa7dd589adaf\" pid:3125 exit_status:1 exited_at:{seconds:1748314270 nanos:865842540}"
May 27 02:51:10.908839 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-184d38ab91666e98e3b76b3aa42ffb592fc29aa6320d2365b662fa7dd589adaf-rootfs.mount: Deactivated successfully.
May 27 02:51:11.174547 kubelet[3281]: I0527 02:51:11.173825 3281 scope.go:117] "RemoveContainer" containerID="184d38ab91666e98e3b76b3aa42ffb592fc29aa6320d2365b662fa7dd589adaf"
May 27 02:51:11.178661 containerd[2012]: time="2025-05-27T02:51:11.178583121Z" level=info msg="CreateContainer within sandbox \"bb2c63e27bb233ad89f478c2d6b6386efd2c9105d87215da3ca9908d48e6394d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
May 27 02:51:11.197799 containerd[2012]: time="2025-05-27T02:51:11.197338233Z" level=info msg="Container ecdd549ccf2127dcecb206e0fcb1782d06bb8575535cc5bd7fbb5d8db05b0fde: CDI devices from CRI Config.CDIDevices: []"
May 27 02:51:11.217104 containerd[2012]: time="2025-05-27T02:51:11.217019926Z" level=info msg="CreateContainer within sandbox \"bb2c63e27bb233ad89f478c2d6b6386efd2c9105d87215da3ca9908d48e6394d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"ecdd549ccf2127dcecb206e0fcb1782d06bb8575535cc5bd7fbb5d8db05b0fde\""
May 27 02:51:11.218383 containerd[2012]: time="2025-05-27T02:51:11.218137258Z" level=info msg="StartContainer for \"ecdd549ccf2127dcecb206e0fcb1782d06bb8575535cc5bd7fbb5d8db05b0fde\""
May 27 02:51:11.220112 containerd[2012]: time="2025-05-27T02:51:11.220052302Z" level=info msg="connecting to shim ecdd549ccf2127dcecb206e0fcb1782d06bb8575535cc5bd7fbb5d8db05b0fde" address="unix:///run/containerd/s/65cd7e68e4433789da97c612f867eea37d17e3b9ccf917229870c2175b45eeaa" protocol=ttrpc version=3
May 27 02:51:11.260175 systemd[1]: Started cri-containerd-ecdd549ccf2127dcecb206e0fcb1782d06bb8575535cc5bd7fbb5d8db05b0fde.scope - libcontainer container ecdd549ccf2127dcecb206e0fcb1782d06bb8575535cc5bd7fbb5d8db05b0fde.
May 27 02:51:11.268075 kubelet[3281]: E0527 02:51:11.267977 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-6888687c66-w5ggn" podUID="e1a2f19c-cbfe-493b-984d-c722792ee820"
May 27 02:51:11.342358 containerd[2012]: time="2025-05-27T02:51:11.342299434Z" level=info msg="StartContainer for \"ecdd549ccf2127dcecb206e0fcb1782d06bb8575535cc5bd7fbb5d8db05b0fde\" returns successfully"
May 27 02:51:17.777147 systemd[1]: cri-containerd-8a46f3d0f194970b085b7c6d117dd89c4f4fc3c7de968036fcf3a5542fed34d2.scope: Deactivated successfully.
May 27 02:51:17.780494 containerd[2012]: time="2025-05-27T02:51:17.780361926Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8a46f3d0f194970b085b7c6d117dd89c4f4fc3c7de968036fcf3a5542fed34d2\" id:\"8a46f3d0f194970b085b7c6d117dd89c4f4fc3c7de968036fcf3a5542fed34d2\" pid:6267 exit_status:1 exited_at:{seconds:1748314277 nanos:778846794}"
May 27 02:51:17.781349 containerd[2012]: time="2025-05-27T02:51:17.780499542Z" level=info msg="received exit event container_id:\"8a46f3d0f194970b085b7c6d117dd89c4f4fc3c7de968036fcf3a5542fed34d2\" id:\"8a46f3d0f194970b085b7c6d117dd89c4f4fc3c7de968036fcf3a5542fed34d2\" pid:6267 exit_status:1 exited_at:{seconds:1748314277 nanos:778846794}"
May 27 02:51:17.819799 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8a46f3d0f194970b085b7c6d117dd89c4f4fc3c7de968036fcf3a5542fed34d2-rootfs.mount: Deactivated successfully.
May 27 02:51:18.205396 kubelet[3281]: I0527 02:51:18.205265 3281 scope.go:117] "RemoveContainer" containerID="148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac"
May 27 02:51:18.207006 kubelet[3281]: I0527 02:51:18.206932 3281 scope.go:117] "RemoveContainer" containerID="8a46f3d0f194970b085b7c6d117dd89c4f4fc3c7de968036fcf3a5542fed34d2"
May 27 02:51:18.208404 kubelet[3281]: E0527 02:51:18.208205 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-844669ff44-spfvs_tigera-operator(f85d0e03-f590-48f1-ba05-542041e1ebc3)\"" pod="tigera-operator/tigera-operator-844669ff44-spfvs" podUID="f85d0e03-f590-48f1-ba05-542041e1ebc3"
May 27 02:51:18.209661 containerd[2012]: time="2025-05-27T02:51:18.209589628Z" level=info msg="RemoveContainer for \"148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac\""
May 27 02:51:18.219807 containerd[2012]: time="2025-05-27T02:51:18.219328780Z" level=info msg="RemoveContainer for \"148accd9035c4d6c129dc3c4f783dadbddeff8c8fbc5cd51a251418c963ba5ac\" returns successfully"
May 27 02:51:18.255857 kubelet[3281]: E0527 02:51:18.255756 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hkfqv" podUID="01f59b83-1b36-4034-ac04-b88b89b8506a"
May 27 02:51:18.618290 kubelet[3281]: E0527 02:51:18.618206 3281 controller.go:195] "Failed to update lease" err="Put \"https://172.31.28.205:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-28-205?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
May 27 02:51:19.726893 containerd[2012]: time="2025-05-27T02:51:19.726822284Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f8837e98ab5de6363ffc6e5280f6c56c8d6cb6734edc7b51ac534e6206b8fc0e\" id:\"3aca428cc869121015d875e53006d606f299465846def525c377f42b544be86d\" pid:6408 exited_at:{seconds:1748314279 nanos:726372452}"
May 27 02:51:24.105207 containerd[2012]: time="2025-05-27T02:51:24.105079702Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7151b369ace0ba5c04db2308ce2ddfe90bd1f246ddd478eaa064be58a6c28437\" id:\"919f19be21d164293e150a859f02a2e8b7b040f82ce542054e950f93007955ab\" pid:6434 exit_status:1 exited_at:{seconds:1748314284 nanos:104662534}"
May 27 02:51:26.257353 kubelet[3281]: E0527 02:51:26.257237 3281 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-6888687c66-w5ggn" podUID="e1a2f19c-cbfe-493b-984d-c722792ee820"