Mar 25 01:15:53.156065 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd083] Mar 25 01:15:53.156110 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Mon Mar 24 23:39:14 -00 2025 Mar 25 01:15:53.156135 kernel: KASLR disabled due to lack of seed Mar 25 01:15:53.156151 kernel: efi: EFI v2.7 by EDK II Mar 25 01:15:53.156165 kernel: efi: SMBIOS=0x7bed0000 SMBIOS 3.0=0x7beb0000 ACPI=0x786e0000 ACPI 2.0=0x786e0014 MEMATTR=0x7a736a98 MEMRESERVE=0x78551598 Mar 25 01:15:53.156180 kernel: secureboot: Secure boot disabled Mar 25 01:15:53.156197 kernel: ACPI: Early table checksum verification disabled Mar 25 01:15:53.156211 kernel: ACPI: RSDP 0x00000000786E0014 000024 (v02 AMAZON) Mar 25 01:15:53.156250 kernel: ACPI: XSDT 0x00000000786D00E8 000064 (v01 AMAZON AMZNFACP 00000001 01000013) Mar 25 01:15:53.156270 kernel: ACPI: FACP 0x00000000786B0000 000114 (v06 AMAZON AMZNFACP 00000001 AMZN 00000001) Mar 25 01:15:53.156292 kernel: ACPI: DSDT 0x0000000078640000 00159D (v02 AMAZON AMZNDSDT 00000001 INTL 20160527) Mar 25 01:15:53.156308 kernel: ACPI: APIC 0x00000000786C0000 000108 (v04 AMAZON AMZNAPIC 00000001 AMZN 00000001) Mar 25 01:15:53.156327 kernel: ACPI: SPCR 0x00000000786A0000 000050 (v02 AMAZON AMZNSPCR 00000001 AMZN 00000001) Mar 25 01:15:53.156343 kernel: ACPI: GTDT 0x0000000078690000 000060 (v02 AMAZON AMZNGTDT 00000001 AMZN 00000001) Mar 25 01:15:53.156361 kernel: ACPI: MCFG 0x0000000078680000 00003C (v02 AMAZON AMZNMCFG 00000001 AMZN 00000001) Mar 25 01:15:53.156380 kernel: ACPI: SLIT 0x0000000078670000 00002D (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001) Mar 25 01:15:53.156397 kernel: ACPI: IORT 0x0000000078660000 000078 (v01 AMAZON AMZNIORT 00000001 AMZN 00000001) Mar 25 01:15:53.156413 kernel: ACPI: PPTT 0x0000000078650000 0000EC (v01 AMAZON AMZNPPTT 00000001 AMZN 00000001) Mar 25 01:15:53.156428 kernel: ACPI: SPCR: console: uart,mmio,0x90a0000,115200 Mar 25 01:15:53.156444 kernel: earlycon: uart0 at MMIO 0x00000000090a0000 (options '115200') Mar 25 01:15:53.156460 kernel: printk: bootconsole [uart0] enabled Mar 25 01:15:53.156475 kernel: NUMA: Failed to initialise from firmware Mar 25 01:15:53.156492 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000004b5ffffff] Mar 25 01:15:53.156508 kernel: NUMA: NODE_DATA [mem 0x4b583f800-0x4b5844fff] Mar 25 01:15:53.156523 kernel: Zone ranges: Mar 25 01:15:53.156539 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Mar 25 01:15:53.156559 kernel: DMA32 empty Mar 25 01:15:53.156575 kernel: Normal [mem 0x0000000100000000-0x00000004b5ffffff] Mar 25 01:15:53.156591 kernel: Movable zone start for each node Mar 25 01:15:53.156606 kernel: Early memory node ranges Mar 25 01:15:53.156622 kernel: node 0: [mem 0x0000000040000000-0x000000007862ffff] Mar 25 01:15:53.156637 kernel: node 0: [mem 0x0000000078630000-0x000000007863ffff] Mar 25 01:15:53.156653 kernel: node 0: [mem 0x0000000078640000-0x00000000786effff] Mar 25 01:15:53.156669 kernel: node 0: [mem 0x00000000786f0000-0x000000007872ffff] Mar 25 01:15:53.156684 kernel: node 0: [mem 0x0000000078730000-0x000000007bbfffff] Mar 25 01:15:53.156700 kernel: node 0: [mem 0x000000007bc00000-0x000000007bfdffff] Mar 25 01:15:53.156716 kernel: node 0: [mem 0x000000007bfe0000-0x000000007fffffff] Mar 25 01:15:53.156731 kernel: node 0: [mem 0x0000000400000000-0x00000004b5ffffff] Mar 25 01:15:53.156751 kernel: Initmem setup node 0 [mem 
0x0000000040000000-0x00000004b5ffffff] Mar 25 01:15:53.156768 kernel: On node 0, zone Normal: 8192 pages in unavailable ranges Mar 25 01:15:53.156790 kernel: psci: probing for conduit method from ACPI. Mar 25 01:15:53.156807 kernel: psci: PSCIv1.0 detected in firmware. Mar 25 01:15:53.156823 kernel: psci: Using standard PSCI v0.2 function IDs Mar 25 01:15:53.156844 kernel: psci: Trusted OS migration not required Mar 25 01:15:53.156861 kernel: psci: SMC Calling Convention v1.1 Mar 25 01:15:53.156877 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976 Mar 25 01:15:53.156893 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096 Mar 25 01:15:53.156910 kernel: pcpu-alloc: [0] 0 [0] 1 Mar 25 01:15:53.156927 kernel: Detected PIPT I-cache on CPU0 Mar 25 01:15:53.156943 kernel: CPU features: detected: GIC system register CPU interface Mar 25 01:15:53.156960 kernel: CPU features: detected: Spectre-v2 Mar 25 01:15:53.156976 kernel: CPU features: detected: Spectre-v3a Mar 25 01:15:53.156992 kernel: CPU features: detected: Spectre-BHB Mar 25 01:15:53.157009 kernel: CPU features: detected: ARM erratum 1742098 Mar 25 01:15:53.157025 kernel: CPU features: detected: ARM errata 1165522, 1319367, or 1530923 Mar 25 01:15:53.157046 kernel: alternatives: applying boot alternatives Mar 25 01:15:53.157065 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=b84e5f613acd6cd0a8a878f32f5653a14f2e6fb2820997fecd5b2bd33a4ba3ab Mar 25 01:15:53.157082 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Mar 25 01:15:53.157099 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Mar 25 01:15:53.157116 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 25 01:15:53.157132 kernel: Fallback order for Node 0: 0 Mar 25 01:15:53.157149 kernel: Built 1 zonelists, mobility grouping on. Total pages: 991872 Mar 25 01:15:53.157165 kernel: Policy zone: Normal Mar 25 01:15:53.157182 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 25 01:15:53.157200 kernel: software IO TLB: area num 2. Mar 25 01:15:53.163029 kernel: software IO TLB: mapped [mem 0x000000007c000000-0x0000000080000000] (64MB) Mar 25 01:15:53.163070 kernel: Memory: 3821112K/4030464K available (10304K kernel code, 2186K rwdata, 8096K rodata, 38464K init, 897K bss, 209352K reserved, 0K cma-reserved) Mar 25 01:15:53.163089 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Mar 25 01:15:53.163106 kernel: rcu: Preemptible hierarchical RCU implementation. Mar 25 01:15:53.163124 kernel: rcu: RCU event tracing is enabled. Mar 25 01:15:53.163141 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Mar 25 01:15:53.163158 kernel: Trampoline variant of Tasks RCU enabled. Mar 25 01:15:53.163175 kernel: Tracing variant of Tasks RCU enabled. Mar 25 01:15:53.163192 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Mar 25 01:15:53.163209 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Mar 25 01:15:53.163268 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Mar 25 01:15:53.163381 kernel: GICv3: 96 SPIs implemented Mar 25 01:15:53.163400 kernel: GICv3: 0 Extended SPIs implemented Mar 25 01:15:53.163417 kernel: Root IRQ handler: gic_handle_irq Mar 25 01:15:53.163433 kernel: GICv3: GICv3 features: 16 PPIs Mar 25 01:15:53.163450 kernel: GICv3: CPU0: found redistributor 0 region 0:0x0000000010200000 Mar 25 01:15:53.163467 kernel: ITS [mem 0x10080000-0x1009ffff] Mar 25 01:15:53.163484 kernel: ITS@0x0000000010080000: allocated 8192 Devices @4000b0000 (indirect, esz 8, psz 64K, shr 1) Mar 25 01:15:53.163501 kernel: ITS@0x0000000010080000: allocated 8192 Interrupt Collections @4000c0000 (flat, esz 8, psz 64K, shr 1) Mar 25 01:15:53.163518 kernel: GICv3: using LPI property table @0x00000004000d0000 Mar 25 01:15:53.163535 kernel: ITS: Using hypervisor restricted LPI range [128] Mar 25 01:15:53.163551 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000004000e0000 Mar 25 01:15:53.163568 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Mar 25 01:15:53.163590 kernel: arch_timer: cp15 timer(s) running at 83.33MHz (virt). Mar 25 01:15:53.163607 kernel: clocksource: arch_sys_counter: mask: 0x1ffffffffffffff max_cycles: 0x13381ebeec, max_idle_ns: 440795203145 ns Mar 25 01:15:53.163623 kernel: sched_clock: 57 bits at 83MHz, resolution 12ns, wraps every 4398046511100ns Mar 25 01:15:53.163640 kernel: Console: colour dummy device 80x25 Mar 25 01:15:53.163657 kernel: printk: console [tty1] enabled Mar 25 01:15:53.163674 kernel: ACPI: Core revision 20230628 Mar 25 01:15:53.163691 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 166.66 BogoMIPS (lpj=83333) Mar 25 01:15:53.163708 kernel: pid_max: default: 32768 minimum: 301 Mar 25 01:15:53.163725 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Mar 25 01:15:53.163742 kernel: landlock: Up and running. Mar 25 01:15:53.163763 kernel: SELinux: Initializing. Mar 25 01:15:53.163781 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 25 01:15:53.163798 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 25 01:15:53.163815 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 25 01:15:53.163832 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 25 01:15:53.163849 kernel: rcu: Hierarchical SRCU implementation. Mar 25 01:15:53.163866 kernel: rcu: Max phase no-delay instances is 400. Mar 25 01:15:53.163883 kernel: Platform MSI: ITS@0x10080000 domain created Mar 25 01:15:53.163904 kernel: PCI/MSI: ITS@0x10080000 domain created Mar 25 01:15:53.163921 kernel: Remapping and enabling EFI services. Mar 25 01:15:53.163938 kernel: smp: Bringing up secondary CPUs ... Mar 25 01:15:53.163954 kernel: Detected PIPT I-cache on CPU1 Mar 25 01:15:53.163971 kernel: GICv3: CPU1: found redistributor 1 region 0:0x0000000010220000 Mar 25 01:15:53.163988 kernel: GICv3: CPU1: using allocated LPI pending table @0x00000004000f0000 Mar 25 01:15:53.164005 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd083] Mar 25 01:15:53.164022 kernel: smp: Brought up 1 node, 2 CPUs Mar 25 01:15:53.164038 kernel: SMP: Total of 2 processors activated. 
Mar 25 01:15:53.164055 kernel: CPU features: detected: 32-bit EL0 Support Mar 25 01:15:53.164076 kernel: CPU features: detected: 32-bit EL1 Support Mar 25 01:15:53.164093 kernel: CPU features: detected: CRC32 instructions Mar 25 01:15:53.164121 kernel: CPU: All CPU(s) started at EL1 Mar 25 01:15:53.164143 kernel: alternatives: applying system-wide alternatives Mar 25 01:15:53.164160 kernel: devtmpfs: initialized Mar 25 01:15:53.164178 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 25 01:15:53.164195 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Mar 25 01:15:53.164213 kernel: pinctrl core: initialized pinctrl subsystem Mar 25 01:15:53.164252 kernel: SMBIOS 3.0.0 present. Mar 25 01:15:53.164279 kernel: DMI: Amazon EC2 a1.large/, BIOS 1.0 11/1/2018 Mar 25 01:15:53.164297 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 25 01:15:53.164315 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Mar 25 01:15:53.164333 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Mar 25 01:15:53.164351 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Mar 25 01:15:53.164368 kernel: audit: initializing netlink subsys (disabled) Mar 25 01:15:53.164386 kernel: audit: type=2000 audit(0.218:1): state=initialized audit_enabled=0 res=1 Mar 25 01:15:53.164408 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 25 01:15:53.164426 kernel: cpuidle: using governor menu Mar 25 01:15:53.164443 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Mar 25 01:15:53.164461 kernel: ASID allocator initialised with 65536 entries Mar 25 01:15:53.164479 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 25 01:15:53.164496 kernel: Serial: AMBA PL011 UART driver Mar 25 01:15:53.164514 kernel: Modules: 17728 pages in range for non-PLT usage Mar 25 01:15:53.164532 kernel: Modules: 509248 pages in range for PLT usage Mar 25 01:15:53.164549 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Mar 25 01:15:53.164572 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Mar 25 01:15:53.164634 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Mar 25 01:15:53.164654 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Mar 25 01:15:53.164672 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Mar 25 01:15:53.164690 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Mar 25 01:15:53.164709 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Mar 25 01:15:53.164726 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Mar 25 01:15:53.164745 kernel: ACPI: Added _OSI(Module Device) Mar 25 01:15:53.164762 kernel: ACPI: Added _OSI(Processor Device) Mar 25 01:15:53.164787 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 25 01:15:53.164805 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 25 01:15:53.164822 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 25 01:15:53.164840 kernel: ACPI: Interpreter enabled Mar 25 01:15:53.164857 kernel: ACPI: Using GIC for interrupt routing Mar 25 01:15:53.164875 kernel: ACPI: MCFG table detected, 1 entries Mar 25 01:15:53.164892 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-0f]) Mar 25 01:15:53.165193 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Mar 25 01:15:53.168066 kernel: acpi 
PNP0A08:00: _OSC: platform does not support [LTR] Mar 25 01:15:53.171390 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Mar 25 01:15:53.171627 kernel: acpi PNP0A08:00: ECAM area [mem 0x20000000-0x20ffffff] reserved by PNP0C02:00 Mar 25 01:15:53.171829 kernel: acpi PNP0A08:00: ECAM at [mem 0x20000000-0x20ffffff] for [bus 00-0f] Mar 25 01:15:53.171855 kernel: ACPI: Remapped I/O 0x000000001fff0000 to [io 0x0000-0xffff window] Mar 25 01:15:53.171874 kernel: acpiphp: Slot [1] registered Mar 25 01:15:53.171893 kernel: acpiphp: Slot [2] registered Mar 25 01:15:53.171911 kernel: acpiphp: Slot [3] registered Mar 25 01:15:53.171937 kernel: acpiphp: Slot [4] registered Mar 25 01:15:53.171955 kernel: acpiphp: Slot [5] registered Mar 25 01:15:53.171973 kernel: acpiphp: Slot [6] registered Mar 25 01:15:53.171991 kernel: acpiphp: Slot [7] registered Mar 25 01:15:53.172009 kernel: acpiphp: Slot [8] registered Mar 25 01:15:53.172026 kernel: acpiphp: Slot [9] registered Mar 25 01:15:53.172043 kernel: acpiphp: Slot [10] registered Mar 25 01:15:53.172061 kernel: acpiphp: Slot [11] registered Mar 25 01:15:53.172078 kernel: acpiphp: Slot [12] registered Mar 25 01:15:53.172096 kernel: acpiphp: Slot [13] registered Mar 25 01:15:53.172118 kernel: acpiphp: Slot [14] registered Mar 25 01:15:53.172136 kernel: acpiphp: Slot [15] registered Mar 25 01:15:53.172153 kernel: acpiphp: Slot [16] registered Mar 25 01:15:53.172170 kernel: acpiphp: Slot [17] registered Mar 25 01:15:53.172188 kernel: acpiphp: Slot [18] registered Mar 25 01:15:53.172205 kernel: acpiphp: Slot [19] registered Mar 25 01:15:53.172250 kernel: acpiphp: Slot [20] registered Mar 25 01:15:53.172274 kernel: acpiphp: Slot [21] registered Mar 25 01:15:53.172293 kernel: acpiphp: Slot [22] registered Mar 25 01:15:53.172317 kernel: acpiphp: Slot [23] registered Mar 25 01:15:53.172335 kernel: acpiphp: Slot [24] registered Mar 25 01:15:53.172353 kernel: acpiphp: Slot [25] registered Mar 25 01:15:53.172370 kernel: acpiphp: Slot [26] registered Mar 25 01:15:53.172388 kernel: acpiphp: Slot [27] registered Mar 25 01:15:53.172405 kernel: acpiphp: Slot [28] registered Mar 25 01:15:53.172422 kernel: acpiphp: Slot [29] registered Mar 25 01:15:53.172440 kernel: acpiphp: Slot [30] registered Mar 25 01:15:53.172457 kernel: acpiphp: Slot [31] registered Mar 25 01:15:53.172475 kernel: PCI host bridge to bus 0000:00 Mar 25 01:15:53.172707 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xffffffff window] Mar 25 01:15:53.172900 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Mar 25 01:15:53.173094 kernel: pci_bus 0000:00: root bus resource [mem 0x400000000000-0x407fffffffff window] Mar 25 01:15:53.175681 kernel: pci_bus 0000:00: root bus resource [bus 00-0f] Mar 25 01:15:53.175935 kernel: pci 0000:00:00.0: [1d0f:0200] type 00 class 0x060000 Mar 25 01:15:53.176177 kernel: pci 0000:00:01.0: [1d0f:8250] type 00 class 0x070003 Mar 25 01:15:53.176902 kernel: pci 0000:00:01.0: reg 0x10: [mem 0x80118000-0x80118fff] Mar 25 01:15:53.177142 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802 Mar 25 01:15:53.177393 kernel: pci 0000:00:04.0: reg 0x10: [mem 0x80114000-0x80117fff] Mar 25 01:15:53.177605 kernel: pci 0000:00:04.0: PME# supported from D0 D1 D2 D3hot D3cold Mar 25 01:15:53.177830 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000 Mar 25 01:15:53.178040 kernel: pci 0000:00:05.0: reg 0x10: [mem 0x80110000-0x80113fff] Mar 25 01:15:53.180434 kernel: pci 0000:00:05.0: reg 0x18: [mem 
0x80000000-0x800fffff pref] Mar 25 01:15:53.180722 kernel: pci 0000:00:05.0: reg 0x20: [mem 0x80100000-0x8010ffff] Mar 25 01:15:53.180962 kernel: pci 0000:00:05.0: PME# supported from D0 D1 D2 D3hot D3cold Mar 25 01:15:53.181190 kernel: pci 0000:00:05.0: BAR 2: assigned [mem 0x80000000-0x800fffff pref] Mar 25 01:15:53.181461 kernel: pci 0000:00:05.0: BAR 4: assigned [mem 0x80100000-0x8010ffff] Mar 25 01:15:53.181680 kernel: pci 0000:00:04.0: BAR 0: assigned [mem 0x80110000-0x80113fff] Mar 25 01:15:53.181892 kernel: pci 0000:00:05.0: BAR 0: assigned [mem 0x80114000-0x80117fff] Mar 25 01:15:53.182105 kernel: pci 0000:00:01.0: BAR 0: assigned [mem 0x80118000-0x80118fff] Mar 25 01:15:53.186317 kernel: pci_bus 0000:00: resource 4 [mem 0x80000000-0xffffffff window] Mar 25 01:15:53.186549 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Mar 25 01:15:53.186752 kernel: pci_bus 0000:00: resource 6 [mem 0x400000000000-0x407fffffffff window] Mar 25 01:15:53.186777 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Mar 25 01:15:53.186796 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Mar 25 01:15:53.186814 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Mar 25 01:15:53.186832 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Mar 25 01:15:53.186850 kernel: iommu: Default domain type: Translated Mar 25 01:15:53.186880 kernel: iommu: DMA domain TLB invalidation policy: strict mode Mar 25 01:15:53.186898 kernel: efivars: Registered efivars operations Mar 25 01:15:53.186916 kernel: vgaarb: loaded Mar 25 01:15:53.186934 kernel: clocksource: Switched to clocksource arch_sys_counter Mar 25 01:15:53.186952 kernel: VFS: Disk quotas dquot_6.6.0 Mar 25 01:15:53.186970 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 25 01:15:53.186988 kernel: pnp: PnP ACPI init Mar 25 01:15:53.187213 kernel: system 00:00: [mem 0x20000000-0x2fffffff] could not be reserved Mar 25 01:15:53.188328 kernel: pnp: PnP ACPI: found 1 devices Mar 25 01:15:53.188349 kernel: NET: Registered PF_INET protocol family Mar 25 01:15:53.188368 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 25 01:15:53.188386 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 25 01:15:53.188405 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 25 01:15:53.188423 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 25 01:15:53.188441 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 25 01:15:53.188460 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 25 01:15:53.188478 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 25 01:15:53.188501 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 25 01:15:53.188519 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 25 01:15:53.188537 kernel: PCI: CLS 0 bytes, default 64 Mar 25 01:15:53.188555 kernel: kvm [1]: HYP mode not available Mar 25 01:15:53.188572 kernel: Initialise system trusted keyrings Mar 25 01:15:53.188590 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 25 01:15:53.188609 kernel: Key type asymmetric registered Mar 25 01:15:53.188627 kernel: Asymmetric key parser 'x509' registered Mar 25 01:15:53.188644 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 25 01:15:53.188666 kernel: io scheduler mq-deadline registered Mar 25 
01:15:53.188685 kernel: io scheduler kyber registered Mar 25 01:15:53.188702 kernel: io scheduler bfq registered Mar 25 01:15:53.188963 kernel: pl061_gpio ARMH0061:00: PL061 GPIO chip registered Mar 25 01:15:53.188992 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Mar 25 01:15:53.189032 kernel: ACPI: button: Power Button [PWRB] Mar 25 01:15:53.189051 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input1 Mar 25 01:15:53.189070 kernel: ACPI: button: Sleep Button [SLPB] Mar 25 01:15:53.189095 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 25 01:15:53.189114 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Mar 25 01:15:53.191097 kernel: serial 0000:00:01.0: enabling device (0010 -> 0012) Mar 25 01:15:53.191140 kernel: printk: console [ttyS0] disabled Mar 25 01:15:53.191160 kernel: 0000:00:01.0: ttyS0 at MMIO 0x80118000 (irq = 14, base_baud = 115200) is a 16550A Mar 25 01:15:53.191178 kernel: printk: console [ttyS0] enabled Mar 25 01:15:53.191197 kernel: printk: bootconsole [uart0] disabled Mar 25 01:15:53.191215 kernel: thunder_xcv, ver 1.0 Mar 25 01:15:53.191253 kernel: thunder_bgx, ver 1.0 Mar 25 01:15:53.191273 kernel: nicpf, ver 1.0 Mar 25 01:15:53.191302 kernel: nicvf, ver 1.0 Mar 25 01:15:53.191551 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 25 01:15:53.191753 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-03-25T01:15:52 UTC (1742865352) Mar 25 01:15:53.191782 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 25 01:15:53.191803 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 3 counters available Mar 25 01:15:53.191821 kernel: watchdog: Delayed init of the lockup detector failed: -19 Mar 25 01:15:53.191839 kernel: watchdog: Hard watchdog permanently disabled Mar 25 01:15:53.191863 kernel: NET: Registered PF_INET6 protocol family Mar 25 01:15:53.191881 kernel: Segment Routing with IPv6 Mar 25 01:15:53.191900 kernel: In-situ OAM (IOAM) with IPv6 Mar 25 01:15:53.191917 kernel: NET: Registered PF_PACKET protocol family Mar 25 01:15:53.191935 kernel: Key type dns_resolver registered Mar 25 01:15:53.191953 kernel: registered taskstats version 1 Mar 25 01:15:53.191971 kernel: Loading compiled-in X.509 certificates Mar 25 01:15:53.191988 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: ed4ababe871f0afac8b4236504477de11a6baf07' Mar 25 01:15:53.192006 kernel: Key type .fscrypt registered Mar 25 01:15:53.192025 kernel: Key type fscrypt-provisioning registered Mar 25 01:15:53.192078 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 25 01:15:53.192101 kernel: ima: Allocated hash algorithm: sha1 Mar 25 01:15:53.192120 kernel: ima: No architecture policies found Mar 25 01:15:53.192139 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 25 01:15:53.192156 kernel: clk: Disabling unused clocks Mar 25 01:15:53.192174 kernel: Freeing unused kernel memory: 38464K Mar 25 01:15:53.192196 kernel: Run /init as init process Mar 25 01:15:53.192214 kernel: with arguments: Mar 25 01:15:53.192259 kernel: /init Mar 25 01:15:53.192286 kernel: with environment: Mar 25 01:15:53.192304 kernel: HOME=/ Mar 25 01:15:53.192322 kernel: TERM=linux Mar 25 01:15:53.192339 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 25 01:15:53.192364 systemd[1]: Successfully made /usr/ read-only. 
Mar 25 01:15:53.192394 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 01:15:53.192415 systemd[1]: Detected virtualization amazon. Mar 25 01:15:53.192438 systemd[1]: Detected architecture arm64. Mar 25 01:15:53.192457 systemd[1]: Running in initrd. Mar 25 01:15:53.192476 systemd[1]: No hostname configured, using default hostname. Mar 25 01:15:53.192496 systemd[1]: Hostname set to <localhost>. Mar 25 01:15:53.192515 systemd[1]: Initializing machine ID from VM UUID. Mar 25 01:15:53.192533 systemd[1]: Queued start job for default target initrd.target. Mar 25 01:15:53.192553 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:15:53.192572 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:15:53.192592 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 25 01:15:53.192616 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 25 01:15:53.192636 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 25 01:15:53.192657 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 25 01:15:53.192679 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 25 01:15:53.192698 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 25 01:15:53.192717 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:15:53.192741 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:15:53.192761 systemd[1]: Reached target paths.target - Path Units. Mar 25 01:15:53.192780 systemd[1]: Reached target slices.target - Slice Units. Mar 25 01:15:53.192799 systemd[1]: Reached target swap.target - Swaps. Mar 25 01:15:53.192818 systemd[1]: Reached target timers.target - Timer Units. Mar 25 01:15:53.192837 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 25 01:15:53.192856 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 01:15:53.192875 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 25 01:15:53.192894 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 25 01:15:53.192918 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:15:53.192937 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 01:15:53.192956 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:15:53.192975 systemd[1]: Reached target sockets.target - Socket Units. Mar 25 01:15:53.192994 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 25 01:15:53.193013 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 01:15:53.193032 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 25 01:15:53.193051 systemd[1]: Starting systemd-fsck-usr.service... 
Mar 25 01:15:53.193077 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 01:15:53.193097 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 01:15:53.193116 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:15:53.193135 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 25 01:15:53.193200 systemd-journald[251]: Collecting audit messages is disabled. Mar 25 01:15:53.193997 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:15:53.194020 systemd[1]: Finished systemd-fsck-usr.service. Mar 25 01:15:53.194040 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 25 01:15:53.194059 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 25 01:15:53.194084 systemd-journald[251]: Journal started Mar 25 01:15:53.194121 systemd-journald[251]: Runtime Journal (/run/log/journal/ec287eb0e5007473b1f8f71045770b48) is 8M, max 75.3M, 67.3M free. Mar 25 01:15:53.198257 systemd[1]: Started systemd-journald.service - Journal Service. Mar 25 01:15:53.161734 systemd-modules-load[253]: Inserted module 'overlay' Mar 25 01:15:53.202331 systemd-modules-load[253]: Inserted module 'br_netfilter' Mar 25 01:15:53.205385 kernel: Bridge firewalling registered Mar 25 01:15:53.211122 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 01:15:53.214492 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:15:53.226651 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 25 01:15:53.237442 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:15:53.244438 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 01:15:53.256549 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 25 01:15:53.262435 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 01:15:53.291861 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:15:53.307372 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:15:53.317491 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 25 01:15:53.324285 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:15:53.345092 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:15:53.351728 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Mar 25 01:15:53.397257 dracut-cmdline[291]: dracut-dracut-053 Mar 25 01:15:53.400610 dracut-cmdline[291]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=b84e5f613acd6cd0a8a878f32f5653a14f2e6fb2820997fecd5b2bd33a4ba3ab Mar 25 01:15:53.447848 systemd-resolved[287]: Positive Trust Anchors: Mar 25 01:15:53.450314 systemd-resolved[287]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 25 01:15:53.466485 systemd-resolved[287]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 25 01:15:53.543264 kernel: SCSI subsystem initialized Mar 25 01:15:53.551257 kernel: Loading iSCSI transport class v2.0-870. Mar 25 01:15:53.563269 kernel: iscsi: registered transport (tcp) Mar 25 01:15:53.585272 kernel: iscsi: registered transport (qla4xxx) Mar 25 01:15:53.585343 kernel: QLogic iSCSI HBA Driver Mar 25 01:15:53.660268 kernel: random: crng init done Mar 25 01:15:53.660658 systemd-resolved[287]: Defaulting to hostname 'linux'. Mar 25 01:15:53.666296 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 25 01:15:53.675651 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:15:53.685283 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 25 01:15:53.689428 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 25 01:15:53.733974 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 25 01:15:53.734056 kernel: device-mapper: uevent: version 1.0.3 Mar 25 01:15:53.734082 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 25 01:15:53.800289 kernel: raid6: neonx8 gen() 6569 MB/s Mar 25 01:15:53.817257 kernel: raid6: neonx4 gen() 6556 MB/s Mar 25 01:15:53.834255 kernel: raid6: neonx2 gen() 5455 MB/s Mar 25 01:15:53.851256 kernel: raid6: neonx1 gen() 3958 MB/s Mar 25 01:15:53.868256 kernel: raid6: int64x8 gen() 3629 MB/s Mar 25 01:15:53.885257 kernel: raid6: int64x4 gen() 3717 MB/s Mar 25 01:15:53.902255 kernel: raid6: int64x2 gen() 3602 MB/s Mar 25 01:15:53.920022 kernel: raid6: int64x1 gen() 2758 MB/s Mar 25 01:15:53.920054 kernel: raid6: using algorithm neonx8 gen() 6569 MB/s Mar 25 01:15:53.937994 kernel: raid6: .... 
xor() 4727 MB/s, rmw enabled Mar 25 01:15:53.938030 kernel: raid6: using neon recovery algorithm Mar 25 01:15:53.945991 kernel: xor: measuring software checksum speed Mar 25 01:15:53.946040 kernel: 8regs : 12941 MB/sec Mar 25 01:15:53.947256 kernel: 32regs : 11640 MB/sec Mar 25 01:15:53.949258 kernel: arm64_neon : 9014 MB/sec Mar 25 01:15:53.949291 kernel: xor: using function: 8regs (12941 MB/sec) Mar 25 01:15:54.031269 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 25 01:15:54.051294 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 25 01:15:54.060340 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:15:54.113018 systemd-udevd[472]: Using default interface naming scheme 'v255'. Mar 25 01:15:54.123189 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:15:54.136736 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 25 01:15:54.180269 dracut-pre-trigger[481]: rd.md=0: removing MD RAID activation Mar 25 01:15:54.236953 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 25 01:15:54.242460 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 01:15:54.364974 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:15:54.378551 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 25 01:15:54.428750 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 25 01:15:54.433653 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 01:15:54.438767 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:15:54.443652 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 01:15:54.454880 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 25 01:15:54.505579 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 25 01:15:54.571559 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Mar 25 01:15:54.571633 kernel: ena 0000:00:05.0: enabling device (0010 -> 0012) Mar 25 01:15:54.589216 kernel: ena 0000:00:05.0: ENA device version: 0.10 Mar 25 01:15:54.589527 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1 Mar 25 01:15:54.589762 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem 80114000, mac addr 06:97:cd:1d:55:ff Mar 25 01:15:54.592987 (udev-worker)[534]: Network interface NamePolicy= disabled on kernel command line. Mar 25 01:15:54.601623 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 01:15:54.601875 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:15:54.614724 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:15:54.626673 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Mar 25 01:15:54.626719 kernel: nvme nvme0: pci function 0000:00:04.0 Mar 25 01:15:54.624779 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 25 01:15:54.625065 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:15:54.630905 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... 
Mar 25 01:15:54.647323 kernel: nvme nvme0: 2/0/0 default/read/poll queues Mar 25 01:15:54.643840 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:15:54.650959 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:15:54.665251 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 25 01:15:54.665298 kernel: GPT:9289727 != 16777215 Mar 25 01:15:54.665323 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 25 01:15:54.666967 kernel: GPT:9289727 != 16777215 Mar 25 01:15:54.667000 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 25 01:15:54.669341 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Mar 25 01:15:54.687392 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:15:54.698481 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:15:54.745499 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:15:54.771503 kernel: BTRFS: device fsid bf348154-9cb1-474d-801c-0e035a5758cf devid 1 transid 39 /dev/nvme0n1p3 scanned by (udev-worker) (539) Mar 25 01:15:54.789264 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 scanned by (udev-worker) (522) Mar 25 01:15:54.862055 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Amazon Elastic Block Store EFI-SYSTEM. Mar 25 01:15:54.898965 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Amazon Elastic Block Store ROOT. Mar 25 01:15:54.954301 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Mar 25 01:15:54.971923 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Amazon Elastic Block Store USR-A. Mar 25 01:15:54.972113 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Amazon Elastic Block Store USR-A. Mar 25 01:15:54.976469 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 25 01:15:55.014468 disk-uuid[665]: Primary Header is updated. Mar 25 01:15:55.014468 disk-uuid[665]: Secondary Entries is updated. Mar 25 01:15:55.014468 disk-uuid[665]: Secondary Header is updated. Mar 25 01:15:55.027285 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Mar 25 01:15:55.036267 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Mar 25 01:15:56.048443 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Mar 25 01:15:56.049980 disk-uuid[666]: The operation has completed successfully. Mar 25 01:15:56.242831 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 25 01:15:56.243059 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 25 01:15:56.322783 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 25 01:15:56.349212 sh[924]: Success Mar 25 01:15:56.373286 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" Mar 25 01:15:56.486858 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 25 01:15:56.498429 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 25 01:15:56.522346 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Mar 25 01:15:56.541360 kernel: BTRFS info (device dm-0): first mount of filesystem bf348154-9cb1-474d-801c-0e035a5758cf Mar 25 01:15:56.541421 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:15:56.543122 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 25 01:15:56.544382 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 25 01:15:56.545428 kernel: BTRFS info (device dm-0): using free space tree Mar 25 01:15:56.580260 kernel: BTRFS info (device dm-0): enabling ssd optimizations Mar 25 01:15:56.585043 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 25 01:15:56.588003 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 25 01:15:56.590451 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 25 01:15:56.596560 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 25 01:15:56.657014 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:15:56.657097 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:15:56.658746 kernel: BTRFS info (device nvme0n1p6): using free space tree Mar 25 01:15:56.665275 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Mar 25 01:15:56.673455 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:15:56.678212 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 25 01:15:56.685828 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 25 01:15:56.792836 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 25 01:15:56.806492 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 25 01:15:56.878829 ignition[1042]: Ignition 2.20.0 Mar 25 01:15:56.878857 ignition[1042]: Stage: fetch-offline Mar 25 01:15:56.879336 ignition[1042]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:15:56.879361 ignition[1042]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 25 01:15:56.887944 ignition[1042]: Ignition finished successfully Mar 25 01:15:56.892362 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 01:15:56.894097 systemd-networkd[1116]: lo: Link UP Mar 25 01:15:56.894104 systemd-networkd[1116]: lo: Gained carrier Mar 25 01:15:56.897746 systemd-networkd[1116]: Enumeration completed Mar 25 01:15:56.898673 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 25 01:15:56.900498 systemd-networkd[1116]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:15:56.900505 systemd-networkd[1116]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 25 01:15:56.906979 systemd[1]: Reached target network.target - Network. Mar 25 01:15:56.913448 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Mar 25 01:15:56.920752 systemd-networkd[1116]: eth0: Link UP Mar 25 01:15:56.920760 systemd-networkd[1116]: eth0: Gained carrier Mar 25 01:15:56.920777 systemd-networkd[1116]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Mar 25 01:15:56.959343 systemd-networkd[1116]: eth0: DHCPv4 address 172.31.24.136/20, gateway 172.31.16.1 acquired from 172.31.16.1 Mar 25 01:15:56.964658 ignition[1125]: Ignition 2.20.0 Mar 25 01:15:56.964680 ignition[1125]: Stage: fetch Mar 25 01:15:56.965289 ignition[1125]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:15:56.965315 ignition[1125]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 25 01:15:56.965493 ignition[1125]: PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 25 01:15:56.987696 ignition[1125]: PUT result: OK Mar 25 01:15:56.990486 ignition[1125]: parsed url from cmdline: "" Mar 25 01:15:56.990509 ignition[1125]: no config URL provided Mar 25 01:15:56.990525 ignition[1125]: reading system config file "/usr/lib/ignition/user.ign" Mar 25 01:15:56.990553 ignition[1125]: no config at "/usr/lib/ignition/user.ign" Mar 25 01:15:56.990587 ignition[1125]: PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 25 01:15:56.999490 ignition[1125]: PUT result: OK Mar 25 01:15:56.999760 ignition[1125]: GET http://169.254.169.254/2019-10-01/user-data: attempt #1 Mar 25 01:15:57.004354 ignition[1125]: GET result: OK Mar 25 01:15:57.005919 ignition[1125]: parsing config with SHA512: 33ff0f1f5e7d46259464d48b8e4852b5df9c941d437c96cdfd297226c6db5bb4923496f9f00bfe6d54c6366a507c5306a63ea3bf66e963759427e7f36876f6f9 Mar 25 01:15:57.013586 unknown[1125]: fetched base config from "system" Mar 25 01:15:57.013610 unknown[1125]: fetched base config from "system" Mar 25 01:15:57.014528 ignition[1125]: fetch: fetch complete Mar 25 01:15:57.013624 unknown[1125]: fetched user config from "aws" Mar 25 01:15:57.014541 ignition[1125]: fetch: fetch passed Mar 25 01:15:57.014629 ignition[1125]: Ignition finished successfully Mar 25 01:15:57.027686 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Mar 25 01:15:57.034658 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 25 01:15:57.079206 ignition[1132]: Ignition 2.20.0 Mar 25 01:15:57.079716 ignition[1132]: Stage: kargs Mar 25 01:15:57.080325 ignition[1132]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:15:57.080360 ignition[1132]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 25 01:15:57.080548 ignition[1132]: PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 25 01:15:57.085694 ignition[1132]: PUT result: OK Mar 25 01:15:57.095623 ignition[1132]: kargs: kargs passed Mar 25 01:15:57.095956 ignition[1132]: Ignition finished successfully Mar 25 01:15:57.101850 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 25 01:15:57.109647 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 25 01:15:57.146299 ignition[1138]: Ignition 2.20.0 Mar 25 01:15:57.146321 ignition[1138]: Stage: disks Mar 25 01:15:57.146866 ignition[1138]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:15:57.146890 ignition[1138]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 25 01:15:57.147038 ignition[1138]: PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 25 01:15:57.149522 ignition[1138]: PUT result: OK Mar 25 01:15:57.161302 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 25 01:15:57.155416 ignition[1138]: disks: disks passed Mar 25 01:15:57.155508 ignition[1138]: Ignition finished successfully Mar 25 01:15:57.170468 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 25 01:15:57.173132 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. 
Mar 25 01:15:57.175995 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 25 01:15:57.178361 systemd[1]: Reached target sysinit.target - System Initialization. Mar 25 01:15:57.180818 systemd[1]: Reached target basic.target - Basic System. Mar 25 01:15:57.184663 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 25 01:15:57.254355 systemd-fsck[1146]: ROOT: clean, 14/553520 files, 52654/553472 blocks Mar 25 01:15:57.260280 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 25 01:15:57.269464 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 25 01:15:57.367255 kernel: EXT4-fs (nvme0n1p9): mounted filesystem a7a89271-ee7d-4bda-a834-705261d6cda9 r/w with ordered data mode. Quota mode: none. Mar 25 01:15:57.368768 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 25 01:15:57.374798 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 25 01:15:57.386449 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 25 01:15:57.392439 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 25 01:15:57.396922 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 25 01:15:57.397010 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 25 01:15:57.397062 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 01:15:57.427567 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 25 01:15:57.434328 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 25 01:15:57.458270 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/nvme0n1p6 scanned by mount (1165) Mar 25 01:15:57.462212 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:15:57.462382 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:15:57.462410 kernel: BTRFS info (device nvme0n1p6): using free space tree Mar 25 01:15:57.477254 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Mar 25 01:15:57.484504 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 25 01:15:57.540690 initrd-setup-root[1189]: cut: /sysroot/etc/passwd: No such file or directory Mar 25 01:15:57.553551 initrd-setup-root[1196]: cut: /sysroot/etc/group: No such file or directory Mar 25 01:15:57.564197 initrd-setup-root[1203]: cut: /sysroot/etc/shadow: No such file or directory Mar 25 01:15:57.572113 initrd-setup-root[1210]: cut: /sysroot/etc/gshadow: No such file or directory Mar 25 01:15:57.742209 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 25 01:15:57.751393 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 25 01:15:57.754520 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 25 01:15:57.781601 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 25 01:15:57.785350 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:15:57.824289 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Mar 25 01:15:57.833183 ignition[1282]: INFO : Ignition 2.20.0 Mar 25 01:15:57.833183 ignition[1282]: INFO : Stage: mount Mar 25 01:15:57.833183 ignition[1282]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:15:57.833183 ignition[1282]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 25 01:15:57.833183 ignition[1282]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 25 01:15:57.833183 ignition[1282]: INFO : PUT result: OK Mar 25 01:15:57.848331 ignition[1282]: INFO : mount: mount passed Mar 25 01:15:57.850865 ignition[1282]: INFO : Ignition finished successfully Mar 25 01:15:57.853415 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 25 01:15:57.858155 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 25 01:15:57.885213 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 25 01:15:57.927282 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/nvme0n1p6 scanned by mount (1293) Mar 25 01:15:57.930798 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:15:57.930837 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:15:57.930862 kernel: BTRFS info (device nvme0n1p6): using free space tree Mar 25 01:15:57.937633 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Mar 25 01:15:57.940599 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 25 01:15:57.979254 ignition[1310]: INFO : Ignition 2.20.0 Mar 25 01:15:57.979254 ignition[1310]: INFO : Stage: files Mar 25 01:15:57.983974 ignition[1310]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:15:57.983974 ignition[1310]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 25 01:15:57.983974 ignition[1310]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 25 01:15:57.994365 ignition[1310]: INFO : PUT result: OK Mar 25 01:15:57.997770 ignition[1310]: DEBUG : files: compiled without relabeling support, skipping Mar 25 01:15:58.010318 ignition[1310]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 25 01:15:58.010318 ignition[1310]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 25 01:15:58.016976 ignition[1310]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 25 01:15:58.016976 ignition[1310]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 25 01:15:58.016976 ignition[1310]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 25 01:15:58.014835 unknown[1310]: wrote ssh authorized keys file for user: core Mar 25 01:15:58.028505 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Mar 25 01:15:58.028505 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 Mar 25 01:15:58.235794 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 25 01:15:58.339474 systemd-networkd[1116]: eth0: Gained IPv6LL Mar 25 01:15:58.638057 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Mar 25 01:15:58.645184 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 25 
01:15:58.649756 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 25 01:15:58.649756 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 25 01:15:58.649756 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 25 01:15:58.649756 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 25 01:15:58.665735 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 25 01:15:58.665735 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 25 01:15:58.665735 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 25 01:15:58.665735 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 25 01:15:58.665735 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 25 01:15:58.665735 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" Mar 25 01:15:58.665735 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" Mar 25 01:15:58.665735 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" Mar 25 01:15:58.665735 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-arm64.raw: attempt #1 Mar 25 01:15:59.043337 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 25 01:15:59.413354 ignition[1310]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" Mar 25 01:15:59.420644 ignition[1310]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 25 01:15:59.420644 ignition[1310]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 25 01:15:59.420644 ignition[1310]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 25 01:15:59.420644 ignition[1310]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 25 01:15:59.420644 ignition[1310]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 25 01:15:59.420644 ignition[1310]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Mar 25 01:15:59.420644 ignition[1310]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 25 01:15:59.420644 ignition[1310]: INFO : files: createResultFile: createFiles: 
op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 25 01:15:59.420644 ignition[1310]: INFO : files: files passed Mar 25 01:15:59.420644 ignition[1310]: INFO : Ignition finished successfully Mar 25 01:15:59.454682 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 25 01:15:59.459556 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 25 01:15:59.467495 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 25 01:15:59.487866 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 25 01:15:59.489645 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 25 01:15:59.505461 initrd-setup-root-after-ignition[1340]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:15:59.505461 initrd-setup-root-after-ignition[1340]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:15:59.513788 initrd-setup-root-after-ignition[1344]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:15:59.519982 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 25 01:15:59.526611 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 25 01:15:59.530404 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 25 01:15:59.614016 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 25 01:15:59.614332 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 25 01:15:59.620127 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 25 01:15:59.622560 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 25 01:15:59.624699 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 25 01:15:59.628468 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 25 01:15:59.668676 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 01:15:59.677343 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 25 01:15:59.714442 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:15:59.714781 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:15:59.716353 systemd[1]: Stopped target timers.target - Timer Units. Mar 25 01:15:59.716933 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 25 01:15:59.717261 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 01:15:59.718287 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 25 01:15:59.719045 systemd[1]: Stopped target basic.target - Basic System. Mar 25 01:15:59.719749 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 25 01:15:59.720419 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 01:15:59.721104 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 25 01:15:59.721818 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 25 01:15:59.722517 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 01:15:59.723216 systemd[1]: Stopped target sysinit.target - System Initialization. 
Mar 25 01:15:59.723903 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 25 01:15:59.724604 systemd[1]: Stopped target swap.target - Swaps. Mar 25 01:15:59.725193 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 25 01:15:59.725507 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 25 01:15:59.727315 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:15:59.728084 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:15:59.728701 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 25 01:15:59.758664 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:15:59.758893 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 25 01:15:59.759111 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 25 01:15:59.760267 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 25 01:15:59.760480 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 25 01:15:59.760955 systemd[1]: ignition-files.service: Deactivated successfully. Mar 25 01:15:59.761142 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 25 01:15:59.827289 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 25 01:15:59.842838 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 25 01:15:59.845040 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 25 01:15:59.850382 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:15:59.856414 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 25 01:15:59.857302 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 25 01:15:59.880824 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 25 01:15:59.883182 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 25 01:15:59.895901 ignition[1364]: INFO : Ignition 2.20.0 Mar 25 01:15:59.898134 ignition[1364]: INFO : Stage: umount Mar 25 01:15:59.900132 ignition[1364]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:15:59.900132 ignition[1364]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 25 01:15:59.900132 ignition[1364]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 25 01:15:59.912180 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 25 01:15:59.928146 ignition[1364]: INFO : PUT result: OK Mar 25 01:15:59.933595 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 25 01:15:59.937351 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 25 01:15:59.941493 ignition[1364]: INFO : umount: umount passed Mar 25 01:15:59.941493 ignition[1364]: INFO : Ignition finished successfully Mar 25 01:15:59.946941 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 25 01:15:59.947337 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 25 01:15:59.953123 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 25 01:15:59.953325 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 25 01:15:59.961123 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 25 01:15:59.961252 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 25 01:15:59.963662 systemd[1]: ignition-fetch.service: Deactivated successfully. 
Mar 25 01:15:59.963744 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Mar 25 01:15:59.966136 systemd[1]: Stopped target network.target - Network. Mar 25 01:15:59.968194 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 25 01:15:59.968318 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 01:15:59.971114 systemd[1]: Stopped target paths.target - Path Units. Mar 25 01:15:59.973182 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 25 01:15:59.992866 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:15:59.995752 systemd[1]: Stopped target slices.target - Slice Units. Mar 25 01:15:59.997847 systemd[1]: Stopped target sockets.target - Socket Units. Mar 25 01:16:00.000109 systemd[1]: iscsid.socket: Deactivated successfully. Mar 25 01:16:00.000188 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 25 01:16:00.002519 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 25 01:16:00.002585 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 01:16:00.004957 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 25 01:16:00.005045 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 25 01:16:00.007896 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 25 01:16:00.007977 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 25 01:16:00.034371 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 25 01:16:00.034475 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 25 01:16:00.037187 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 25 01:16:00.039592 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 25 01:16:00.058034 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 25 01:16:00.058506 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 25 01:16:00.082961 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 25 01:16:00.084088 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 25 01:16:00.084399 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 25 01:16:00.091743 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 25 01:16:00.097370 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 25 01:16:00.097498 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:16:00.104366 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 25 01:16:00.115454 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 25 01:16:00.115580 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 25 01:16:00.118251 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 25 01:16:00.118337 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:16:00.123274 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 25 01:16:00.123365 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 25 01:16:00.138168 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 25 01:16:00.138289 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. 
Mar 25 01:16:00.146576 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:16:00.157897 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 25 01:16:00.158025 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:16:00.170065 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 25 01:16:00.170560 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:16:00.179030 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 25 01:16:00.179698 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 25 01:16:00.186770 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 25 01:16:00.186854 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:16:00.189478 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 25 01:16:00.189568 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 25 01:16:00.192187 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 25 01:16:00.192293 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 25 01:16:00.194894 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 01:16:00.194980 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:16:00.219586 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 25 01:16:00.222377 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 25 01:16:00.222516 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:16:00.228803 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Mar 25 01:16:00.228911 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 25 01:16:00.235214 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 25 01:16:00.235332 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:16:00.238168 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 25 01:16:00.238266 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:16:00.261832 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Mar 25 01:16:00.262608 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:16:00.263437 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 25 01:16:00.263676 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 25 01:16:00.278058 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 25 01:16:00.279254 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 25 01:16:00.286508 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 25 01:16:00.290726 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 25 01:16:00.313677 systemd[1]: Switching root. Mar 25 01:16:00.349972 systemd-journald[251]: Journal stopped Mar 25 01:16:02.341108 systemd-journald[251]: Received SIGTERM from PID 1 (systemd). 
Mar 25 01:16:02.341257 kernel: SELinux: policy capability network_peer_controls=1 Mar 25 01:16:02.341368 kernel: SELinux: policy capability open_perms=1 Mar 25 01:16:02.341402 kernel: SELinux: policy capability extended_socket_class=1 Mar 25 01:16:02.341442 kernel: SELinux: policy capability always_check_network=0 Mar 25 01:16:02.341479 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 25 01:16:02.341509 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 25 01:16:02.341541 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 25 01:16:02.341569 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 25 01:16:02.341597 kernel: audit: type=1403 audit(1742865360.634:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 25 01:16:02.341637 systemd[1]: Successfully loaded SELinux policy in 49.833ms. Mar 25 01:16:02.341690 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 24.109ms. Mar 25 01:16:02.341723 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 01:16:02.341753 systemd[1]: Detected virtualization amazon. Mar 25 01:16:02.341787 systemd[1]: Detected architecture arm64. Mar 25 01:16:02.341820 systemd[1]: Detected first boot. Mar 25 01:16:02.341850 systemd[1]: Initializing machine ID from VM UUID. Mar 25 01:16:02.341879 zram_generator::config[1410]: No configuration found. Mar 25 01:16:02.341914 kernel: NET: Registered PF_VSOCK protocol family Mar 25 01:16:02.341948 systemd[1]: Populated /etc with preset unit settings. Mar 25 01:16:02.341981 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 25 01:16:02.342010 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 25 01:16:02.342043 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 25 01:16:02.342074 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 25 01:16:02.342125 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 25 01:16:02.342162 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 25 01:16:02.342194 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 25 01:16:02.342248 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 25 01:16:02.342288 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 25 01:16:02.342320 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 25 01:16:02.342358 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 25 01:16:02.342387 systemd[1]: Created slice user.slice - User and Session Slice. Mar 25 01:16:02.342418 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:16:02.347351 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:16:02.347427 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 25 01:16:02.347461 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. 
Mar 25 01:16:02.347491 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 25 01:16:02.347526 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 25 01:16:02.347557 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Mar 25 01:16:02.347599 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:16:02.347629 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 25 01:16:02.347658 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 25 01:16:02.347689 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 25 01:16:02.347737 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 25 01:16:02.347769 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:16:02.347799 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 01:16:02.347828 systemd[1]: Reached target slices.target - Slice Units. Mar 25 01:16:02.347861 systemd[1]: Reached target swap.target - Swaps. Mar 25 01:16:02.347892 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 25 01:16:02.347923 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 25 01:16:02.347953 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 25 01:16:02.347984 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:16:02.348015 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 01:16:02.348044 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:16:02.348074 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 25 01:16:02.348103 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 25 01:16:02.348138 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 25 01:16:02.348170 systemd[1]: Mounting media.mount - External Media Directory... Mar 25 01:16:02.348201 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 25 01:16:02.348264 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 25 01:16:02.348300 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 25 01:16:02.348335 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 25 01:16:02.348367 systemd[1]: Reached target machines.target - Containers. Mar 25 01:16:02.348396 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 25 01:16:02.348427 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:16:02.348464 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 01:16:02.348496 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 25 01:16:02.348526 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:16:02.348555 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 25 01:16:02.348585 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... 
Mar 25 01:16:02.348614 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 25 01:16:02.348642 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 25 01:16:02.348673 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 25 01:16:02.348707 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 25 01:16:02.348736 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 25 01:16:02.348768 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 25 01:16:02.348797 systemd[1]: Stopped systemd-fsck-usr.service. Mar 25 01:16:02.348829 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:16:02.348858 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 01:16:02.348886 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 01:16:02.348913 kernel: loop: module loaded Mar 25 01:16:02.348942 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 25 01:16:02.348975 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 25 01:16:02.349004 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Mar 25 01:16:02.349031 kernel: ACPI: bus type drm_connector registered Mar 25 01:16:02.349063 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 01:16:02.349094 systemd[1]: verity-setup.service: Deactivated successfully. Mar 25 01:16:02.349122 systemd[1]: Stopped verity-setup.service. Mar 25 01:16:02.349152 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 25 01:16:02.349184 kernel: fuse: init (API version 7.39) Mar 25 01:16:02.349216 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 25 01:16:02.349307 systemd[1]: Mounted media.mount - External Media Directory. Mar 25 01:16:02.349338 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 25 01:16:02.349369 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 25 01:16:02.349400 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 25 01:16:02.349436 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:16:02.349465 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 25 01:16:02.349495 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 25 01:16:02.349524 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 25 01:16:02.349552 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:16:02.349580 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:16:02.349613 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 25 01:16:02.349643 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 25 01:16:02.349672 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:16:02.349746 systemd-journald[1499]: Collecting audit messages is disabled. Mar 25 01:16:02.349799 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. 
Mar 25 01:16:02.349829 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 25 01:16:02.349858 systemd-journald[1499]: Journal started Mar 25 01:16:02.349908 systemd-journald[1499]: Runtime Journal (/run/log/journal/ec287eb0e5007473b1f8f71045770b48) is 8M, max 75.3M, 67.3M free. Mar 25 01:16:01.737204 systemd[1]: Queued start job for default target multi-user.target. Mar 25 01:16:01.749522 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. Mar 25 01:16:01.750403 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 25 01:16:02.360746 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 25 01:16:02.367461 systemd[1]: Started systemd-journald.service - Journal Service. Mar 25 01:16:02.371285 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:16:02.371773 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:16:02.376762 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 01:16:02.383346 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 25 01:16:02.390288 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 25 01:16:02.395946 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 25 01:16:02.425953 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 25 01:16:02.436374 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 25 01:16:02.446596 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 25 01:16:02.451433 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 25 01:16:02.451507 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 25 01:16:02.459724 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Mar 25 01:16:02.472533 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 25 01:16:02.481727 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 25 01:16:02.486258 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:16:02.490712 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 25 01:16:02.501640 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 25 01:16:02.506474 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 25 01:16:02.512591 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 25 01:16:02.517261 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:16:02.525844 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 01:16:02.538858 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 25 01:16:02.555312 systemd-journald[1499]: Time spent on flushing to /var/log/journal/ec287eb0e5007473b1f8f71045770b48 is 128.263ms for 916 entries. 
Mar 25 01:16:02.555312 systemd-journald[1499]: System Journal (/var/log/journal/ec287eb0e5007473b1f8f71045770b48) is 8M, max 195.6M, 187.6M free. Mar 25 01:16:02.701404 systemd-journald[1499]: Received client request to flush runtime journal. Mar 25 01:16:02.701467 kernel: loop0: detected capacity change from 0 to 54976 Mar 25 01:16:02.571809 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 25 01:16:02.583736 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 25 01:16:02.589056 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 25 01:16:02.594838 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 25 01:16:02.622937 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 25 01:16:02.628164 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 25 01:16:02.640615 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Mar 25 01:16:02.646252 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:16:02.660673 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 25 01:16:02.702295 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:16:02.721461 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 25 01:16:02.734509 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 25 01:16:02.753176 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 25 01:16:02.764041 udevadm[1554]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Mar 25 01:16:02.782267 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 25 01:16:02.778048 systemd-tmpfiles[1546]: ACLs are not supported, ignoring. Mar 25 01:16:02.778088 systemd-tmpfiles[1546]: ACLs are not supported, ignoring. Mar 25 01:16:02.804044 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 25 01:16:02.814652 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 25 01:16:02.821735 kernel: loop1: detected capacity change from 0 to 194096 Mar 25 01:16:02.899333 kernel: loop2: detected capacity change from 0 to 126448 Mar 25 01:16:02.931591 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 25 01:16:02.946523 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 25 01:16:02.990322 kernel: loop3: detected capacity change from 0 to 103832 Mar 25 01:16:03.017911 systemd-tmpfiles[1569]: ACLs are not supported, ignoring. Mar 25 01:16:03.017950 systemd-tmpfiles[1569]: ACLs are not supported, ignoring. Mar 25 01:16:03.036129 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:16:03.066574 kernel: loop4: detected capacity change from 0 to 54976 Mar 25 01:16:03.092269 kernel: loop5: detected capacity change from 0 to 194096 Mar 25 01:16:03.138283 kernel: loop6: detected capacity change from 0 to 126448 Mar 25 01:16:03.174265 kernel: loop7: detected capacity change from 0 to 103832 Mar 25 01:16:03.204745 (sd-merge)[1574]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'. 
Mar 25 01:16:03.207280 (sd-merge)[1574]: Merged extensions into '/usr'. Mar 25 01:16:03.217127 systemd[1]: Reload requested from client PID 1545 ('systemd-sysext') (unit systemd-sysext.service)... Mar 25 01:16:03.217163 systemd[1]: Reloading... Mar 25 01:16:03.427307 zram_generator::config[1598]: No configuration found. Mar 25 01:16:03.449666 ldconfig[1540]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 25 01:16:03.736620 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:16:03.889313 systemd[1]: Reloading finished in 670 ms. Mar 25 01:16:03.915317 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 25 01:16:03.918968 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 25 01:16:03.922624 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 25 01:16:03.941987 systemd[1]: Starting ensure-sysext.service... Mar 25 01:16:03.947693 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 01:16:03.954549 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:16:03.995516 systemd[1]: Reload requested from client PID 1655 ('systemctl') (unit ensure-sysext.service)... Mar 25 01:16:03.995544 systemd[1]: Reloading... Mar 25 01:16:04.009644 systemd-tmpfiles[1656]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 25 01:16:04.010180 systemd-tmpfiles[1656]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 25 01:16:04.012021 systemd-tmpfiles[1656]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 25 01:16:04.012645 systemd-tmpfiles[1656]: ACLs are not supported, ignoring. Mar 25 01:16:04.012784 systemd-tmpfiles[1656]: ACLs are not supported, ignoring. Mar 25 01:16:04.019902 systemd-tmpfiles[1656]: Detected autofs mount point /boot during canonicalization of boot. Mar 25 01:16:04.019929 systemd-tmpfiles[1656]: Skipping /boot Mar 25 01:16:04.043178 systemd-tmpfiles[1656]: Detected autofs mount point /boot during canonicalization of boot. Mar 25 01:16:04.043208 systemd-tmpfiles[1656]: Skipping /boot Mar 25 01:16:04.088754 systemd-udevd[1657]: Using default interface naming scheme 'v255'. Mar 25 01:16:04.263310 zram_generator::config[1700]: No configuration found. Mar 25 01:16:04.326835 (udev-worker)[1703]: Network interface NamePolicy= disabled on kernel command line. Mar 25 01:16:04.621871 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:16:04.645311 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 39 scanned by (udev-worker) (1705) Mar 25 01:16:04.805854 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Mar 25 01:16:04.806132 systemd[1]: Reloading finished in 809 ms. Mar 25 01:16:04.833426 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:16:04.867381 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Mar 25 01:16:04.903303 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 25 01:16:04.919329 systemd[1]: Finished ensure-sysext.service. Mar 25 01:16:04.971070 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Mar 25 01:16:04.976776 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 25 01:16:04.984505 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 25 01:16:04.987560 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:16:04.989966 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Mar 25 01:16:04.996621 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:16:05.004699 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 25 01:16:05.013465 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:16:05.023735 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 25 01:16:05.027701 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:16:05.030726 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 25 01:16:05.046427 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:16:05.052135 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 25 01:16:05.057262 lvm[1856]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 25 01:16:05.060674 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 25 01:16:05.071653 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 25 01:16:05.074487 systemd[1]: Reached target time-set.target - System Time Set. Mar 25 01:16:05.084607 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 25 01:16:05.117771 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:16:05.127705 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 25 01:16:05.142553 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:16:05.146628 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:16:05.168908 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 25 01:16:05.188677 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:16:05.189314 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:16:05.201381 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:16:05.216165 augenrules[1892]: No rules Mar 25 01:16:05.201880 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:16:05.219562 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:16:05.220610 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
Mar 25 01:16:05.227428 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 25 01:16:05.227610 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:16:05.235329 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 25 01:16:05.235801 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 25 01:16:05.246615 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 25 01:16:05.257508 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 25 01:16:05.261578 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:16:05.267413 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Mar 25 01:16:05.288807 lvm[1903]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 25 01:16:05.314061 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 25 01:16:05.325494 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 25 01:16:05.330991 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Mar 25 01:16:05.367892 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 25 01:16:05.372713 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 25 01:16:05.394343 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 25 01:16:05.410613 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 25 01:16:05.414168 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:16:05.533056 systemd-networkd[1869]: lo: Link UP Mar 25 01:16:05.533076 systemd-networkd[1869]: lo: Gained carrier Mar 25 01:16:05.536091 systemd-networkd[1869]: Enumeration completed Mar 25 01:16:05.536329 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 25 01:16:05.541393 systemd-networkd[1869]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:16:05.541416 systemd-networkd[1869]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 25 01:16:05.544501 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Mar 25 01:16:05.549868 systemd-networkd[1869]: eth0: Link UP Mar 25 01:16:05.550171 systemd-networkd[1869]: eth0: Gained carrier Mar 25 01:16:05.550205 systemd-networkd[1869]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:16:05.554630 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 25 01:16:05.560185 systemd-resolved[1870]: Positive Trust Anchors: Mar 25 01:16:05.560682 systemd-resolved[1870]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 25 01:16:05.560751 systemd-resolved[1870]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 25 01:16:05.566381 systemd-networkd[1869]: eth0: DHCPv4 address 172.31.24.136/20, gateway 172.31.16.1 acquired from 172.31.16.1 Mar 25 01:16:05.572045 systemd-resolved[1870]: Defaulting to hostname 'linux'. Mar 25 01:16:05.577085 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 25 01:16:05.582028 systemd[1]: Reached target network.target - Network. Mar 25 01:16:05.585988 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:16:05.590820 systemd[1]: Reached target sysinit.target - System Initialization. Mar 25 01:16:05.595289 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 25 01:16:05.600470 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 25 01:16:05.604660 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 25 01:16:05.607779 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 25 01:16:05.610672 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 25 01:16:05.613555 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 25 01:16:05.613611 systemd[1]: Reached target paths.target - Path Units. Mar 25 01:16:05.615628 systemd[1]: Reached target timers.target - Timer Units. Mar 25 01:16:05.618680 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 25 01:16:05.623837 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 25 01:16:05.631543 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 25 01:16:05.635674 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 25 01:16:05.639369 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 25 01:16:05.651522 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 25 01:16:05.655444 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 25 01:16:05.659907 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 25 01:16:05.663652 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 25 01:16:05.667274 systemd[1]: Reached target sockets.target - Socket Units. Mar 25 01:16:05.670356 systemd[1]: Reached target basic.target - Basic System. Mar 25 01:16:05.673033 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 25 01:16:05.673218 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. 
Mar 25 01:16:05.675400 systemd[1]: Starting containerd.service - containerd container runtime... Mar 25 01:16:05.684536 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 25 01:16:05.690750 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 25 01:16:05.700614 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 25 01:16:05.706614 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 25 01:16:05.709126 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 25 01:16:05.714678 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 25 01:16:05.726458 systemd[1]: Started ntpd.service - Network Time Service. Mar 25 01:16:05.736062 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 25 01:16:05.743531 systemd[1]: Starting setup-oem.service - Setup OEM... Mar 25 01:16:05.755700 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 25 01:16:05.772858 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 25 01:16:05.788651 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 25 01:16:05.792914 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 25 01:16:05.793559 extend-filesystems[1929]: Found loop4 Mar 25 01:16:05.793559 extend-filesystems[1929]: Found loop5 Mar 25 01:16:05.803424 extend-filesystems[1929]: Found loop6 Mar 25 01:16:05.803424 extend-filesystems[1929]: Found loop7 Mar 25 01:16:05.803424 extend-filesystems[1929]: Found nvme0n1 Mar 25 01:16:05.803424 extend-filesystems[1929]: Found nvme0n1p1 Mar 25 01:16:05.803424 extend-filesystems[1929]: Found nvme0n1p2 Mar 25 01:16:05.803424 extend-filesystems[1929]: Found nvme0n1p3 Mar 25 01:16:05.803424 extend-filesystems[1929]: Found usr Mar 25 01:16:05.803424 extend-filesystems[1929]: Found nvme0n1p4 Mar 25 01:16:05.803424 extend-filesystems[1929]: Found nvme0n1p6 Mar 25 01:16:05.803424 extend-filesystems[1929]: Found nvme0n1p7 Mar 25 01:16:05.803424 extend-filesystems[1929]: Found nvme0n1p9 Mar 25 01:16:05.803424 extend-filesystems[1929]: Checking size of /dev/nvme0n1p9 Mar 25 01:16:05.795793 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 25 01:16:05.891565 jq[1928]: false Mar 25 01:16:05.804857 systemd[1]: Starting update-engine.service - Update Engine... Mar 25 01:16:05.835561 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 25 01:16:05.889434 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 25 01:16:05.889849 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 25 01:16:05.896368 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 25 01:16:05.896797 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Mar 25 01:16:05.938954 extend-filesystems[1929]: Resized partition /dev/nvme0n1p9 Mar 25 01:16:05.942717 extend-filesystems[1964]: resize2fs 1.47.2 (1-Jan-2025) Mar 25 01:16:05.977273 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 1489915 blocks Mar 25 01:16:05.971510 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 25 01:16:05.971179 dbus-daemon[1927]: [system] SELinux support is enabled Mar 25 01:16:05.981853 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 25 01:16:05.981904 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 25 01:16:05.987109 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 25 01:16:06.001765 update_engine[1941]: I20250325 01:16:05.994175 1941 main.cc:92] Flatcar Update Engine starting Mar 25 01:16:05.988353 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 25 01:16:06.015756 jq[1943]: true Mar 25 01:16:06.012412 dbus-daemon[1927]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.0' (uid=244 pid=1869 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Mar 25 01:16:06.024416 dbus-daemon[1927]: [system] Successfully activated service 'org.freedesktop.systemd1' Mar 25 01:16:06.041008 update_engine[1941]: I20250325 01:16:06.039489 1941 update_check_scheduler.cc:74] Next update check in 10m54s Mar 25 01:16:06.040828 systemd[1]: Started update-engine.service - Update Engine. Mar 25 01:16:06.045786 tar[1960]: linux-arm64/helm Mar 25 01:16:06.060494 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Mar 25 01:16:06.071643 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 25 01:16:06.085105 (ntainerd)[1968]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 25 01:16:06.085873 systemd[1]: motdgen.service: Deactivated successfully. Mar 25 01:16:06.086356 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 25 01:16:06.154641 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 1489915 Mar 25 01:16:06.154765 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: ntpd 4.2.8p17@1.4004-o Mon Mar 24 23:09:33 UTC 2025 (1): Starting Mar 25 01:16:06.154765 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Mar 25 01:16:06.154765 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: ---------------------------------------------------- Mar 25 01:16:06.154765 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: ntp-4 is maintained by Network Time Foundation, Mar 25 01:16:06.154765 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Mar 25 01:16:06.154765 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: corporation. 
Support and training for ntp-4 are Mar 25 01:16:06.154765 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: available at https://www.nwtime.org/support Mar 25 01:16:06.154765 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: ---------------------------------------------------- Mar 25 01:16:06.149946 ntpd[1931]: ntpd 4.2.8p17@1.4004-o Mon Mar 24 23:09:33 UTC 2025 (1): Starting Mar 25 01:16:06.197781 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: proto: precision = 0.096 usec (-23) Mar 25 01:16:06.197781 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: basedate set to 2025-03-12 Mar 25 01:16:06.197781 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: gps base set to 2025-03-16 (week 2358) Mar 25 01:16:06.197781 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: Listen and drop on 0 v6wildcard [::]:123 Mar 25 01:16:06.197781 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Mar 25 01:16:06.197781 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: Listen normally on 2 lo 127.0.0.1:123 Mar 25 01:16:06.197781 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: Listen normally on 3 eth0 172.31.24.136:123 Mar 25 01:16:06.197781 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: Listen normally on 4 lo [::1]:123 Mar 25 01:16:06.197781 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: bind(21) AF_INET6 fe80::497:cdff:fe1d:55ff%2#123 flags 0x11 failed: Cannot assign requested address Mar 25 01:16:06.197781 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: unable to create socket on eth0 (5) for fe80::497:cdff:fe1d:55ff%2#123 Mar 25 01:16:06.197781 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: failed to init interface for address fe80::497:cdff:fe1d:55ff%2 Mar 25 01:16:06.149996 ntpd[1931]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Mar 25 01:16:06.191945 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 25 01:16:06.198607 extend-filesystems[1964]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required Mar 25 01:16:06.198607 extend-filesystems[1964]: old_desc_blocks = 1, new_desc_blocks = 1 Mar 25 01:16:06.198607 extend-filesystems[1964]: The filesystem on /dev/nvme0n1p9 is now 1489915 (4k) blocks long. Mar 25 01:16:06.150017 ntpd[1931]: ---------------------------------------------------- Mar 25 01:16:06.196505 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 25 01:16:06.229045 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: Listening on routing socket on fd #21 for interface updates Mar 25 01:16:06.229094 extend-filesystems[1929]: Resized filesystem in /dev/nvme0n1p9 Mar 25 01:16:06.236538 jq[1972]: true Mar 25 01:16:06.150036 ntpd[1931]: ntp-4 is maintained by Network Time Foundation, Mar 25 01:16:06.251430 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 25 01:16:06.251430 ntpd[1931]: 25 Mar 01:16:06 ntpd[1931]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 25 01:16:06.248370 systemd[1]: Finished setup-oem.service - Setup OEM. Mar 25 01:16:06.150054 ntpd[1931]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Mar 25 01:16:06.150072 ntpd[1931]: corporation. 
Support and training for ntp-4 are Mar 25 01:16:06.150117 ntpd[1931]: available at https://www.nwtime.org/support Mar 25 01:16:06.150138 ntpd[1931]: ---------------------------------------------------- Mar 25 01:16:06.170506 ntpd[1931]: proto: precision = 0.096 usec (-23) Mar 25 01:16:06.174528 ntpd[1931]: basedate set to 2025-03-12 Mar 25 01:16:06.174561 ntpd[1931]: gps base set to 2025-03-16 (week 2358) Mar 25 01:16:06.195465 ntpd[1931]: Listen and drop on 0 v6wildcard [::]:123 Mar 25 01:16:06.195564 ntpd[1931]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Mar 25 01:16:06.195879 ntpd[1931]: Listen normally on 2 lo 127.0.0.1:123 Mar 25 01:16:06.195946 ntpd[1931]: Listen normally on 3 eth0 172.31.24.136:123 Mar 25 01:16:06.196018 ntpd[1931]: Listen normally on 4 lo [::1]:123 Mar 25 01:16:06.196102 ntpd[1931]: bind(21) AF_INET6 fe80::497:cdff:fe1d:55ff%2#123 flags 0x11 failed: Cannot assign requested address Mar 25 01:16:06.196142 ntpd[1931]: unable to create socket on eth0 (5) for fe80::497:cdff:fe1d:55ff%2#123 Mar 25 01:16:06.196174 ntpd[1931]: failed to init interface for address fe80::497:cdff:fe1d:55ff%2 Mar 25 01:16:06.215958 ntpd[1931]: Listening on routing socket on fd #21 for interface updates Mar 25 01:16:06.237288 ntpd[1931]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 25 01:16:06.237332 ntpd[1931]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 25 01:16:06.304383 coreos-metadata[1926]: Mar 25 01:16:06.293 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Mar 25 01:16:06.304383 coreos-metadata[1926]: Mar 25 01:16:06.297 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-id: Attempt #1 Mar 25 01:16:06.304383 coreos-metadata[1926]: Mar 25 01:16:06.297 INFO Fetch successful Mar 25 01:16:06.304383 coreos-metadata[1926]: Mar 25 01:16:06.297 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-type: Attempt #1 Mar 25 01:16:06.310135 coreos-metadata[1926]: Mar 25 01:16:06.304 INFO Fetch successful Mar 25 01:16:06.310135 coreos-metadata[1926]: Mar 25 01:16:06.305 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/local-ipv4: Attempt #1 Mar 25 01:16:06.310135 coreos-metadata[1926]: Mar 25 01:16:06.309 INFO Fetch successful Mar 25 01:16:06.310135 coreos-metadata[1926]: Mar 25 01:16:06.309 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-ipv4: Attempt #1 Mar 25 01:16:06.318411 coreos-metadata[1926]: Mar 25 01:16:06.318 INFO Fetch successful Mar 25 01:16:06.318411 coreos-metadata[1926]: Mar 25 01:16:06.318 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/ipv6: Attempt #1 Mar 25 01:16:06.326724 coreos-metadata[1926]: Mar 25 01:16:06.322 INFO Fetch failed with 404: resource not found Mar 25 01:16:06.326724 coreos-metadata[1926]: Mar 25 01:16:06.322 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone: Attempt #1 Mar 25 01:16:06.329459 coreos-metadata[1926]: Mar 25 01:16:06.329 INFO Fetch successful Mar 25 01:16:06.329459 coreos-metadata[1926]: Mar 25 01:16:06.329 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone-id: Attempt #1 Mar 25 01:16:06.332214 coreos-metadata[1926]: Mar 25 01:16:06.332 INFO Fetch successful Mar 25 01:16:06.332214 coreos-metadata[1926]: Mar 25 01:16:06.332 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/hostname: Attempt #1 Mar 25 01:16:06.337698 coreos-metadata[1926]: Mar 25 01:16:06.336 INFO Fetch successful Mar 25 01:16:06.337698 coreos-metadata[1926]: Mar 25 01:16:06.336 INFO Fetching 
http://169.254.169.254/2021-01-03/meta-data/public-hostname: Attempt #1 Mar 25 01:16:06.337698 coreos-metadata[1926]: Mar 25 01:16:06.337 INFO Fetch successful Mar 25 01:16:06.337698 coreos-metadata[1926]: Mar 25 01:16:06.337 INFO Fetching http://169.254.169.254/2021-01-03/dynamic/instance-identity/document: Attempt #1 Mar 25 01:16:06.340385 coreos-metadata[1926]: Mar 25 01:16:06.338 INFO Fetch successful Mar 25 01:16:06.422424 systemd-logind[1939]: Watching system buttons on /dev/input/event0 (Power Button) Mar 25 01:16:06.422477 systemd-logind[1939]: Watching system buttons on /dev/input/event1 (Sleep Button) Mar 25 01:16:06.432003 systemd-logind[1939]: New seat seat0. Mar 25 01:16:06.435924 systemd[1]: Started systemd-logind.service - User Login Management. Mar 25 01:16:06.528538 bash[2030]: Updated "/home/core/.ssh/authorized_keys" Mar 25 01:16:06.554510 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 39 scanned by (udev-worker) (1686) Mar 25 01:16:06.602715 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 25 01:16:06.615858 systemd[1]: Starting sshkeys.service... Mar 25 01:16:06.619913 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 25 01:16:06.626019 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 25 01:16:06.653297 containerd[1968]: time="2025-03-25T01:16:06Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 25 01:16:06.655863 containerd[1968]: time="2025-03-25T01:16:06.655804702Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 Mar 25 01:16:06.685260 containerd[1968]: time="2025-03-25T01:16:06.684080026Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.28µs" Mar 25 01:16:06.685260 containerd[1968]: time="2025-03-25T01:16:06.684133690Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 25 01:16:06.685260 containerd[1968]: time="2025-03-25T01:16:06.684170494Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 25 01:16:06.685260 containerd[1968]: time="2025-03-25T01:16:06.684482806Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 25 01:16:06.685260 containerd[1968]: time="2025-03-25T01:16:06.684518026Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 25 01:16:06.685260 containerd[1968]: time="2025-03-25T01:16:06.684569206Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 25 01:16:06.685260 containerd[1968]: time="2025-03-25T01:16:06.684685870Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 25 01:16:06.685260 containerd[1968]: time="2025-03-25T01:16:06.684711490Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 25 01:16:06.685260 containerd[1968]: time="2025-03-25T01:16:06.685125142Z" level=info msg="skip loading plugin" error="path 
/var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 25 01:16:06.685260 containerd[1968]: time="2025-03-25T01:16:06.685160146Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 25 01:16:06.685260 containerd[1968]: time="2025-03-25T01:16:06.685188454Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 25 01:16:06.686139 containerd[1968]: time="2025-03-25T01:16:06.685210762Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 25 01:16:06.686367 containerd[1968]: time="2025-03-25T01:16:06.686296618Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 25 01:16:06.687123 containerd[1968]: time="2025-03-25T01:16:06.687070846Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 25 01:16:06.687332 containerd[1968]: time="2025-03-25T01:16:06.687298618Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 25 01:16:06.687436 containerd[1968]: time="2025-03-25T01:16:06.687406702Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 25 01:16:06.687570 containerd[1968]: time="2025-03-25T01:16:06.687541426Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 25 01:16:06.688603 containerd[1968]: time="2025-03-25T01:16:06.688020682Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 25 01:16:06.689138 containerd[1968]: time="2025-03-25T01:16:06.688891186Z" level=info msg="metadata content store policy set" policy=shared Mar 25 01:16:06.695066 containerd[1968]: time="2025-03-25T01:16:06.694485898Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 25 01:16:06.699272 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. 
Mar 25 01:16:06.700917 containerd[1968]: time="2025-03-25T01:16:06.700363450Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 25 01:16:06.700917 containerd[1968]: time="2025-03-25T01:16:06.700419862Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 25 01:16:06.700917 containerd[1968]: time="2025-03-25T01:16:06.700449910Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 25 01:16:06.700917 containerd[1968]: time="2025-03-25T01:16:06.700481266Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 25 01:16:06.700917 containerd[1968]: time="2025-03-25T01:16:06.700513150Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 25 01:16:06.700917 containerd[1968]: time="2025-03-25T01:16:06.700554346Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 25 01:16:06.700917 containerd[1968]: time="2025-03-25T01:16:06.700586290Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 25 01:16:06.700917 containerd[1968]: time="2025-03-25T01:16:06.700621222Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 25 01:16:06.700917 containerd[1968]: time="2025-03-25T01:16:06.700649410Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 25 01:16:06.700917 containerd[1968]: time="2025-03-25T01:16:06.700697026Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 25 01:16:06.700917 containerd[1968]: time="2025-03-25T01:16:06.700727794Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 25 01:16:06.701790 containerd[1968]: time="2025-03-25T01:16:06.701751430Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 25 01:16:06.701947 containerd[1968]: time="2025-03-25T01:16:06.701918530Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 25 01:16:06.702072 containerd[1968]: time="2025-03-25T01:16:06.702044854Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 25 01:16:06.702211 containerd[1968]: time="2025-03-25T01:16:06.702182662Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 25 01:16:06.702379 containerd[1968]: time="2025-03-25T01:16:06.702265414Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 25 01:16:06.702676 containerd[1968]: time="2025-03-25T01:16:06.702295066Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 25 01:16:06.702676 containerd[1968]: time="2025-03-25T01:16:06.702536302Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 25 01:16:06.702676 containerd[1968]: time="2025-03-25T01:16:06.702565258Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 25 01:16:06.702676 containerd[1968]: time="2025-03-25T01:16:06.702620662Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 
Mar 25 01:16:06.703445 containerd[1968]: time="2025-03-25T01:16:06.702650014Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 25 01:16:06.703445 containerd[1968]: time="2025-03-25T01:16:06.703102006Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 25 01:16:06.703445 containerd[1968]: time="2025-03-25T01:16:06.703380310Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 25 01:16:06.704080 containerd[1968]: time="2025-03-25T01:16:06.703421470Z" level=info msg="Start snapshots syncer" Mar 25 01:16:06.704080 containerd[1968]: time="2025-03-25T01:16:06.703928530Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 25 01:16:06.708314 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 25 01:16:06.709381 containerd[1968]: time="2025-03-25T01:16:06.708478486Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 25 01:16:06.709381 containerd[1968]: time="2025-03-25T01:16:06.708621478Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 25 01:16:06.709604 containerd[1968]: time="2025-03-25T01:16:06.708859198Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 25 01:16:06.710455 containerd[1968]: time="2025-03-25T01:16:06.709937230Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 25 01:16:06.710455 containerd[1968]: time="2025-03-25T01:16:06.710076862Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 25 
01:16:06.710455 containerd[1968]: time="2025-03-25T01:16:06.710129314Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 25 01:16:06.710455 containerd[1968]: time="2025-03-25T01:16:06.710158114Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 25 01:16:06.710455 containerd[1968]: time="2025-03-25T01:16:06.710189434Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 25 01:16:06.710455 containerd[1968]: time="2025-03-25T01:16:06.710216818Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 25 01:16:06.710455 containerd[1968]: time="2025-03-25T01:16:06.710266474Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 25 01:16:06.710455 containerd[1968]: time="2025-03-25T01:16:06.710323294Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 25 01:16:06.710455 containerd[1968]: time="2025-03-25T01:16:06.710358130Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 25 01:16:06.710455 containerd[1968]: time="2025-03-25T01:16:06.710385010Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 25 01:16:06.711827 containerd[1968]: time="2025-03-25T01:16:06.711274726Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:16:06.711827 containerd[1968]: time="2025-03-25T01:16:06.711426034Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:16:06.711827 containerd[1968]: time="2025-03-25T01:16:06.711452902Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:16:06.711827 containerd[1968]: time="2025-03-25T01:16:06.711479110Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:16:06.711827 containerd[1968]: time="2025-03-25T01:16:06.711507154Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 25 01:16:06.711827 containerd[1968]: time="2025-03-25T01:16:06.711541558Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 25 01:16:06.711827 containerd[1968]: time="2025-03-25T01:16:06.711573742Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 25 01:16:06.711827 containerd[1968]: time="2025-03-25T01:16:06.711632974Z" level=info msg="runtime interface created" Mar 25 01:16:06.711827 containerd[1968]: time="2025-03-25T01:16:06.711648382Z" level=info msg="created NRI interface" Mar 25 01:16:06.711827 containerd[1968]: time="2025-03-25T01:16:06.711676810Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 25 01:16:06.711827 containerd[1968]: time="2025-03-25T01:16:06.711708910Z" level=info msg="Connect containerd service" Mar 25 01:16:06.711827 containerd[1968]: time="2025-03-25T01:16:06.711767830Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 25 01:16:06.725760 containerd[1968]: 
time="2025-03-25T01:16:06.724902622Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 25 01:16:06.982738 locksmithd[1974]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 25 01:16:07.086131 coreos-metadata[2058]: Mar 25 01:16:07.086 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Mar 25 01:16:07.089777 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Mar 25 01:16:07.091740 dbus-daemon[1927]: [system] Successfully activated service 'org.freedesktop.hostname1' Mar 25 01:16:07.093665 dbus-daemon[1927]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.7' (uid=0 pid=1973 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Mar 25 01:16:07.105887 coreos-metadata[2058]: Mar 25 01:16:07.104 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys: Attempt #1 Mar 25 01:16:07.107290 coreos-metadata[2058]: Mar 25 01:16:07.106 INFO Fetch successful Mar 25 01:16:07.107290 coreos-metadata[2058]: Mar 25 01:16:07.106 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 25 01:16:07.107290 coreos-metadata[2058]: Mar 25 01:16:07.107 INFO Fetch successful Mar 25 01:16:07.107998 systemd[1]: Starting polkit.service - Authorization Manager... Mar 25 01:16:07.117712 unknown[2058]: wrote ssh authorized keys file for user: core Mar 25 01:16:07.132253 containerd[1968]: time="2025-03-25T01:16:07.129914168Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 25 01:16:07.132253 containerd[1968]: time="2025-03-25T01:16:07.130073936Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 25 01:16:07.132253 containerd[1968]: time="2025-03-25T01:16:07.130127384Z" level=info msg="Start subscribing containerd event" Mar 25 01:16:07.132253 containerd[1968]: time="2025-03-25T01:16:07.130195604Z" level=info msg="Start recovering state" Mar 25 01:16:07.132253 containerd[1968]: time="2025-03-25T01:16:07.131896484Z" level=info msg="Start event monitor" Mar 25 01:16:07.132253 containerd[1968]: time="2025-03-25T01:16:07.131947004Z" level=info msg="Start cni network conf syncer for default" Mar 25 01:16:07.132253 containerd[1968]: time="2025-03-25T01:16:07.131971832Z" level=info msg="Start streaming server" Mar 25 01:16:07.132253 containerd[1968]: time="2025-03-25T01:16:07.131994296Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 25 01:16:07.132253 containerd[1968]: time="2025-03-25T01:16:07.132020600Z" level=info msg="runtime interface starting up..." Mar 25 01:16:07.132253 containerd[1968]: time="2025-03-25T01:16:07.132037580Z" level=info msg="starting plugins..." Mar 25 01:16:07.132253 containerd[1968]: time="2025-03-25T01:16:07.132066908Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 25 01:16:07.134137 systemd[1]: Started containerd.service - containerd container runtime. 
Mar 25 01:16:07.136521 containerd[1968]: time="2025-03-25T01:16:07.135101252Z" level=info msg="containerd successfully booted in 0.482493s" Mar 25 01:16:07.159878 ntpd[1931]: bind(24) AF_INET6 fe80::497:cdff:fe1d:55ff%2#123 flags 0x11 failed: Cannot assign requested address Mar 25 01:16:07.159943 ntpd[1931]: unable to create socket on eth0 (6) for fe80::497:cdff:fe1d:55ff%2#123 Mar 25 01:16:07.159972 ntpd[1931]: failed to init interface for address fe80::497:cdff:fe1d:55ff%2 Mar 25 01:16:07.209124 update-ssh-keys[2120]: Updated "/home/core/.ssh/authorized_keys" Mar 25 01:16:07.214291 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 25 01:16:07.221105 polkitd[2118]: Started polkitd version 121 Mar 25 01:16:07.223475 systemd[1]: Finished sshkeys.service. Mar 25 01:16:07.241760 polkitd[2118]: Loading rules from directory /etc/polkit-1/rules.d Mar 25 01:16:07.241893 polkitd[2118]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 25 01:16:07.243146 polkitd[2118]: Finished loading, compiling and executing 2 rules Mar 25 01:16:07.245956 dbus-daemon[1927]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Mar 25 01:16:07.246524 polkitd[2118]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Mar 25 01:16:07.286698 systemd[1]: Started polkit.service - Authorization Manager. Mar 25 01:16:07.303523 systemd-hostnamed[1973]: Hostname set to (transient) Mar 25 01:16:07.303713 systemd-resolved[1870]: System hostname changed to 'ip-172-31-24-136'. Mar 25 01:16:07.333707 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 25 01:16:07.491380 systemd-networkd[1869]: eth0: Gained IPv6LL Mar 25 01:16:07.499323 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 25 01:16:07.506011 systemd[1]: Reached target network-online.target - Network is Online. Mar 25 01:16:07.517761 systemd[1]: Started amazon-ssm-agent.service - amazon-ssm-agent. Mar 25 01:16:07.527810 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:16:07.535466 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 25 01:16:07.656321 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 25 01:16:07.680141 amazon-ssm-agent[2143]: Initializing new seelog logger Mar 25 01:16:07.681484 amazon-ssm-agent[2143]: New Seelog Logger Creation Complete Mar 25 01:16:07.683397 amazon-ssm-agent[2143]: 2025/03/25 01:16:07 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Mar 25 01:16:07.683499 amazon-ssm-agent[2143]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Mar 25 01:16:07.688272 amazon-ssm-agent[2143]: 2025/03/25 01:16:07 processing appconfig overrides Mar 25 01:16:07.688272 amazon-ssm-agent[2143]: 2025/03/25 01:16:07 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Mar 25 01:16:07.688272 amazon-ssm-agent[2143]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json.
Mar 25 01:16:07.688272 amazon-ssm-agent[2143]: 2025/03/25 01:16:07 processing appconfig overrides Mar 25 01:16:07.688272 amazon-ssm-agent[2143]: 2025/03/25 01:16:07 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Mar 25 01:16:07.688272 amazon-ssm-agent[2143]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Mar 25 01:16:07.688272 amazon-ssm-agent[2143]: 2025/03/25 01:16:07 processing appconfig overrides Mar 25 01:16:07.689731 amazon-ssm-agent[2143]: 2025-03-25 01:16:07 INFO Proxy environment variables: Mar 25 01:16:07.698377 amazon-ssm-agent[2143]: 2025/03/25 01:16:07 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Mar 25 01:16:07.699516 amazon-ssm-agent[2143]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Mar 25 01:16:07.699818 amazon-ssm-agent[2143]: 2025/03/25 01:16:07 processing appconfig overrides Mar 25 01:16:07.793608 amazon-ssm-agent[2143]: 2025-03-25 01:16:07 INFO https_proxy: Mar 25 01:16:07.894839 amazon-ssm-agent[2143]: 2025-03-25 01:16:07 INFO http_proxy: Mar 25 01:16:07.944534 tar[1960]: linux-arm64/LICENSE Mar 25 01:16:07.944534 tar[1960]: linux-arm64/README.md Mar 25 01:16:07.988108 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 25 01:16:07.994347 amazon-ssm-agent[2143]: 2025-03-25 01:16:07 INFO no_proxy: Mar 25 01:16:08.096428 amazon-ssm-agent[2143]: 2025-03-25 01:16:07 INFO Checking if agent identity type OnPrem can be assumed Mar 25 01:16:08.194548 amazon-ssm-agent[2143]: 2025-03-25 01:16:07 INFO Checking if agent identity type EC2 can be assumed Mar 25 01:16:08.293819 amazon-ssm-agent[2143]: 2025-03-25 01:16:07 INFO Agent will take identity from EC2 Mar 25 01:16:08.395360 amazon-ssm-agent[2143]: 2025-03-25 01:16:07 INFO [amazon-ssm-agent] using named pipe channel for IPC Mar 25 01:16:08.496541 amazon-ssm-agent[2143]: 2025-03-25 01:16:07 INFO [amazon-ssm-agent] using named pipe channel for IPC Mar 25 01:16:08.596266 amazon-ssm-agent[2143]: 2025-03-25 01:16:07 INFO [amazon-ssm-agent] using named pipe channel for IPC Mar 25 01:16:08.605425 sshd_keygen[1948]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 25 01:16:08.648581 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 25 01:16:08.658718 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 25 01:16:08.665657 systemd[1]: Started sshd@0-172.31.24.136:22-147.75.109.163:60284.service - OpenSSH per-connection server daemon (147.75.109.163:60284). Mar 25 01:16:08.697249 amazon-ssm-agent[2143]: 2025-03-25 01:16:07 INFO [amazon-ssm-agent] amazon-ssm-agent - v3.2.0.0 Mar 25 01:16:08.723011 systemd[1]: issuegen.service: Deactivated successfully. Mar 25 01:16:08.726325 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 25 01:16:08.738316 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 25 01:16:08.778642 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 25 01:16:08.792468 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 25 01:16:08.797379 amazon-ssm-agent[2143]: 2025-03-25 01:16:07 INFO [amazon-ssm-agent] OS: linux, Arch: arm64 Mar 25 01:16:08.802047 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 25 01:16:08.808016 systemd[1]: Reached target getty.target - Login Prompts. 
Mar 25 01:16:08.897810 amazon-ssm-agent[2143]: 2025-03-25 01:16:07 INFO [amazon-ssm-agent] Starting Core Agent Mar 25 01:16:08.962298 sshd[2174]: Accepted publickey for core from 147.75.109.163 port 60284 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:16:08.967971 sshd-session[2174]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:08.994626 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 25 01:16:08.999330 amazon-ssm-agent[2143]: 2025-03-25 01:16:07 INFO [amazon-ssm-agent] registrar detected. Attempting registration Mar 25 01:16:09.002554 amazon-ssm-agent[2143]: 2025-03-25 01:16:07 INFO [Registrar] Starting registrar module Mar 25 01:16:09.002554 amazon-ssm-agent[2143]: 2025-03-25 01:16:07 INFO [EC2Identity] no registration info found for ec2 instance, attempting registration Mar 25 01:16:09.002554 amazon-ssm-agent[2143]: 2025-03-25 01:16:08 INFO [EC2Identity] EC2 registration was successful. Mar 25 01:16:09.002554 amazon-ssm-agent[2143]: 2025-03-25 01:16:08 INFO [CredentialRefresher] credentialRefresher has started Mar 25 01:16:09.002554 amazon-ssm-agent[2143]: 2025-03-25 01:16:08 INFO [CredentialRefresher] Starting credentials refresher loop Mar 25 01:16:09.002554 amazon-ssm-agent[2143]: 2025-03-25 01:16:09 INFO EC2RoleProvider Successfully connected with instance profile role credentials Mar 25 01:16:09.002678 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 25 01:16:09.014746 systemd-logind[1939]: New session 1 of user core. Mar 25 01:16:09.044347 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 25 01:16:09.053379 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 25 01:16:09.082468 (systemd)[2185]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 25 01:16:09.088795 systemd-logind[1939]: New session c1 of user core. Mar 25 01:16:09.098770 amazon-ssm-agent[2143]: 2025-03-25 01:16:09 INFO [CredentialRefresher] Next credential rotation will be in 31.3249700646 minutes Mar 25 01:16:09.241563 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:16:09.246987 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 25 01:16:09.260138 (kubelet)[2196]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:16:09.391480 systemd[2185]: Queued start job for default target default.target. Mar 25 01:16:09.400526 systemd[2185]: Created slice app.slice - User Application Slice. Mar 25 01:16:09.400589 systemd[2185]: Reached target paths.target - Paths. Mar 25 01:16:09.400676 systemd[2185]: Reached target timers.target - Timers. Mar 25 01:16:09.403413 systemd[2185]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 25 01:16:09.445535 systemd[2185]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 25 01:16:09.445788 systemd[2185]: Reached target sockets.target - Sockets. Mar 25 01:16:09.445889 systemd[2185]: Reached target basic.target - Basic System. Mar 25 01:16:09.445973 systemd[2185]: Reached target default.target - Main User Target. Mar 25 01:16:09.446037 systemd[2185]: Startup finished in 342ms. Mar 25 01:16:09.446443 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 25 01:16:09.461523 systemd[1]: Started session-1.scope - Session 1 of User core. 
Mar 25 01:16:09.469403 systemd[1]: Startup finished in 1.077s (kernel) + 7.856s (initrd) + 8.882s (userspace) = 17.817s. Mar 25 01:16:09.627698 systemd[1]: Started sshd@1-172.31.24.136:22-147.75.109.163:60300.service - OpenSSH per-connection server daemon (147.75.109.163:60300). Mar 25 01:16:09.834192 sshd[2210]: Accepted publickey for core from 147.75.109.163 port 60300 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:16:09.836822 sshd-session[2210]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:09.847821 systemd-logind[1939]: New session 2 of user core. Mar 25 01:16:09.852548 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 25 01:16:09.982130 sshd[2212]: Connection closed by 147.75.109.163 port 60300 Mar 25 01:16:09.983850 sshd-session[2210]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:09.990560 systemd-logind[1939]: Session 2 logged out. Waiting for processes to exit. Mar 25 01:16:09.990563 systemd[1]: session-2.scope: Deactivated successfully. Mar 25 01:16:09.993963 systemd[1]: sshd@1-172.31.24.136:22-147.75.109.163:60300.service: Deactivated successfully. Mar 25 01:16:10.000129 systemd-logind[1939]: Removed session 2. Mar 25 01:16:10.020713 systemd[1]: Started sshd@2-172.31.24.136:22-147.75.109.163:60316.service - OpenSSH per-connection server daemon (147.75.109.163:60316). Mar 25 01:16:10.053270 amazon-ssm-agent[2143]: 2025-03-25 01:16:10 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker is not running, starting worker process Mar 25 01:16:10.154391 amazon-ssm-agent[2143]: 2025-03-25 01:16:10 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker (pid:2222) started Mar 25 01:16:10.159879 ntpd[1931]: Listen normally on 7 eth0 [fe80::497:cdff:fe1d:55ff%2]:123 Mar 25 01:16:10.277856 amazon-ssm-agent[2143]: 2025-03-25 01:16:10 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] Monitor long running worker health every 60 seconds Mar 25 01:16:10.318790 sshd[2221]: Accepted publickey for core from 147.75.109.163 port 60316 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:16:10.322111 sshd-session[2221]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:10.342765 systemd-logind[1939]: New session 3 of user core. Mar 25 01:16:10.352516 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 25 01:16:10.460339 kubelet[2196]: E0325 01:16:10.460203 2196 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:16:10.465010 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:16:10.465545 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:16:10.466544 systemd[1]: kubelet.service: Consumed 1.321s CPU time, 240.8M memory peak.
Mar 25 01:16:10.475301 sshd[2230]: Connection closed by 147.75.109.163 port 60316 Mar 25 01:16:10.476044 sshd-session[2221]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:10.481113 systemd[1]: sshd@2-172.31.24.136:22-147.75.109.163:60316.service: Deactivated successfully. Mar 25 01:16:10.484458 systemd[1]: session-3.scope: Deactivated successfully. Mar 25 01:16:10.487714 systemd-logind[1939]: Session 3 logged out. Waiting for processes to exit. Mar 25 01:16:10.489826 systemd-logind[1939]: Removed session 3. Mar 25 01:16:10.510916 systemd[1]: Started sshd@3-172.31.24.136:22-147.75.109.163:43970.service - OpenSSH per-connection server daemon (147.75.109.163:43970). Mar 25 01:16:10.699688 sshd[2241]: Accepted publickey for core from 147.75.109.163 port 43970 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:16:10.702698 sshd-session[2241]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:10.710635 systemd-logind[1939]: New session 4 of user core. Mar 25 01:16:10.722510 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 25 01:16:10.846286 sshd[2243]: Connection closed by 147.75.109.163 port 43970 Mar 25 01:16:10.847036 sshd-session[2241]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:10.852330 systemd[1]: sshd@3-172.31.24.136:22-147.75.109.163:43970.service: Deactivated successfully. Mar 25 01:16:10.855763 systemd[1]: session-4.scope: Deactivated successfully. Mar 25 01:16:10.858683 systemd-logind[1939]: Session 4 logged out. Waiting for processes to exit. Mar 25 01:16:10.860388 systemd-logind[1939]: Removed session 4. Mar 25 01:16:10.878511 systemd[1]: Started sshd@4-172.31.24.136:22-147.75.109.163:43986.service - OpenSSH per-connection server daemon (147.75.109.163:43986). Mar 25 01:16:11.068145 sshd[2249]: Accepted publickey for core from 147.75.109.163 port 43986 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:16:11.070928 sshd-session[2249]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:11.078890 systemd-logind[1939]: New session 5 of user core. Mar 25 01:16:11.092523 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 25 01:16:11.210198 sudo[2253]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 25 01:16:11.210925 sudo[2253]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:16:11.234853 sudo[2253]: pam_unix(sudo:session): session closed for user root Mar 25 01:16:11.257264 sshd[2252]: Connection closed by 147.75.109.163 port 43986 Mar 25 01:16:11.258488 sshd-session[2249]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:11.264003 systemd[1]: sshd@4-172.31.24.136:22-147.75.109.163:43986.service: Deactivated successfully. Mar 25 01:16:11.267036 systemd[1]: session-5.scope: Deactivated successfully. Mar 25 01:16:11.270521 systemd-logind[1939]: Session 5 logged out. Waiting for processes to exit. Mar 25 01:16:11.272592 systemd-logind[1939]: Removed session 5. Mar 25 01:16:11.299685 systemd[1]: Started sshd@5-172.31.24.136:22-147.75.109.163:43990.service - OpenSSH per-connection server daemon (147.75.109.163:43990). 
Mar 25 01:16:11.499939 sshd[2259]: Accepted publickey for core from 147.75.109.163 port 43990 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:16:11.502482 sshd-session[2259]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:11.512320 systemd-logind[1939]: New session 6 of user core. Mar 25 01:16:11.517509 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 25 01:16:11.621451 sudo[2263]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 25 01:16:11.622042 sudo[2263]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:16:11.629354 sudo[2263]: pam_unix(sudo:session): session closed for user root Mar 25 01:16:11.639562 sudo[2262]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 25 01:16:11.640184 sudo[2262]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:16:11.658884 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 25 01:16:11.720731 augenrules[2285]: No rules Mar 25 01:16:11.722964 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:16:11.723433 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 25 01:16:11.725456 sudo[2262]: pam_unix(sudo:session): session closed for user root Mar 25 01:16:11.748498 sshd[2261]: Connection closed by 147.75.109.163 port 43990 Mar 25 01:16:11.749451 sshd-session[2259]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:11.754492 systemd[1]: sshd@5-172.31.24.136:22-147.75.109.163:43990.service: Deactivated successfully. Mar 25 01:16:11.757719 systemd[1]: session-6.scope: Deactivated successfully. Mar 25 01:16:11.762349 systemd-logind[1939]: Session 6 logged out. Waiting for processes to exit. Mar 25 01:16:11.764077 systemd-logind[1939]: Removed session 6. Mar 25 01:16:11.781709 systemd[1]: Started sshd@6-172.31.24.136:22-147.75.109.163:44006.service - OpenSSH per-connection server daemon (147.75.109.163:44006). Mar 25 01:16:11.973278 sshd[2294]: Accepted publickey for core from 147.75.109.163 port 44006 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:16:11.975690 sshd-session[2294]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:11.984580 systemd-logind[1939]: New session 7 of user core. Mar 25 01:16:11.991545 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 25 01:16:12.093939 sudo[2297]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 25 01:16:12.095513 sudo[2297]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:16:12.561750 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 25 01:16:12.576824 (dockerd)[2315]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 25 01:16:12.914136 dockerd[2315]: time="2025-03-25T01:16:12.913917305Z" level=info msg="Starting up" Mar 25 01:16:12.919929 dockerd[2315]: time="2025-03-25T01:16:12.919853657Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 25 01:16:13.015821 dockerd[2315]: time="2025-03-25T01:16:13.015741745Z" level=info msg="Loading containers: start." 
Mar 25 01:16:13.258669 kernel: Initializing XFRM netlink socket Mar 25 01:16:13.260669 (udev-worker)[2342]: Network interface NamePolicy= disabled on kernel command line. Mar 25 01:16:13.377141 systemd-networkd[1869]: docker0: Link UP Mar 25 01:16:13.444529 dockerd[2315]: time="2025-03-25T01:16:13.444461104Z" level=info msg="Loading containers: done." Mar 25 01:16:13.471338 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2128320459-merged.mount: Deactivated successfully. Mar 25 01:16:13.474485 dockerd[2315]: time="2025-03-25T01:16:13.474414628Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 25 01:16:13.474627 dockerd[2315]: time="2025-03-25T01:16:13.474582028Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 Mar 25 01:16:13.474889 dockerd[2315]: time="2025-03-25T01:16:13.474842356Z" level=info msg="Daemon has completed initialization" Mar 25 01:16:13.525116 dockerd[2315]: time="2025-03-25T01:16:13.524541964Z" level=info msg="API listen on /run/docker.sock" Mar 25 01:16:13.524659 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 25 01:16:14.665319 containerd[1968]: time="2025-03-25T01:16:14.665260564Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\"" Mar 25 01:16:15.303738 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2308609919.mount: Deactivated successfully. Mar 25 01:16:16.865187 containerd[1968]: time="2025-03-25T01:16:16.865107529Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:16.866909 containerd[1968]: time="2025-03-25T01:16:16.866841058Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.11: active requests=0, bytes read=29793524" Mar 25 01:16:16.868740 containerd[1968]: time="2025-03-25T01:16:16.868645495Z" level=info msg="ImageCreate event name:\"sha256:fcbef283ab16167d1ca4acb66836af518e9fe445111fbc618fdbe196858f9530\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:16.873320 containerd[1968]: time="2025-03-25T01:16:16.873270610Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:16.875578 containerd[1968]: time="2025-03-25T01:16:16.875277350Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.11\" with image id \"sha256:fcbef283ab16167d1ca4acb66836af518e9fe445111fbc618fdbe196858f9530\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\", size \"29790324\" in 2.209953901s" Mar 25 01:16:16.875578 containerd[1968]: time="2025-03-25T01:16:16.875334681Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\" returns image reference \"sha256:fcbef283ab16167d1ca4acb66836af518e9fe445111fbc618fdbe196858f9530\"" Mar 25 01:16:16.908243 containerd[1968]: time="2025-03-25T01:16:16.908153552Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\"" Mar 25 01:16:18.685514 containerd[1968]: time="2025-03-25T01:16:18.685449664Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-controller-manager:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:18.687240 containerd[1968]: time="2025-03-25T01:16:18.687149754Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.11: active requests=0, bytes read=26861167" Mar 25 01:16:18.688182 containerd[1968]: time="2025-03-25T01:16:18.687779031Z" level=info msg="ImageCreate event name:\"sha256:9469d949b9e8c03b6cb06af513f683dd2975b57092f3deb2a9e125e0d05188d3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:18.693270 containerd[1968]: time="2025-03-25T01:16:18.692910748Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:18.700272 containerd[1968]: time="2025-03-25T01:16:18.699418021Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.11\" with image id \"sha256:9469d949b9e8c03b6cb06af513f683dd2975b57092f3deb2a9e125e0d05188d3\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\", size \"28301963\" in 1.791187048s" Mar 25 01:16:18.700272 containerd[1968]: time="2025-03-25T01:16:18.699487178Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\" returns image reference \"sha256:9469d949b9e8c03b6cb06af513f683dd2975b57092f3deb2a9e125e0d05188d3\"" Mar 25 01:16:18.732160 containerd[1968]: time="2025-03-25T01:16:18.732022174Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\"" Mar 25 01:16:19.922296 containerd[1968]: time="2025-03-25T01:16:19.921535980Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:19.923973 containerd[1968]: time="2025-03-25T01:16:19.923895859Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.11: active requests=0, bytes read=16264636" Mar 25 01:16:19.926765 containerd[1968]: time="2025-03-25T01:16:19.926713298Z" level=info msg="ImageCreate event name:\"sha256:3540cd10f52fac0a58ba43c004c6d3941e2a9f53e06440b982b9c130a72c0213\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:19.932323 containerd[1968]: time="2025-03-25T01:16:19.932254298Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:19.933904 containerd[1968]: time="2025-03-25T01:16:19.933692678Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.11\" with image id \"sha256:3540cd10f52fac0a58ba43c004c6d3941e2a9f53e06440b982b9c130a72c0213\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\", size \"17705450\" in 1.201610786s" Mar 25 01:16:19.933904 containerd[1968]: time="2025-03-25T01:16:19.933743905Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\" returns image reference \"sha256:3540cd10f52fac0a58ba43c004c6d3941e2a9f53e06440b982b9c130a72c0213\"" Mar 25 01:16:19.964250 containerd[1968]: time="2025-03-25T01:16:19.964180047Z" level=info msg="PullImage 
\"registry.k8s.io/kube-proxy:v1.30.11\"" Mar 25 01:16:20.564849 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 25 01:16:20.568631 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:16:20.930180 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:16:20.946838 (kubelet)[2618]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:16:21.083020 kubelet[2618]: E0325 01:16:21.082961 2618 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:16:21.092277 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:16:21.092605 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:16:21.093217 systemd[1]: kubelet.service: Consumed 324ms CPU time, 95M memory peak. Mar 25 01:16:21.408474 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2233120726.mount: Deactivated successfully. Mar 25 01:16:21.910078 containerd[1968]: time="2025-03-25T01:16:21.910009515Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:21.911757 containerd[1968]: time="2025-03-25T01:16:21.911682798Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.11: active requests=0, bytes read=25771848" Mar 25 01:16:21.912106 containerd[1968]: time="2025-03-25T01:16:21.912034103Z" level=info msg="ImageCreate event name:\"sha256:fe83790bf8a35411788b67fe5f0ce35309056c40530484d516af2ca01375220c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:21.914922 containerd[1968]: time="2025-03-25T01:16:21.914858018Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:21.916604 containerd[1968]: time="2025-03-25T01:16:21.916396464Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.11\" with image id \"sha256:fe83790bf8a35411788b67fe5f0ce35309056c40530484d516af2ca01375220c\", repo tag \"registry.k8s.io/kube-proxy:v1.30.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\", size \"25770867\" in 1.952139152s" Mar 25 01:16:21.916604 containerd[1968]: time="2025-03-25T01:16:21.916461952Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\" returns image reference \"sha256:fe83790bf8a35411788b67fe5f0ce35309056c40530484d516af2ca01375220c\"" Mar 25 01:16:21.946560 containerd[1968]: time="2025-03-25T01:16:21.946386586Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Mar 25 01:16:22.459749 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4205510914.mount: Deactivated successfully. 
Mar 25 01:16:23.455193 containerd[1968]: time="2025-03-25T01:16:23.455114033Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:23.456865 containerd[1968]: time="2025-03-25T01:16:23.456758243Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=16485381" Mar 25 01:16:23.458088 containerd[1968]: time="2025-03-25T01:16:23.457571928Z" level=info msg="ImageCreate event name:\"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:23.463058 containerd[1968]: time="2025-03-25T01:16:23.462996118Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:23.466129 containerd[1968]: time="2025-03-25T01:16:23.465896927Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"16482581\" in 1.51945247s" Mar 25 01:16:23.466129 containerd[1968]: time="2025-03-25T01:16:23.465974372Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\"" Mar 25 01:16:23.496397 containerd[1968]: time="2025-03-25T01:16:23.496217723Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Mar 25 01:16:23.975633 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount961460519.mount: Deactivated successfully. 
Mar 25 01:16:23.982943 containerd[1968]: time="2025-03-25T01:16:23.982869381Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:23.985127 containerd[1968]: time="2025-03-25T01:16:23.985011737Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=268821" Mar 25 01:16:23.986452 containerd[1968]: time="2025-03-25T01:16:23.986408691Z" level=info msg="ImageCreate event name:\"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:23.989450 containerd[1968]: time="2025-03-25T01:16:23.989369206Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:23.991280 containerd[1968]: time="2025-03-25T01:16:23.990947220Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"268051\" in 494.6355ms" Mar 25 01:16:23.991280 containerd[1968]: time="2025-03-25T01:16:23.991002513Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\"" Mar 25 01:16:24.022261 containerd[1968]: time="2025-03-25T01:16:24.021997359Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" Mar 25 01:16:24.608161 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2067392119.mount: Deactivated successfully. Mar 25 01:16:26.857303 containerd[1968]: time="2025-03-25T01:16:26.856682956Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:26.859095 containerd[1968]: time="2025-03-25T01:16:26.859005810Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=66191472" Mar 25 01:16:26.862430 containerd[1968]: time="2025-03-25T01:16:26.862340622Z" level=info msg="ImageCreate event name:\"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:26.869336 containerd[1968]: time="2025-03-25T01:16:26.869243781Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:26.871651 containerd[1968]: time="2025-03-25T01:16:26.871422095Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"66189079\" in 2.849369468s" Mar 25 01:16:26.871651 containerd[1968]: time="2025-03-25T01:16:26.871488266Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\"" Mar 25 01:16:31.343168 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
Mar 25 01:16:31.348533 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:16:31.687243 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:16:31.702091 (kubelet)[2832]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:16:31.791355 kubelet[2832]: E0325 01:16:31.791297 2832 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:16:31.796559 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:16:31.796872 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:16:31.797443 systemd[1]: kubelet.service: Consumed 290ms CPU time, 96.9M memory peak. Mar 25 01:16:33.568771 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:16:33.569742 systemd[1]: kubelet.service: Consumed 290ms CPU time, 96.9M memory peak. Mar 25 01:16:33.573786 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:16:33.619553 systemd[1]: Reload requested from client PID 2847 ('systemctl') (unit session-7.scope)... Mar 25 01:16:33.619579 systemd[1]: Reloading... Mar 25 01:16:33.874276 zram_generator::config[2896]: No configuration found. Mar 25 01:16:34.112846 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:16:34.334935 systemd[1]: Reloading finished in 714 ms. Mar 25 01:16:34.433265 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:16:34.441756 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:16:34.445041 systemd[1]: kubelet.service: Deactivated successfully. Mar 25 01:16:34.445586 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:16:34.445677 systemd[1]: kubelet.service: Consumed 220ms CPU time, 82.3M memory peak. Mar 25 01:16:34.448917 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:16:34.734979 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:16:34.749774 (kubelet)[2958]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:16:34.823988 kubelet[2958]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:16:34.823988 kubelet[2958]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 25 01:16:34.824572 kubelet[2958]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 25 01:16:34.824572 kubelet[2958]: I0325 01:16:34.824157 2958 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:16:36.667492 kubelet[2958]: I0325 01:16:36.667427 2958 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 25 01:16:36.667492 kubelet[2958]: I0325 01:16:36.667475 2958 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:16:36.668088 kubelet[2958]: I0325 01:16:36.667804 2958 server.go:927] "Client rotation is on, will bootstrap in background" Mar 25 01:16:36.694635 kubelet[2958]: E0325 01:16:36.694531 2958 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://172.31.24.136:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:36.695351 kubelet[2958]: I0325 01:16:36.694879 2958 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:16:36.708902 kubelet[2958]: I0325 01:16:36.708824 2958 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 25 01:16:36.709689 kubelet[2958]: I0325 01:16:36.709621 2958 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:16:36.709961 kubelet[2958]: I0325 01:16:36.709675 2958 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-24-136","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 25 01:16:36.710143 kubelet[2958]: I0325 01:16:36.709988 2958 topology_manager.go:138] "Creating topology manager with none policy" Mar 25 01:16:36.710143 kubelet[2958]: I0325 01:16:36.710009 2958 container_manager_linux.go:301] "Creating device plugin manager" Mar 25 01:16:36.710318 kubelet[2958]: I0325 01:16:36.710254 2958 state_mem.go:36] "Initialized new in-memory state 
store" Mar 25 01:16:36.713538 kubelet[2958]: I0325 01:16:36.711830 2958 kubelet.go:400] "Attempting to sync node with API server" Mar 25 01:16:36.713538 kubelet[2958]: I0325 01:16:36.711909 2958 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:16:36.713538 kubelet[2958]: I0325 01:16:36.712013 2958 kubelet.go:312] "Adding apiserver pod source" Mar 25 01:16:36.713538 kubelet[2958]: I0325 01:16:36.712121 2958 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:16:36.716283 kubelet[2958]: W0325 01:16:36.714056 2958 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.24.136:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:36.716283 kubelet[2958]: E0325 01:16:36.714143 2958 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.31.24.136:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:36.716283 kubelet[2958]: W0325 01:16:36.714294 2958 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.24.136:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-24-136&limit=500&resourceVersion=0": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:36.716283 kubelet[2958]: E0325 01:16:36.714351 2958 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.31.24.136:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-24-136&limit=500&resourceVersion=0": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:36.716283 kubelet[2958]: I0325 01:16:36.714526 2958 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:16:36.716283 kubelet[2958]: I0325 01:16:36.714862 2958 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 25 01:16:36.716283 kubelet[2958]: W0325 01:16:36.714947 2958 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Mar 25 01:16:36.716889 kubelet[2958]: I0325 01:16:36.716861 2958 server.go:1264] "Started kubelet" Mar 25 01:16:36.723469 kubelet[2958]: I0325 01:16:36.723433 2958 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:16:36.725740 kubelet[2958]: E0325 01:16:36.725459 2958 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.31.24.136:6443/api/v1/namespaces/default/events\": dial tcp 172.31.24.136:6443: connect: connection refused" event="&Event{ObjectMeta:{ip-172-31-24-136.182fe6d81ea06c84 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-24-136,UID:ip-172-31-24-136,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-24-136,},FirstTimestamp:2025-03-25 01:16:36.716825732 +0000 UTC m=+1.961039229,LastTimestamp:2025-03-25 01:16:36.716825732 +0000 UTC m=+1.961039229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-24-136,}" Mar 25 01:16:36.732302 kubelet[2958]: I0325 01:16:36.732195 2958 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 01:16:36.733710 kubelet[2958]: I0325 01:16:36.733657 2958 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 25 01:16:36.734402 kubelet[2958]: I0325 01:16:36.734369 2958 server.go:455] "Adding debug handlers to kubelet server" Mar 25 01:16:36.736156 kubelet[2958]: I0325 01:16:36.736077 2958 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 01:16:36.737026 kubelet[2958]: I0325 01:16:36.736610 2958 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:16:36.739068 kubelet[2958]: I0325 01:16:36.738875 2958 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 25 01:16:36.741275 kubelet[2958]: I0325 01:16:36.740981 2958 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:16:36.742532 kubelet[2958]: W0325 01:16:36.742439 2958 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.24.136:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:36.742843 kubelet[2958]: E0325 01:16:36.742818 2958 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.31.24.136:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:36.743264 kubelet[2958]: E0325 01:16:36.743159 2958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.24.136:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-24-136?timeout=10s\": dial tcp 172.31.24.136:6443: connect: connection refused" interval="200ms" Mar 25 01:16:36.748066 kubelet[2958]: I0325 01:16:36.746915 2958 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:16:36.751376 kubelet[2958]: I0325 01:16:36.750211 2958 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:16:36.751376 kubelet[2958]: E0325 
01:16:36.750787 2958 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 01:16:36.756768 kubelet[2958]: I0325 01:16:36.756726 2958 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:16:36.776913 kubelet[2958]: I0325 01:16:36.776858 2958 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 01:16:36.779296 kubelet[2958]: I0325 01:16:36.779251 2958 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 25 01:16:36.779557 kubelet[2958]: I0325 01:16:36.779537 2958 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 25 01:16:36.779674 kubelet[2958]: I0325 01:16:36.779655 2958 kubelet.go:2337] "Starting kubelet main sync loop" Mar 25 01:16:36.779854 kubelet[2958]: E0325 01:16:36.779824 2958 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:16:36.800669 kubelet[2958]: W0325 01:16:36.800598 2958 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.24.136:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:36.800941 kubelet[2958]: E0325 01:16:36.800891 2958 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.31.24.136:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:36.816820 kubelet[2958]: I0325 01:16:36.816790 2958 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 25 01:16:36.817006 kubelet[2958]: I0325 01:16:36.816986 2958 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 25 01:16:36.817117 kubelet[2958]: I0325 01:16:36.817099 2958 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:16:36.824806 kubelet[2958]: I0325 01:16:36.824636 2958 policy_none.go:49] "None policy: Start" Mar 25 01:16:36.825973 kubelet[2958]: I0325 01:16:36.825938 2958 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 25 01:16:36.826089 kubelet[2958]: I0325 01:16:36.825985 2958 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:16:36.837196 kubelet[2958]: I0325 01:16:36.836450 2958 kubelet_node_status.go:73] "Attempting to register node" node="ip-172-31-24-136" Mar 25 01:16:36.837196 kubelet[2958]: E0325 01:16:36.836978 2958 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.31.24.136:6443/api/v1/nodes\": dial tcp 172.31.24.136:6443: connect: connection refused" node="ip-172-31-24-136" Mar 25 01:16:36.842419 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 25 01:16:36.858865 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 25 01:16:36.866317 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Mar 25 01:16:36.876037 kubelet[2958]: I0325 01:16:36.875822 2958 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:16:36.876190 kubelet[2958]: I0325 01:16:36.876132 2958 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:16:36.876395 kubelet[2958]: I0325 01:16:36.876328 2958 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:16:36.881809 kubelet[2958]: E0325 01:16:36.881546 2958 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-172-31-24-136\" not found" Mar 25 01:16:36.888451 kubelet[2958]: I0325 01:16:36.888349 2958 topology_manager.go:215] "Topology Admit Handler" podUID="a82e5f913c2f595b25c1bedc2dac1958" podNamespace="kube-system" podName="kube-apiserver-ip-172-31-24-136" Mar 25 01:16:36.891310 kubelet[2958]: I0325 01:16:36.891190 2958 topology_manager.go:215] "Topology Admit Handler" podUID="1e757a8c7417a16a8eec13cb3e1a8edb" podNamespace="kube-system" podName="kube-controller-manager-ip-172-31-24-136" Mar 25 01:16:36.894114 kubelet[2958]: I0325 01:16:36.893696 2958 topology_manager.go:215] "Topology Admit Handler" podUID="9ad67edf5d1068bc163a85f1be0122f0" podNamespace="kube-system" podName="kube-scheduler-ip-172-31-24-136" Mar 25 01:16:36.906752 systemd[1]: Created slice kubepods-burstable-poda82e5f913c2f595b25c1bedc2dac1958.slice - libcontainer container kubepods-burstable-poda82e5f913c2f595b25c1bedc2dac1958.slice. Mar 25 01:16:36.933077 systemd[1]: Created slice kubepods-burstable-pod1e757a8c7417a16a8eec13cb3e1a8edb.slice - libcontainer container kubepods-burstable-pod1e757a8c7417a16a8eec13cb3e1a8edb.slice. Mar 25 01:16:36.942557 kubelet[2958]: I0325 01:16:36.942496 2958 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a82e5f913c2f595b25c1bedc2dac1958-k8s-certs\") pod \"kube-apiserver-ip-172-31-24-136\" (UID: \"a82e5f913c2f595b25c1bedc2dac1958\") " pod="kube-system/kube-apiserver-ip-172-31-24-136" Mar 25 01:16:36.942713 kubelet[2958]: I0325 01:16:36.942561 2958 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a82e5f913c2f595b25c1bedc2dac1958-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-24-136\" (UID: \"a82e5f913c2f595b25c1bedc2dac1958\") " pod="kube-system/kube-apiserver-ip-172-31-24-136" Mar 25 01:16:36.942713 kubelet[2958]: I0325 01:16:36.942608 2958 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1e757a8c7417a16a8eec13cb3e1a8edb-ca-certs\") pod \"kube-controller-manager-ip-172-31-24-136\" (UID: \"1e757a8c7417a16a8eec13cb3e1a8edb\") " pod="kube-system/kube-controller-manager-ip-172-31-24-136" Mar 25 01:16:36.942713 kubelet[2958]: I0325 01:16:36.942646 2958 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1e757a8c7417a16a8eec13cb3e1a8edb-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-24-136\" (UID: \"1e757a8c7417a16a8eec13cb3e1a8edb\") " pod="kube-system/kube-controller-manager-ip-172-31-24-136" Mar 25 01:16:36.942713 kubelet[2958]: I0325 01:16:36.942682 2958 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1e757a8c7417a16a8eec13cb3e1a8edb-k8s-certs\") pod \"kube-controller-manager-ip-172-31-24-136\" (UID: \"1e757a8c7417a16a8eec13cb3e1a8edb\") " pod="kube-system/kube-controller-manager-ip-172-31-24-136" Mar 25 01:16:36.942931 kubelet[2958]: I0325 01:16:36.942715 2958 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1e757a8c7417a16a8eec13cb3e1a8edb-kubeconfig\") pod \"kube-controller-manager-ip-172-31-24-136\" (UID: \"1e757a8c7417a16a8eec13cb3e1a8edb\") " pod="kube-system/kube-controller-manager-ip-172-31-24-136" Mar 25 01:16:36.942931 kubelet[2958]: I0325 01:16:36.942748 2958 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a82e5f913c2f595b25c1bedc2dac1958-ca-certs\") pod \"kube-apiserver-ip-172-31-24-136\" (UID: \"a82e5f913c2f595b25c1bedc2dac1958\") " pod="kube-system/kube-apiserver-ip-172-31-24-136" Mar 25 01:16:36.942931 kubelet[2958]: I0325 01:16:36.942782 2958 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1e757a8c7417a16a8eec13cb3e1a8edb-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-24-136\" (UID: \"1e757a8c7417a16a8eec13cb3e1a8edb\") " pod="kube-system/kube-controller-manager-ip-172-31-24-136" Mar 25 01:16:36.942931 kubelet[2958]: I0325 01:16:36.942819 2958 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/9ad67edf5d1068bc163a85f1be0122f0-kubeconfig\") pod \"kube-scheduler-ip-172-31-24-136\" (UID: \"9ad67edf5d1068bc163a85f1be0122f0\") " pod="kube-system/kube-scheduler-ip-172-31-24-136" Mar 25 01:16:36.944319 kubelet[2958]: E0325 01:16:36.944205 2958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.24.136:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-24-136?timeout=10s\": dial tcp 172.31.24.136:6443: connect: connection refused" interval="400ms" Mar 25 01:16:36.945365 systemd[1]: Created slice kubepods-burstable-pod9ad67edf5d1068bc163a85f1be0122f0.slice - libcontainer container kubepods-burstable-pod9ad67edf5d1068bc163a85f1be0122f0.slice. 
Mar 25 01:16:37.040080 kubelet[2958]: I0325 01:16:37.039519 2958 kubelet_node_status.go:73] "Attempting to register node" node="ip-172-31-24-136" Mar 25 01:16:37.040080 kubelet[2958]: E0325 01:16:37.039998 2958 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.31.24.136:6443/api/v1/nodes\": dial tcp 172.31.24.136:6443: connect: connection refused" node="ip-172-31-24-136" Mar 25 01:16:37.229718 containerd[1968]: time="2025-03-25T01:16:37.229562265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-24-136,Uid:a82e5f913c2f595b25c1bedc2dac1958,Namespace:kube-system,Attempt:0,}" Mar 25 01:16:37.242562 containerd[1968]: time="2025-03-25T01:16:37.242504882Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-24-136,Uid:1e757a8c7417a16a8eec13cb3e1a8edb,Namespace:kube-system,Attempt:0,}" Mar 25 01:16:37.251459 containerd[1968]: time="2025-03-25T01:16:37.251403243Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-24-136,Uid:9ad67edf5d1068bc163a85f1be0122f0,Namespace:kube-system,Attempt:0,}" Mar 25 01:16:37.317065 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Mar 25 01:16:37.345443 kubelet[2958]: E0325 01:16:37.345377 2958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.24.136:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-24-136?timeout=10s\": dial tcp 172.31.24.136:6443: connect: connection refused" interval="800ms" Mar 25 01:16:37.443012 kubelet[2958]: I0325 01:16:37.442474 2958 kubelet_node_status.go:73] "Attempting to register node" node="ip-172-31-24-136" Mar 25 01:16:37.443012 kubelet[2958]: E0325 01:16:37.442938 2958 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.31.24.136:6443/api/v1/nodes\": dial tcp 172.31.24.136:6443: connect: connection refused" node="ip-172-31-24-136" Mar 25 01:16:37.673539 kubelet[2958]: W0325 01:16:37.673375 2958 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.24.136:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-24-136&limit=500&resourceVersion=0": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:37.673539 kubelet[2958]: E0325 01:16:37.673470 2958 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.31.24.136:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-24-136&limit=500&resourceVersion=0": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:37.776952 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2751516869.mount: Deactivated successfully. 
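The three RunPodSandbox requests above come from static pod manifests that the kubelet reads from /etc/kubernetes/manifests ("Adding static pod path" earlier in the log) rather than from the API server, which is why they can start while every HTTPS call to 172.31.24.136:6443 is still refused. A shape-only sketch of one such manifest, using the kube-scheduler pod whose hostPath kubeconfig volume the reconciler attached above; the image tag and the scheduler.conf path are kubeadm conventions assumed here, not values taken from this log:

    # /etc/kubernetes/manifests/kube-scheduler.yaml (sketch, not the file from this node)
    apiVersion: v1
    kind: Pod
    metadata:
      name: kube-scheduler
      namespace: kube-system
    spec:
      hostNetwork: true
      priorityClassName: system-node-critical
      containers:
      - name: kube-scheduler
        image: registry.k8s.io/kube-scheduler:v1.30.1   # assumed to match "Kubelet version ... v1.30.1"
        command:
        - kube-scheduler
        - --kubeconfig=/etc/kubernetes/scheduler.conf   # conventional kubeadm path; assumption
        volumeMounts:
        - name: kubeconfig
          mountPath: /etc/kubernetes/scheduler.conf
          readOnly: true
      volumes:
      - name: kubeconfig                                # the "kubeconfig" hostPath volume logged above
        hostPath:
          path: /etc/kubernetes/scheduler.conf
          type: FileOrCreate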
Mar 25 01:16:37.795311 containerd[1968]: time="2025-03-25T01:16:37.794649669Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:16:37.803579 containerd[1968]: time="2025-03-25T01:16:37.803337235Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703" Mar 25 01:16:37.806204 containerd[1968]: time="2025-03-25T01:16:37.805287508Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:16:37.808448 containerd[1968]: time="2025-03-25T01:16:37.808270524Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:16:37.812400 containerd[1968]: time="2025-03-25T01:16:37.812316675Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:16:37.814964 containerd[1968]: time="2025-03-25T01:16:37.814880777Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 25 01:16:37.816917 containerd[1968]: time="2025-03-25T01:16:37.816505376Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 25 01:16:37.817905 kubelet[2958]: W0325 01:16:37.817858 2958 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.24.136:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:37.818133 kubelet[2958]: E0325 01:16:37.818109 2958 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.31.24.136:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:37.819888 containerd[1968]: time="2025-03-25T01:16:37.819784979Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:16:37.822040 containerd[1968]: time="2025-03-25T01:16:37.821187126Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 585.576599ms" Mar 25 01:16:37.824840 containerd[1968]: time="2025-03-25T01:16:37.824321350Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 568.320663ms" Mar 25 01:16:37.831957 containerd[1968]: time="2025-03-25T01:16:37.831749246Z" 
level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 584.441655ms" Mar 25 01:16:37.906869 containerd[1968]: time="2025-03-25T01:16:37.906758979Z" level=info msg="connecting to shim 6a688400b8fc500a91ae7aef44757da1c97f68dae741c3de3ce4122a8dca504f" address="unix:///run/containerd/s/bc7db725d8ccd97e812a4a93e6e78a6ec76c06dbd524487685651ddbda811501" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:16:37.909900 containerd[1968]: time="2025-03-25T01:16:37.909103026Z" level=info msg="connecting to shim af9d90b68aa524235bf0d0d16506a9ecb6e5748d95477805136c40d3172aeb2f" address="unix:///run/containerd/s/5ae36c6e8b67ce5b05884d1be229b3d44eb275bb35cbe0bd28e76bd27804c27c" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:16:37.932135 containerd[1968]: time="2025-03-25T01:16:37.931174973Z" level=info msg="connecting to shim 5aee85ef2b0ed88483c492f18957ef3a7e320cf794acec9e366619b26988df41" address="unix:///run/containerd/s/c167ececfab706fe70c5d5028368abc32b242f7f8c7c9c284c7b681977adcae8" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:16:37.977552 systemd[1]: Started cri-containerd-6a688400b8fc500a91ae7aef44757da1c97f68dae741c3de3ce4122a8dca504f.scope - libcontainer container 6a688400b8fc500a91ae7aef44757da1c97f68dae741c3de3ce4122a8dca504f. Mar 25 01:16:37.991019 systemd[1]: Started cri-containerd-af9d90b68aa524235bf0d0d16506a9ecb6e5748d95477805136c40d3172aeb2f.scope - libcontainer container af9d90b68aa524235bf0d0d16506a9ecb6e5748d95477805136c40d3172aeb2f. Mar 25 01:16:38.016547 systemd[1]: Started cri-containerd-5aee85ef2b0ed88483c492f18957ef3a7e320cf794acec9e366619b26988df41.scope - libcontainer container 5aee85ef2b0ed88483c492f18957ef3a7e320cf794acec9e366619b26988df41. 
Mar 25 01:16:38.104974 containerd[1968]: time="2025-03-25T01:16:38.104891248Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-24-136,Uid:9ad67edf5d1068bc163a85f1be0122f0,Namespace:kube-system,Attempt:0,} returns sandbox id \"6a688400b8fc500a91ae7aef44757da1c97f68dae741c3de3ce4122a8dca504f\"" Mar 25 01:16:38.119114 containerd[1968]: time="2025-03-25T01:16:38.119047226Z" level=info msg="CreateContainer within sandbox \"6a688400b8fc500a91ae7aef44757da1c97f68dae741c3de3ce4122a8dca504f\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 25 01:16:38.131257 containerd[1968]: time="2025-03-25T01:16:38.129265779Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-24-136,Uid:a82e5f913c2f595b25c1bedc2dac1958,Namespace:kube-system,Attempt:0,} returns sandbox id \"af9d90b68aa524235bf0d0d16506a9ecb6e5748d95477805136c40d3172aeb2f\"" Mar 25 01:16:38.134055 kubelet[2958]: W0325 01:16:38.133924 2958 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.24.136:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:38.134055 kubelet[2958]: E0325 01:16:38.134016 2958 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.31.24.136:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:38.140280 containerd[1968]: time="2025-03-25T01:16:38.140072037Z" level=info msg="CreateContainer within sandbox \"af9d90b68aa524235bf0d0d16506a9ecb6e5748d95477805136c40d3172aeb2f\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 25 01:16:38.146028 kubelet[2958]: E0325 01:16:38.145969 2958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.24.136:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-24-136?timeout=10s\": dial tcp 172.31.24.136:6443: connect: connection refused" interval="1.6s" Mar 25 01:16:38.152439 containerd[1968]: time="2025-03-25T01:16:38.151594888Z" level=info msg="Container 75f6b5452996627a529931b19b5d2d2927ed1ae73ed104bbb1e9cc4a10352b3f: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:38.164079 containerd[1968]: time="2025-03-25T01:16:38.164024523Z" level=info msg="Container ef16375b10abbc5349bdb2e054953f34844ce061677b7dc25873af9654dce32f: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:38.174640 containerd[1968]: time="2025-03-25T01:16:38.174586235Z" level=info msg="CreateContainer within sandbox \"6a688400b8fc500a91ae7aef44757da1c97f68dae741c3de3ce4122a8dca504f\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"75f6b5452996627a529931b19b5d2d2927ed1ae73ed104bbb1e9cc4a10352b3f\"" Mar 25 01:16:38.176245 containerd[1968]: time="2025-03-25T01:16:38.176179758Z" level=info msg="StartContainer for \"75f6b5452996627a529931b19b5d2d2927ed1ae73ed104bbb1e9cc4a10352b3f\"" Mar 25 01:16:38.178895 containerd[1968]: time="2025-03-25T01:16:38.178842690Z" level=info msg="connecting to shim 75f6b5452996627a529931b19b5d2d2927ed1ae73ed104bbb1e9cc4a10352b3f" address="unix:///run/containerd/s/bc7db725d8ccd97e812a4a93e6e78a6ec76c06dbd524487685651ddbda811501" protocol=ttrpc version=3 Mar 25 01:16:38.181769 containerd[1968]: time="2025-03-25T01:16:38.181704602Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-24-136,Uid:1e757a8c7417a16a8eec13cb3e1a8edb,Namespace:kube-system,Attempt:0,} returns sandbox id \"5aee85ef2b0ed88483c492f18957ef3a7e320cf794acec9e366619b26988df41\"" Mar 25 01:16:38.187779 containerd[1968]: time="2025-03-25T01:16:38.186874076Z" level=info msg="CreateContainer within sandbox \"af9d90b68aa524235bf0d0d16506a9ecb6e5748d95477805136c40d3172aeb2f\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"ef16375b10abbc5349bdb2e054953f34844ce061677b7dc25873af9654dce32f\"" Mar 25 01:16:38.190287 containerd[1968]: time="2025-03-25T01:16:38.190109901Z" level=info msg="StartContainer for \"ef16375b10abbc5349bdb2e054953f34844ce061677b7dc25873af9654dce32f\"" Mar 25 01:16:38.192292 containerd[1968]: time="2025-03-25T01:16:38.190972078Z" level=info msg="CreateContainer within sandbox \"5aee85ef2b0ed88483c492f18957ef3a7e320cf794acec9e366619b26988df41\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 25 01:16:38.193562 containerd[1968]: time="2025-03-25T01:16:38.193458830Z" level=info msg="connecting to shim ef16375b10abbc5349bdb2e054953f34844ce061677b7dc25873af9654dce32f" address="unix:///run/containerd/s/5ae36c6e8b67ce5b05884d1be229b3d44eb275bb35cbe0bd28e76bd27804c27c" protocol=ttrpc version=3 Mar 25 01:16:38.219566 containerd[1968]: time="2025-03-25T01:16:38.219348467Z" level=info msg="Container 47e40a9f6df5fdd1afa520c6b24ea8739b068d164b7d1abdaaa4f3258f592341: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:38.231530 systemd[1]: Started cri-containerd-75f6b5452996627a529931b19b5d2d2927ed1ae73ed104bbb1e9cc4a10352b3f.scope - libcontainer container 75f6b5452996627a529931b19b5d2d2927ed1ae73ed104bbb1e9cc4a10352b3f. Mar 25 01:16:38.239461 containerd[1968]: time="2025-03-25T01:16:38.239349769Z" level=info msg="CreateContainer within sandbox \"5aee85ef2b0ed88483c492f18957ef3a7e320cf794acec9e366619b26988df41\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"47e40a9f6df5fdd1afa520c6b24ea8739b068d164b7d1abdaaa4f3258f592341\"" Mar 25 01:16:38.243932 containerd[1968]: time="2025-03-25T01:16:38.240532438Z" level=info msg="StartContainer for \"47e40a9f6df5fdd1afa520c6b24ea8739b068d164b7d1abdaaa4f3258f592341\"" Mar 25 01:16:38.243591 systemd[1]: Started cri-containerd-ef16375b10abbc5349bdb2e054953f34844ce061677b7dc25873af9654dce32f.scope - libcontainer container ef16375b10abbc5349bdb2e054953f34844ce061677b7dc25873af9654dce32f. 
Mar 25 01:16:38.245066 containerd[1968]: time="2025-03-25T01:16:38.244619357Z" level=info msg="connecting to shim 47e40a9f6df5fdd1afa520c6b24ea8739b068d164b7d1abdaaa4f3258f592341" address="unix:///run/containerd/s/c167ececfab706fe70c5d5028368abc32b242f7f8c7c9c284c7b681977adcae8" protocol=ttrpc version=3 Mar 25 01:16:38.250260 kubelet[2958]: I0325 01:16:38.248465 2958 kubelet_node_status.go:73] "Attempting to register node" node="ip-172-31-24-136" Mar 25 01:16:38.251045 kubelet[2958]: E0325 01:16:38.250946 2958 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.31.24.136:6443/api/v1/nodes\": dial tcp 172.31.24.136:6443: connect: connection refused" node="ip-172-31-24-136" Mar 25 01:16:38.265021 kubelet[2958]: W0325 01:16:38.264938 2958 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.24.136:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:38.265021 kubelet[2958]: E0325 01:16:38.265033 2958 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.31.24.136:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.31.24.136:6443: connect: connection refused Mar 25 01:16:38.305637 systemd[1]: Started cri-containerd-47e40a9f6df5fdd1afa520c6b24ea8739b068d164b7d1abdaaa4f3258f592341.scope - libcontainer container 47e40a9f6df5fdd1afa520c6b24ea8739b068d164b7d1abdaaa4f3258f592341. Mar 25 01:16:38.396108 containerd[1968]: time="2025-03-25T01:16:38.395462139Z" level=info msg="StartContainer for \"75f6b5452996627a529931b19b5d2d2927ed1ae73ed104bbb1e9cc4a10352b3f\" returns successfully" Mar 25 01:16:38.401622 containerd[1968]: time="2025-03-25T01:16:38.401555978Z" level=info msg="StartContainer for \"ef16375b10abbc5349bdb2e054953f34844ce061677b7dc25873af9654dce32f\" returns successfully" Mar 25 01:16:38.471405 containerd[1968]: time="2025-03-25T01:16:38.470015368Z" level=info msg="StartContainer for \"47e40a9f6df5fdd1afa520c6b24ea8739b068d164b7d1abdaaa4f3258f592341\" returns successfully" Mar 25 01:16:39.857549 kubelet[2958]: I0325 01:16:39.854220 2958 kubelet_node_status.go:73] "Attempting to register node" node="ip-172-31-24-136" Mar 25 01:16:41.659061 kubelet[2958]: E0325 01:16:41.658972 2958 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-172-31-24-136\" not found" node="ip-172-31-24-136" Mar 25 01:16:41.714449 kubelet[2958]: I0325 01:16:41.714378 2958 kubelet_node_status.go:76] "Successfully registered node" node="ip-172-31-24-136" Mar 25 01:16:41.717959 kubelet[2958]: I0325 01:16:41.717634 2958 apiserver.go:52] "Watching apiserver" Mar 25 01:16:41.739316 kubelet[2958]: I0325 01:16:41.739221 2958 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 25 01:16:43.714746 systemd[1]: Reload requested from client PID 3230 ('systemctl') (unit session-7.scope)... Mar 25 01:16:43.714771 systemd[1]: Reloading... Mar 25 01:16:43.921277 zram_generator::config[3281]: No configuration found. Mar 25 01:16:44.159691 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:16:44.416176 systemd[1]: Reloading finished in 700 ms. 
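The nodelease.go and "Failed to ensure lease exists, will retry" errors in this run refer to the per-node heartbeat Lease that the kubelet keeps in the kube-node-lease namespace; the posts only start succeeding once the static kube-apiserver container launched above is serving on 172.31.24.136:6443. The object itself is a standard coordination.k8s.io/v1 Lease, sketched here with illustrative values (only the name and namespace appear in the log):

    apiVersion: coordination.k8s.io/v1
    kind: Lease
    metadata:
      name: ip-172-31-24-136            # one Lease per node, named after the node
      namespace: kube-node-lease
    spec:
      holderIdentity: ip-172-31-24-136
      leaseDurationSeconds: 40          # kubelet default; assumption, not shown in the log
      renewTime: "2025-03-25T01:16:46.000000Z"   # illustrative; renewed roughly every 10s by the kubelet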
Mar 25 01:16:44.454542 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:16:44.468949 systemd[1]: kubelet.service: Deactivated successfully. Mar 25 01:16:44.469459 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:16:44.469561 systemd[1]: kubelet.service: Consumed 2.660s CPU time, 114.1M memory peak. Mar 25 01:16:44.472994 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:16:44.788560 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:16:44.809269 (kubelet)[3335]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:16:44.916050 kubelet[3335]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:16:44.916050 kubelet[3335]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 25 01:16:44.916050 kubelet[3335]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:16:44.916050 kubelet[3335]: I0325 01:16:44.915580 3335 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:16:44.923904 kubelet[3335]: I0325 01:16:44.923844 3335 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 25 01:16:44.923904 kubelet[3335]: I0325 01:16:44.923888 3335 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:16:44.924462 kubelet[3335]: I0325 01:16:44.924407 3335 server.go:927] "Client rotation is on, will bootstrap in background" Mar 25 01:16:44.927699 kubelet[3335]: I0325 01:16:44.927657 3335 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 25 01:16:44.932306 kubelet[3335]: I0325 01:16:44.932212 3335 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:16:44.949019 kubelet[3335]: I0325 01:16:44.948964 3335 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 25 01:16:44.949523 kubelet[3335]: I0325 01:16:44.949459 3335 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:16:44.949832 kubelet[3335]: I0325 01:16:44.949515 3335 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-24-136","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 25 01:16:44.949981 kubelet[3335]: I0325 01:16:44.949838 3335 topology_manager.go:138] "Creating topology manager with none policy" Mar 25 01:16:44.949981 kubelet[3335]: I0325 01:16:44.949858 3335 container_manager_linux.go:301] "Creating device plugin manager" Mar 25 01:16:44.949981 kubelet[3335]: I0325 01:16:44.949916 3335 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:16:44.950198 kubelet[3335]: I0325 01:16:44.950182 3335 kubelet.go:400] "Attempting to sync node with API server" Mar 25 01:16:44.951300 kubelet[3335]: I0325 01:16:44.950925 3335 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:16:44.951300 kubelet[3335]: I0325 01:16:44.950999 3335 kubelet.go:312] "Adding apiserver pod source" Mar 25 01:16:44.951300 kubelet[3335]: I0325 01:16:44.951044 3335 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:16:44.956872 kubelet[3335]: I0325 01:16:44.954496 3335 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:16:44.956872 kubelet[3335]: I0325 01:16:44.954790 3335 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 25 01:16:44.956872 kubelet[3335]: I0325 01:16:44.955519 3335 server.go:1264] "Started kubelet" Mar 25 01:16:44.965594 kubelet[3335]: I0325 01:16:44.965543 3335 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:16:44.974419 kubelet[3335]: I0325 01:16:44.974329 3335 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 01:16:44.978756 kubelet[3335]: I0325 01:16:44.978689 3335 server.go:455] "Adding debug 
handlers to kubelet server" Mar 25 01:16:44.982273 kubelet[3335]: I0325 01:16:44.982127 3335 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 01:16:44.982600 kubelet[3335]: I0325 01:16:44.982557 3335 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:16:45.002469 kubelet[3335]: I0325 01:16:45.002426 3335 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 25 01:16:45.026397 kubelet[3335]: I0325 01:16:45.026331 3335 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:16:45.026548 kubelet[3335]: I0325 01:16:45.026518 3335 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:16:45.030944 kubelet[3335]: I0325 01:16:45.002765 3335 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 25 01:16:45.031237 kubelet[3335]: I0325 01:16:45.031181 3335 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:16:45.035271 kubelet[3335]: I0325 01:16:45.034386 3335 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:16:45.039968 kubelet[3335]: E0325 01:16:45.039391 3335 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 01:16:45.061264 kubelet[3335]: I0325 01:16:45.060821 3335 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 01:16:45.063493 kubelet[3335]: I0325 01:16:45.063405 3335 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 25 01:16:45.063686 kubelet[3335]: I0325 01:16:45.063502 3335 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 25 01:16:45.063686 kubelet[3335]: I0325 01:16:45.063535 3335 kubelet.go:2337] "Starting kubelet main sync loop" Mar 25 01:16:45.063686 kubelet[3335]: E0325 01:16:45.063622 3335 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:16:45.116932 kubelet[3335]: I0325 01:16:45.116879 3335 kubelet_node_status.go:73] "Attempting to register node" node="ip-172-31-24-136" Mar 25 01:16:45.144050 kubelet[3335]: I0325 01:16:45.143560 3335 kubelet_node_status.go:112] "Node was previously registered" node="ip-172-31-24-136" Mar 25 01:16:45.144050 kubelet[3335]: I0325 01:16:45.143683 3335 kubelet_node_status.go:76] "Successfully registered node" node="ip-172-31-24-136" Mar 25 01:16:45.163731 kubelet[3335]: E0325 01:16:45.163674 3335 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 25 01:16:45.180037 kubelet[3335]: I0325 01:16:45.179980 3335 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 25 01:16:45.180037 kubelet[3335]: I0325 01:16:45.180015 3335 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 25 01:16:45.180299 kubelet[3335]: I0325 01:16:45.180055 3335 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:16:45.180736 kubelet[3335]: I0325 01:16:45.180541 3335 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 25 01:16:45.180736 kubelet[3335]: I0325 01:16:45.180581 3335 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 25 01:16:45.180736 kubelet[3335]: I0325 
01:16:45.180621 3335 policy_none.go:49] "None policy: Start" Mar 25 01:16:45.182310 kubelet[3335]: I0325 01:16:45.182257 3335 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 25 01:16:45.182444 kubelet[3335]: I0325 01:16:45.182319 3335 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:16:45.182908 kubelet[3335]: I0325 01:16:45.182593 3335 state_mem.go:75] "Updated machine memory state" Mar 25 01:16:45.196657 kubelet[3335]: I0325 01:16:45.195811 3335 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:16:45.196657 kubelet[3335]: I0325 01:16:45.196108 3335 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:16:45.197923 kubelet[3335]: I0325 01:16:45.197206 3335 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:16:45.365437 kubelet[3335]: I0325 01:16:45.364570 3335 topology_manager.go:215] "Topology Admit Handler" podUID="a82e5f913c2f595b25c1bedc2dac1958" podNamespace="kube-system" podName="kube-apiserver-ip-172-31-24-136" Mar 25 01:16:45.365437 kubelet[3335]: I0325 01:16:45.364793 3335 topology_manager.go:215] "Topology Admit Handler" podUID="1e757a8c7417a16a8eec13cb3e1a8edb" podNamespace="kube-system" podName="kube-controller-manager-ip-172-31-24-136" Mar 25 01:16:45.367485 kubelet[3335]: I0325 01:16:45.367250 3335 topology_manager.go:215] "Topology Admit Handler" podUID="9ad67edf5d1068bc163a85f1be0122f0" podNamespace="kube-system" podName="kube-scheduler-ip-172-31-24-136" Mar 25 01:16:45.377703 kubelet[3335]: E0325 01:16:45.377574 3335 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ip-172-31-24-136\" already exists" pod="kube-system/kube-scheduler-ip-172-31-24-136" Mar 25 01:16:45.445483 kubelet[3335]: I0325 01:16:45.445418 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1e757a8c7417a16a8eec13cb3e1a8edb-kubeconfig\") pod \"kube-controller-manager-ip-172-31-24-136\" (UID: \"1e757a8c7417a16a8eec13cb3e1a8edb\") " pod="kube-system/kube-controller-manager-ip-172-31-24-136" Mar 25 01:16:45.445617 kubelet[3335]: I0325 01:16:45.445499 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1e757a8c7417a16a8eec13cb3e1a8edb-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-24-136\" (UID: \"1e757a8c7417a16a8eec13cb3e1a8edb\") " pod="kube-system/kube-controller-manager-ip-172-31-24-136" Mar 25 01:16:45.445617 kubelet[3335]: I0325 01:16:45.445547 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1e757a8c7417a16a8eec13cb3e1a8edb-ca-certs\") pod \"kube-controller-manager-ip-172-31-24-136\" (UID: \"1e757a8c7417a16a8eec13cb3e1a8edb\") " pod="kube-system/kube-controller-manager-ip-172-31-24-136" Mar 25 01:16:45.445617 kubelet[3335]: I0325 01:16:45.445585 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1e757a8c7417a16a8eec13cb3e1a8edb-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-24-136\" (UID: \"1e757a8c7417a16a8eec13cb3e1a8edb\") " pod="kube-system/kube-controller-manager-ip-172-31-24-136" Mar 25 01:16:45.445804 kubelet[3335]: 
I0325 01:16:45.445623 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1e757a8c7417a16a8eec13cb3e1a8edb-k8s-certs\") pod \"kube-controller-manager-ip-172-31-24-136\" (UID: \"1e757a8c7417a16a8eec13cb3e1a8edb\") " pod="kube-system/kube-controller-manager-ip-172-31-24-136" Mar 25 01:16:45.445804 kubelet[3335]: I0325 01:16:45.445657 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/9ad67edf5d1068bc163a85f1be0122f0-kubeconfig\") pod \"kube-scheduler-ip-172-31-24-136\" (UID: \"9ad67edf5d1068bc163a85f1be0122f0\") " pod="kube-system/kube-scheduler-ip-172-31-24-136" Mar 25 01:16:45.445804 kubelet[3335]: I0325 01:16:45.445691 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a82e5f913c2f595b25c1bedc2dac1958-ca-certs\") pod \"kube-apiserver-ip-172-31-24-136\" (UID: \"a82e5f913c2f595b25c1bedc2dac1958\") " pod="kube-system/kube-apiserver-ip-172-31-24-136" Mar 25 01:16:45.445804 kubelet[3335]: I0325 01:16:45.445723 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a82e5f913c2f595b25c1bedc2dac1958-k8s-certs\") pod \"kube-apiserver-ip-172-31-24-136\" (UID: \"a82e5f913c2f595b25c1bedc2dac1958\") " pod="kube-system/kube-apiserver-ip-172-31-24-136" Mar 25 01:16:45.445804 kubelet[3335]: I0325 01:16:45.445759 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a82e5f913c2f595b25c1bedc2dac1958-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-24-136\" (UID: \"a82e5f913c2f595b25c1bedc2dac1958\") " pod="kube-system/kube-apiserver-ip-172-31-24-136" Mar 25 01:16:45.964262 kubelet[3335]: I0325 01:16:45.963028 3335 apiserver.go:52] "Watching apiserver" Mar 25 01:16:46.031679 kubelet[3335]: I0325 01:16:46.031599 3335 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 25 01:16:46.182847 kubelet[3335]: E0325 01:16:46.182777 3335 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ip-172-31-24-136\" already exists" pod="kube-system/kube-apiserver-ip-172-31-24-136" Mar 25 01:16:46.383151 kubelet[3335]: I0325 01:16:46.382050 3335 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ip-172-31-24-136" podStartSLOduration=1.382026582 podStartE2EDuration="1.382026582s" podCreationTimestamp="2025-03-25 01:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:16:46.337144507 +0000 UTC m=+1.518575195" watchObservedRunningTime="2025-03-25 01:16:46.382026582 +0000 UTC m=+1.563457246" Mar 25 01:16:46.437628 kubelet[3335]: I0325 01:16:46.437555 3335 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ip-172-31-24-136" podStartSLOduration=1.437533975 podStartE2EDuration="1.437533975s" podCreationTimestamp="2025-03-25 01:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:16:46.382516202 +0000 UTC m=+1.563946866" 
watchObservedRunningTime="2025-03-25 01:16:46.437533975 +0000 UTC m=+1.618964651" Mar 25 01:16:46.438264 kubelet[3335]: I0325 01:16:46.438017 3335 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ip-172-31-24-136" podStartSLOduration=2.438004488 podStartE2EDuration="2.438004488s" podCreationTimestamp="2025-03-25 01:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:16:46.432084201 +0000 UTC m=+1.613514853" watchObservedRunningTime="2025-03-25 01:16:46.438004488 +0000 UTC m=+1.619435152" Mar 25 01:16:51.658373 update_engine[1941]: I20250325 01:16:51.658285 1941 update_attempter.cc:509] Updating boot flags... Mar 25 01:16:51.791434 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 39 scanned by (udev-worker) (3414) Mar 25 01:16:51.933342 sudo[2297]: pam_unix(sudo:session): session closed for user root Mar 25 01:16:51.959679 sshd[2296]: Connection closed by 147.75.109.163 port 44006 Mar 25 01:16:51.961512 sshd-session[2294]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:51.984549 systemd[1]: sshd@6-172.31.24.136:22-147.75.109.163:44006.service: Deactivated successfully. Mar 25 01:16:51.996350 systemd[1]: session-7.scope: Deactivated successfully. Mar 25 01:16:51.996731 systemd[1]: session-7.scope: Consumed 10.070s CPU time, 246.6M memory peak. Mar 25 01:16:52.002954 systemd-logind[1939]: Session 7 logged out. Waiting for processes to exit. Mar 25 01:16:52.009023 systemd-logind[1939]: Removed session 7. Mar 25 01:16:52.233393 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 39 scanned by (udev-worker) (3404) Mar 25 01:16:59.666050 kubelet[3335]: I0325 01:16:59.665994 3335 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 25 01:16:59.667626 containerd[1968]: time="2025-03-25T01:16:59.666987990Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 25 01:16:59.668208 kubelet[3335]: I0325 01:16:59.667350 3335 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 25 01:17:00.386801 kubelet[3335]: I0325 01:17:00.386700 3335 topology_manager.go:215] "Topology Admit Handler" podUID="ea7c6eb3-5767-4410-a9c1-c5223020f64d" podNamespace="kube-system" podName="kube-proxy-cqk67" Mar 25 01:17:00.418750 systemd[1]: Created slice kubepods-besteffort-podea7c6eb3_5767_4410_a9c1_c5223020f64d.slice - libcontainer container kubepods-besteffort-podea7c6eb3_5767_4410_a9c1_c5223020f64d.slice. 
Mar 25 01:17:00.447776 kubelet[3335]: I0325 01:17:00.447615 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea7c6eb3-5767-4410-a9c1-c5223020f64d-lib-modules\") pod \"kube-proxy-cqk67\" (UID: \"ea7c6eb3-5767-4410-a9c1-c5223020f64d\") " pod="kube-system/kube-proxy-cqk67" Mar 25 01:17:00.447776 kubelet[3335]: I0325 01:17:00.447688 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/ea7c6eb3-5767-4410-a9c1-c5223020f64d-kube-proxy\") pod \"kube-proxy-cqk67\" (UID: \"ea7c6eb3-5767-4410-a9c1-c5223020f64d\") " pod="kube-system/kube-proxy-cqk67" Mar 25 01:17:00.447776 kubelet[3335]: I0325 01:17:00.447730 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ea7c6eb3-5767-4410-a9c1-c5223020f64d-xtables-lock\") pod \"kube-proxy-cqk67\" (UID: \"ea7c6eb3-5767-4410-a9c1-c5223020f64d\") " pod="kube-system/kube-proxy-cqk67" Mar 25 01:17:00.447776 kubelet[3335]: I0325 01:17:00.447767 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrgwd\" (UniqueName: \"kubernetes.io/projected/ea7c6eb3-5767-4410-a9c1-c5223020f64d-kube-api-access-jrgwd\") pod \"kube-proxy-cqk67\" (UID: \"ea7c6eb3-5767-4410-a9c1-c5223020f64d\") " pod="kube-system/kube-proxy-cqk67" Mar 25 01:17:00.727699 kubelet[3335]: I0325 01:17:00.727623 3335 topology_manager.go:215] "Topology Admit Handler" podUID="9ece08dd-9791-48fc-9ce7-a0af0209323e" podNamespace="tigera-operator" podName="tigera-operator-6479d6dc54-6rjl6" Mar 25 01:17:00.736558 containerd[1968]: time="2025-03-25T01:17:00.736428098Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-cqk67,Uid:ea7c6eb3-5767-4410-a9c1-c5223020f64d,Namespace:kube-system,Attempt:0,}" Mar 25 01:17:00.749784 systemd[1]: Created slice kubepods-besteffort-pod9ece08dd_9791_48fc_9ce7_a0af0209323e.slice - libcontainer container kubepods-besteffort-pod9ece08dd_9791_48fc_9ce7_a0af0209323e.slice. 
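The four VerifyControllerAttachedVolume entries for kube-proxy-cqk67 map one-to-one onto the volumes section of the kube-proxy pod spec: the kube-proxy ConfigMap, the xtables-lock and lib-modules hostPaths, and the projected service-account token kube-api-access-jrgwd. A sketch of that section, consistent with the logged volume names; the hostPath locations and token settings are the conventional ones and are assumed rather than read from this cluster:

    # volumes: section of the kube-proxy pod spec implied by the reconciler entries above (sketch)
    volumes:
    - name: kube-proxy
      configMap:
        name: kube-proxy                # kubernetes.io/configmap volume in the log
    - name: xtables-lock
      hostPath:
        path: /run/xtables.lock         # conventional path; assumption
        type: FileOrCreate
    - name: lib-modules
      hostPath:
        path: /lib/modules              # conventional path; assumption
    - name: kube-api-access-jrgwd       # projected service-account token named in the log
      projected:
        sources:
        - serviceAccountToken:
            expirationSeconds: 3607     # typical default; assumption
            path: token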
Mar 25 01:17:00.751821 kubelet[3335]: I0325 01:17:00.750063 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t98nh\" (UniqueName: \"kubernetes.io/projected/9ece08dd-9791-48fc-9ce7-a0af0209323e-kube-api-access-t98nh\") pod \"tigera-operator-6479d6dc54-6rjl6\" (UID: \"9ece08dd-9791-48fc-9ce7-a0af0209323e\") " pod="tigera-operator/tigera-operator-6479d6dc54-6rjl6" Mar 25 01:17:00.751821 kubelet[3335]: I0325 01:17:00.750135 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9ece08dd-9791-48fc-9ce7-a0af0209323e-var-lib-calico\") pod \"tigera-operator-6479d6dc54-6rjl6\" (UID: \"9ece08dd-9791-48fc-9ce7-a0af0209323e\") " pod="tigera-operator/tigera-operator-6479d6dc54-6rjl6" Mar 25 01:17:00.791843 containerd[1968]: time="2025-03-25T01:17:00.791187521Z" level=info msg="connecting to shim 814de34b8c91bf71614e6b63806a6930dc6f72edb4d8edaa6575b8295c0fef1f" address="unix:///run/containerd/s/a1267fbc100dadfc8d297c83a7e4061995e90f133afbb79ce9471e3e8326d4d8" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:17:00.839548 systemd[1]: Started cri-containerd-814de34b8c91bf71614e6b63806a6930dc6f72edb4d8edaa6575b8295c0fef1f.scope - libcontainer container 814de34b8c91bf71614e6b63806a6930dc6f72edb4d8edaa6575b8295c0fef1f. Mar 25 01:17:00.901729 containerd[1968]: time="2025-03-25T01:17:00.901531158Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-cqk67,Uid:ea7c6eb3-5767-4410-a9c1-c5223020f64d,Namespace:kube-system,Attempt:0,} returns sandbox id \"814de34b8c91bf71614e6b63806a6930dc6f72edb4d8edaa6575b8295c0fef1f\"" Mar 25 01:17:00.911539 containerd[1968]: time="2025-03-25T01:17:00.911438657Z" level=info msg="CreateContainer within sandbox \"814de34b8c91bf71614e6b63806a6930dc6f72edb4d8edaa6575b8295c0fef1f\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 25 01:17:00.934366 containerd[1968]: time="2025-03-25T01:17:00.934280199Z" level=info msg="Container 814cb5db0da4df0e1a0ce2bf39aac34b50f14149969320a1c44a6a0d46c75a41: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:17:00.954553 containerd[1968]: time="2025-03-25T01:17:00.954478347Z" level=info msg="CreateContainer within sandbox \"814de34b8c91bf71614e6b63806a6930dc6f72edb4d8edaa6575b8295c0fef1f\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"814cb5db0da4df0e1a0ce2bf39aac34b50f14149969320a1c44a6a0d46c75a41\"" Mar 25 01:17:00.955890 containerd[1968]: time="2025-03-25T01:17:00.955827876Z" level=info msg="StartContainer for \"814cb5db0da4df0e1a0ce2bf39aac34b50f14149969320a1c44a6a0d46c75a41\"" Mar 25 01:17:00.959705 containerd[1968]: time="2025-03-25T01:17:00.959482040Z" level=info msg="connecting to shim 814cb5db0da4df0e1a0ce2bf39aac34b50f14149969320a1c44a6a0d46c75a41" address="unix:///run/containerd/s/a1267fbc100dadfc8d297c83a7e4061995e90f133afbb79ce9471e3e8326d4d8" protocol=ttrpc version=3 Mar 25 01:17:00.993538 systemd[1]: Started cri-containerd-814cb5db0da4df0e1a0ce2bf39aac34b50f14149969320a1c44a6a0d46c75a41.scope - libcontainer container 814cb5db0da4df0e1a0ce2bf39aac34b50f14149969320a1c44a6a0d46c75a41. 
Mar 25 01:17:01.065258 containerd[1968]: time="2025-03-25T01:17:01.063844533Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-6rjl6,Uid:9ece08dd-9791-48fc-9ce7-a0af0209323e,Namespace:tigera-operator,Attempt:0,}" Mar 25 01:17:01.079927 containerd[1968]: time="2025-03-25T01:17:01.079852181Z" level=info msg="StartContainer for \"814cb5db0da4df0e1a0ce2bf39aac34b50f14149969320a1c44a6a0d46c75a41\" returns successfully" Mar 25 01:17:01.110794 containerd[1968]: time="2025-03-25T01:17:01.110684233Z" level=info msg="connecting to shim 693d29c1cd15ca491e3b2af2ebd42fae1d50e0107f186496f73a1d0f616baf6d" address="unix:///run/containerd/s/76ecd458d15578f8f4c19d37b27293806a43ac28d75ae7d843fb201245eabb95" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:17:01.166983 systemd[1]: Started cri-containerd-693d29c1cd15ca491e3b2af2ebd42fae1d50e0107f186496f73a1d0f616baf6d.scope - libcontainer container 693d29c1cd15ca491e3b2af2ebd42fae1d50e0107f186496f73a1d0f616baf6d. Mar 25 01:17:01.306380 containerd[1968]: time="2025-03-25T01:17:01.303969055Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-6rjl6,Uid:9ece08dd-9791-48fc-9ce7-a0af0209323e,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"693d29c1cd15ca491e3b2af2ebd42fae1d50e0107f186496f73a1d0f616baf6d\"" Mar 25 01:17:01.314646 containerd[1968]: time="2025-03-25T01:17:01.314027428Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 25 01:17:05.446598 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3566008117.mount: Deactivated successfully. Mar 25 01:17:06.042539 containerd[1968]: time="2025-03-25T01:17:06.042475315Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:06.043944 containerd[1968]: time="2025-03-25T01:17:06.043850295Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=19271115" Mar 25 01:17:06.045127 containerd[1968]: time="2025-03-25T01:17:06.045055057Z" level=info msg="ImageCreate event name:\"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:06.048443 containerd[1968]: time="2025-03-25T01:17:06.048397628Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:06.050341 containerd[1968]: time="2025-03-25T01:17:06.049854564Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"19267110\" in 4.735197883s" Mar 25 01:17:06.050341 containerd[1968]: time="2025-03-25T01:17:06.049901808Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\"" Mar 25 01:17:06.059003 containerd[1968]: time="2025-03-25T01:17:06.058599173Z" level=info msg="CreateContainer within sandbox \"693d29c1cd15ca491e3b2af2ebd42fae1d50e0107f186496f73a1d0f616baf6d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 25 01:17:06.075311 containerd[1968]: time="2025-03-25T01:17:06.075257796Z" 
level=info msg="Container 420b7e32f343919403df4e8e16c81463f5e4f779f936ed051ca36168a6b9ca6b: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:17:06.079748 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1926014713.mount: Deactivated successfully. Mar 25 01:17:06.096439 containerd[1968]: time="2025-03-25T01:17:06.096386882Z" level=info msg="CreateContainer within sandbox \"693d29c1cd15ca491e3b2af2ebd42fae1d50e0107f186496f73a1d0f616baf6d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"420b7e32f343919403df4e8e16c81463f5e4f779f936ed051ca36168a6b9ca6b\"" Mar 25 01:17:06.097832 containerd[1968]: time="2025-03-25T01:17:06.097782036Z" level=info msg="StartContainer for \"420b7e32f343919403df4e8e16c81463f5e4f779f936ed051ca36168a6b9ca6b\"" Mar 25 01:17:06.100330 containerd[1968]: time="2025-03-25T01:17:06.100278323Z" level=info msg="connecting to shim 420b7e32f343919403df4e8e16c81463f5e4f779f936ed051ca36168a6b9ca6b" address="unix:///run/containerd/s/76ecd458d15578f8f4c19d37b27293806a43ac28d75ae7d843fb201245eabb95" protocol=ttrpc version=3 Mar 25 01:17:06.140536 systemd[1]: Started cri-containerd-420b7e32f343919403df4e8e16c81463f5e4f779f936ed051ca36168a6b9ca6b.scope - libcontainer container 420b7e32f343919403df4e8e16c81463f5e4f779f936ed051ca36168a6b9ca6b. Mar 25 01:17:06.202103 containerd[1968]: time="2025-03-25T01:17:06.202054022Z" level=info msg="StartContainer for \"420b7e32f343919403df4e8e16c81463f5e4f779f936ed051ca36168a6b9ca6b\" returns successfully" Mar 25 01:17:07.220970 kubelet[3335]: I0325 01:17:07.220323 3335 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-cqk67" podStartSLOduration=7.220300847 podStartE2EDuration="7.220300847s" podCreationTimestamp="2025-03-25 01:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:17:01.204500354 +0000 UTC m=+16.385931018" watchObservedRunningTime="2025-03-25 01:17:07.220300847 +0000 UTC m=+22.401731511" Mar 25 01:17:07.220970 kubelet[3335]: I0325 01:17:07.220532 3335 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6479d6dc54-6rjl6" podStartSLOduration=2.480434959 podStartE2EDuration="7.220520553s" podCreationTimestamp="2025-03-25 01:17:00 +0000 UTC" firstStartedPulling="2025-03-25 01:17:01.312440729 +0000 UTC m=+16.493871393" lastFinishedPulling="2025-03-25 01:17:06.052526335 +0000 UTC m=+21.233956987" observedRunningTime="2025-03-25 01:17:07.220271186 +0000 UTC m=+22.401701886" watchObservedRunningTime="2025-03-25 01:17:07.220520553 +0000 UTC m=+22.401951241" Mar 25 01:17:12.083663 kubelet[3335]: I0325 01:17:12.083599 3335 topology_manager.go:215] "Topology Admit Handler" podUID="ad101308-e1b4-412b-b285-dc6185f35bf4" podNamespace="calico-system" podName="calico-typha-68bdb8667f-pvjmf" Mar 25 01:17:12.108620 systemd[1]: Created slice kubepods-besteffort-podad101308_e1b4_412b_b285_dc6185f35bf4.slice - libcontainer container kubepods-besteffort-podad101308_e1b4_412b_b285_dc6185f35bf4.slice. 
Mar 25 01:17:12.125408 kubelet[3335]: I0325 01:17:12.125318 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4zsz\" (UniqueName: \"kubernetes.io/projected/ad101308-e1b4-412b-b285-dc6185f35bf4-kube-api-access-b4zsz\") pod \"calico-typha-68bdb8667f-pvjmf\" (UID: \"ad101308-e1b4-412b-b285-dc6185f35bf4\") " pod="calico-system/calico-typha-68bdb8667f-pvjmf" Mar 25 01:17:12.125408 kubelet[3335]: I0325 01:17:12.125399 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad101308-e1b4-412b-b285-dc6185f35bf4-tigera-ca-bundle\") pod \"calico-typha-68bdb8667f-pvjmf\" (UID: \"ad101308-e1b4-412b-b285-dc6185f35bf4\") " pod="calico-system/calico-typha-68bdb8667f-pvjmf" Mar 25 01:17:12.125694 kubelet[3335]: I0325 01:17:12.125441 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/ad101308-e1b4-412b-b285-dc6185f35bf4-typha-certs\") pod \"calico-typha-68bdb8667f-pvjmf\" (UID: \"ad101308-e1b4-412b-b285-dc6185f35bf4\") " pod="calico-system/calico-typha-68bdb8667f-pvjmf" Mar 25 01:17:12.327259 kubelet[3335]: I0325 01:17:12.327174 3335 topology_manager.go:215] "Topology Admit Handler" podUID="3ba8e38e-1fc4-47b0-af6a-4170645f2206" podNamespace="calico-system" podName="calico-node-z69gs" Mar 25 01:17:12.344140 systemd[1]: Created slice kubepods-besteffort-pod3ba8e38e_1fc4_47b0_af6a_4170645f2206.slice - libcontainer container kubepods-besteffort-pod3ba8e38e_1fc4_47b0_af6a_4170645f2206.slice. Mar 25 01:17:12.418539 containerd[1968]: time="2025-03-25T01:17:12.418457236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-68bdb8667f-pvjmf,Uid:ad101308-e1b4-412b-b285-dc6185f35bf4,Namespace:calico-system,Attempt:0,}" Mar 25 01:17:12.428075 kubelet[3335]: I0325 01:17:12.427818 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/3ba8e38e-1fc4-47b0-af6a-4170645f2206-node-certs\") pod \"calico-node-z69gs\" (UID: \"3ba8e38e-1fc4-47b0-af6a-4170645f2206\") " pod="calico-system/calico-node-z69gs" Mar 25 01:17:12.428075 kubelet[3335]: I0325 01:17:12.427890 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/3ba8e38e-1fc4-47b0-af6a-4170645f2206-cni-bin-dir\") pod \"calico-node-z69gs\" (UID: \"3ba8e38e-1fc4-47b0-af6a-4170645f2206\") " pod="calico-system/calico-node-z69gs" Mar 25 01:17:12.428075 kubelet[3335]: I0325 01:17:12.427931 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3ba8e38e-1fc4-47b0-af6a-4170645f2206-xtables-lock\") pod \"calico-node-z69gs\" (UID: \"3ba8e38e-1fc4-47b0-af6a-4170645f2206\") " pod="calico-system/calico-node-z69gs" Mar 25 01:17:12.428075 kubelet[3335]: I0325 01:17:12.427966 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/3ba8e38e-1fc4-47b0-af6a-4170645f2206-cni-net-dir\") pod \"calico-node-z69gs\" (UID: \"3ba8e38e-1fc4-47b0-af6a-4170645f2206\") " pod="calico-system/calico-node-z69gs" Mar 25 01:17:12.429197 kubelet[3335]: I0325 01:17:12.428935 3335 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3ba8e38e-1fc4-47b0-af6a-4170645f2206-lib-modules\") pod \"calico-node-z69gs\" (UID: \"3ba8e38e-1fc4-47b0-af6a-4170645f2206\") " pod="calico-system/calico-node-z69gs" Mar 25 01:17:12.431586 kubelet[3335]: I0325 01:17:12.429310 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/3ba8e38e-1fc4-47b0-af6a-4170645f2206-cni-log-dir\") pod \"calico-node-z69gs\" (UID: \"3ba8e38e-1fc4-47b0-af6a-4170645f2206\") " pod="calico-system/calico-node-z69gs" Mar 25 01:17:12.431586 kubelet[3335]: I0325 01:17:12.429465 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/3ba8e38e-1fc4-47b0-af6a-4170645f2206-policysync\") pod \"calico-node-z69gs\" (UID: \"3ba8e38e-1fc4-47b0-af6a-4170645f2206\") " pod="calico-system/calico-node-z69gs" Mar 25 01:17:12.431586 kubelet[3335]: I0325 01:17:12.429556 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/3ba8e38e-1fc4-47b0-af6a-4170645f2206-var-run-calico\") pod \"calico-node-z69gs\" (UID: \"3ba8e38e-1fc4-47b0-af6a-4170645f2206\") " pod="calico-system/calico-node-z69gs" Mar 25 01:17:12.431586 kubelet[3335]: I0325 01:17:12.429739 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/3ba8e38e-1fc4-47b0-af6a-4170645f2206-flexvol-driver-host\") pod \"calico-node-z69gs\" (UID: \"3ba8e38e-1fc4-47b0-af6a-4170645f2206\") " pod="calico-system/calico-node-z69gs" Mar 25 01:17:12.431586 kubelet[3335]: I0325 01:17:12.429809 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjf8d\" (UniqueName: \"kubernetes.io/projected/3ba8e38e-1fc4-47b0-af6a-4170645f2206-kube-api-access-zjf8d\") pod \"calico-node-z69gs\" (UID: \"3ba8e38e-1fc4-47b0-af6a-4170645f2206\") " pod="calico-system/calico-node-z69gs" Mar 25 01:17:12.431892 kubelet[3335]: I0325 01:17:12.429862 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ba8e38e-1fc4-47b0-af6a-4170645f2206-tigera-ca-bundle\") pod \"calico-node-z69gs\" (UID: \"3ba8e38e-1fc4-47b0-af6a-4170645f2206\") " pod="calico-system/calico-node-z69gs" Mar 25 01:17:12.431892 kubelet[3335]: I0325 01:17:12.429901 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3ba8e38e-1fc4-47b0-af6a-4170645f2206-var-lib-calico\") pod \"calico-node-z69gs\" (UID: \"3ba8e38e-1fc4-47b0-af6a-4170645f2206\") " pod="calico-system/calico-node-z69gs" Mar 25 01:17:12.467559 containerd[1968]: time="2025-03-25T01:17:12.467266855Z" level=info msg="connecting to shim cb29f4832c965db67618c57acbeb6fee37035c04e0df8cc1f19eea814042bfab" address="unix:///run/containerd/s/601c602584eb6cd8ec870bc2a52d2e102aadc77dab70d66d8f13211717fe1bbc" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:17:12.541926 kubelet[3335]: E0325 01:17:12.541870 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.541926 kubelet[3335]: 
W0325 01:17:12.541917 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.542280 kubelet[3335]: E0325 01:17:12.541964 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.542861 systemd[1]: Started cri-containerd-cb29f4832c965db67618c57acbeb6fee37035c04e0df8cc1f19eea814042bfab.scope - libcontainer container cb29f4832c965db67618c57acbeb6fee37035c04e0df8cc1f19eea814042bfab. Mar 25 01:17:12.545525 kubelet[3335]: E0325 01:17:12.545470 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.545525 kubelet[3335]: W0325 01:17:12.545509 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.545722 kubelet[3335]: E0325 01:17:12.545544 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.552746 kubelet[3335]: E0325 01:17:12.549306 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.552746 kubelet[3335]: W0325 01:17:12.549855 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.552746 kubelet[3335]: E0325 01:17:12.549923 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.552746 kubelet[3335]: E0325 01:17:12.551718 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.552746 kubelet[3335]: W0325 01:17:12.551746 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.552746 kubelet[3335]: E0325 01:17:12.551860 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.560012 kubelet[3335]: E0325 01:17:12.557007 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.560012 kubelet[3335]: W0325 01:17:12.557047 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.560012 kubelet[3335]: E0325 01:17:12.557081 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:12.592481 kubelet[3335]: I0325 01:17:12.592398 3335 topology_manager.go:215] "Topology Admit Handler" podUID="292343f5-324d-4913-9d89-7e7c3530b2f1" podNamespace="calico-system" podName="csi-node-driver-rcpm6" Mar 25 01:17:12.593093 kubelet[3335]: E0325 01:17:12.592857 3335 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rcpm6" podUID="292343f5-324d-4913-9d89-7e7c3530b2f1" Mar 25 01:17:12.606045 kubelet[3335]: E0325 01:17:12.605882 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.606045 kubelet[3335]: W0325 01:17:12.605930 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.606045 kubelet[3335]: E0325 01:17:12.605969 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.611379 kubelet[3335]: E0325 01:17:12.611321 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.611379 kubelet[3335]: W0325 01:17:12.611361 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.611780 kubelet[3335]: E0325 01:17:12.611395 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.611871 kubelet[3335]: E0325 01:17:12.611844 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.611871 kubelet[3335]: W0325 01:17:12.611864 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.611979 kubelet[3335]: E0325 01:17:12.611887 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.613262 kubelet[3335]: E0325 01:17:12.612440 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.613262 kubelet[3335]: W0325 01:17:12.612474 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.613262 kubelet[3335]: E0325 01:17:12.612535 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:12.613262 kubelet[3335]: E0325 01:17:12.613027 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.613262 kubelet[3335]: W0325 01:17:12.613046 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.613262 kubelet[3335]: E0325 01:17:12.613068 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.613654 kubelet[3335]: E0325 01:17:12.613552 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.613654 kubelet[3335]: W0325 01:17:12.613600 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.613654 kubelet[3335]: E0325 01:17:12.613622 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.616110 kubelet[3335]: E0325 01:17:12.614200 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.616110 kubelet[3335]: W0325 01:17:12.615193 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.616110 kubelet[3335]: E0325 01:17:12.615318 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.616110 kubelet[3335]: E0325 01:17:12.615867 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.616110 kubelet[3335]: W0325 01:17:12.615890 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.616110 kubelet[3335]: E0325 01:17:12.615945 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.617691 kubelet[3335]: E0325 01:17:12.617612 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.617691 kubelet[3335]: W0325 01:17:12.617678 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.617891 kubelet[3335]: E0325 01:17:12.617711 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:12.618855 kubelet[3335]: E0325 01:17:12.618316 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.618855 kubelet[3335]: W0325 01:17:12.618377 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.618855 kubelet[3335]: E0325 01:17:12.618407 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.619129 kubelet[3335]: E0325 01:17:12.619109 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.619183 kubelet[3335]: W0325 01:17:12.619134 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.619183 kubelet[3335]: E0325 01:17:12.619162 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.620332 kubelet[3335]: E0325 01:17:12.619629 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.620332 kubelet[3335]: W0325 01:17:12.619664 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.620332 kubelet[3335]: E0325 01:17:12.619693 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.620332 kubelet[3335]: E0325 01:17:12.620087 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.620332 kubelet[3335]: W0325 01:17:12.620106 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.620332 kubelet[3335]: E0325 01:17:12.620128 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.620722 kubelet[3335]: E0325 01:17:12.620612 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.620722 kubelet[3335]: W0325 01:17:12.620631 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.620722 kubelet[3335]: E0325 01:17:12.620654 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:12.621246 kubelet[3335]: E0325 01:17:12.620993 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.621246 kubelet[3335]: W0325 01:17:12.621032 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.621246 kubelet[3335]: E0325 01:17:12.621057 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.622087 kubelet[3335]: E0325 01:17:12.622047 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.622189 kubelet[3335]: W0325 01:17:12.622074 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.622189 kubelet[3335]: E0325 01:17:12.622127 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.622572 kubelet[3335]: E0325 01:17:12.622556 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.622627 kubelet[3335]: W0325 01:17:12.622576 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.622627 kubelet[3335]: E0325 01:17:12.622601 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.623427 kubelet[3335]: E0325 01:17:12.622989 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.623427 kubelet[3335]: W0325 01:17:12.623026 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.623427 kubelet[3335]: E0325 01:17:12.623051 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.624420 kubelet[3335]: E0325 01:17:12.623475 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.624420 kubelet[3335]: W0325 01:17:12.623496 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.624420 kubelet[3335]: E0325 01:17:12.623520 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:12.624420 kubelet[3335]: E0325 01:17:12.623958 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.624420 kubelet[3335]: W0325 01:17:12.623979 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.624420 kubelet[3335]: E0325 01:17:12.624002 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.624420 kubelet[3335]: E0325 01:17:12.624343 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.624420 kubelet[3335]: W0325 01:17:12.624361 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.624420 kubelet[3335]: E0325 01:17:12.624381 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.634164 kubelet[3335]: E0325 01:17:12.633924 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.634164 kubelet[3335]: W0325 01:17:12.633959 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.634164 kubelet[3335]: E0325 01:17:12.633998 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.634164 kubelet[3335]: I0325 01:17:12.634037 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/292343f5-324d-4913-9d89-7e7c3530b2f1-kubelet-dir\") pod \"csi-node-driver-rcpm6\" (UID: \"292343f5-324d-4913-9d89-7e7c3530b2f1\") " pod="calico-system/csi-node-driver-rcpm6" Mar 25 01:17:12.634903 kubelet[3335]: E0325 01:17:12.634451 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.634903 kubelet[3335]: W0325 01:17:12.634473 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.634903 kubelet[3335]: E0325 01:17:12.634513 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:12.634903 kubelet[3335]: I0325 01:17:12.634550 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc5c6\" (UniqueName: \"kubernetes.io/projected/292343f5-324d-4913-9d89-7e7c3530b2f1-kube-api-access-rc5c6\") pod \"csi-node-driver-rcpm6\" (UID: \"292343f5-324d-4913-9d89-7e7c3530b2f1\") " pod="calico-system/csi-node-driver-rcpm6" Mar 25 01:17:12.635494 kubelet[3335]: E0325 01:17:12.634971 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.635494 kubelet[3335]: W0325 01:17:12.634988 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.635494 kubelet[3335]: E0325 01:17:12.635024 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.636277 kubelet[3335]: E0325 01:17:12.635958 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.636277 kubelet[3335]: W0325 01:17:12.635994 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.636277 kubelet[3335]: E0325 01:17:12.636049 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.636529 kubelet[3335]: E0325 01:17:12.636502 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.636529 kubelet[3335]: W0325 01:17:12.636522 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.636623 kubelet[3335]: E0325 01:17:12.636555 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.636623 kubelet[3335]: I0325 01:17:12.636594 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/292343f5-324d-4913-9d89-7e7c3530b2f1-varrun\") pod \"csi-node-driver-rcpm6\" (UID: \"292343f5-324d-4913-9d89-7e7c3530b2f1\") " pod="calico-system/csi-node-driver-rcpm6" Mar 25 01:17:12.639285 kubelet[3335]: E0325 01:17:12.638370 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.639285 kubelet[3335]: W0325 01:17:12.638411 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.639285 kubelet[3335]: E0325 01:17:12.638627 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:12.639285 kubelet[3335]: I0325 01:17:12.638674 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/292343f5-324d-4913-9d89-7e7c3530b2f1-socket-dir\") pod \"csi-node-driver-rcpm6\" (UID: \"292343f5-324d-4913-9d89-7e7c3530b2f1\") " pod="calico-system/csi-node-driver-rcpm6" Mar 25 01:17:12.639285 kubelet[3335]: E0325 01:17:12.638941 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.639285 kubelet[3335]: W0325 01:17:12.638957 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.639285 kubelet[3335]: E0325 01:17:12.639106 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.639792 kubelet[3335]: E0325 01:17:12.639451 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.639792 kubelet[3335]: W0325 01:17:12.639472 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.640487 kubelet[3335]: E0325 01:17:12.640439 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.640795 kubelet[3335]: E0325 01:17:12.640751 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.640795 kubelet[3335]: W0325 01:17:12.640784 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.641166 kubelet[3335]: E0325 01:17:12.640825 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.641166 kubelet[3335]: I0325 01:17:12.640868 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/292343f5-324d-4913-9d89-7e7c3530b2f1-registration-dir\") pod \"csi-node-driver-rcpm6\" (UID: \"292343f5-324d-4913-9d89-7e7c3530b2f1\") " pod="calico-system/csi-node-driver-rcpm6" Mar 25 01:17:12.642276 kubelet[3335]: E0325 01:17:12.641848 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.642276 kubelet[3335]: W0325 01:17:12.641880 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.642276 kubelet[3335]: E0325 01:17:12.641913 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:12.643888 kubelet[3335]: E0325 01:17:12.643579 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.643888 kubelet[3335]: W0325 01:17:12.643614 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.643888 kubelet[3335]: E0325 01:17:12.643647 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.644699 kubelet[3335]: E0325 01:17:12.644511 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.645457 kubelet[3335]: W0325 01:17:12.645412 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.646022 kubelet[3335]: E0325 01:17:12.645653 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.647427 kubelet[3335]: E0325 01:17:12.647389 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.647880 kubelet[3335]: W0325 01:17:12.647613 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.647880 kubelet[3335]: E0325 01:17:12.647654 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.648523 kubelet[3335]: E0325 01:17:12.648375 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.649329 kubelet[3335]: W0325 01:17:12.649283 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.649507 kubelet[3335]: E0325 01:17:12.649483 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.650420 kubelet[3335]: E0325 01:17:12.650279 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.650902 kubelet[3335]: W0325 01:17:12.650665 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.650902 kubelet[3335]: E0325 01:17:12.650713 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:12.653442 containerd[1968]: time="2025-03-25T01:17:12.653195949Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-z69gs,Uid:3ba8e38e-1fc4-47b0-af6a-4170645f2206,Namespace:calico-system,Attempt:0,}" Mar 25 01:17:12.714493 containerd[1968]: time="2025-03-25T01:17:12.714324509Z" level=info msg="connecting to shim 9858a009795816469fbf779e9de43b6286d0a5f5061567309a6e95cdd32bc12b" address="unix:///run/containerd/s/53ba7a4ddaaacb4726b3bdbb39274c1d125b5bbf8783f466c0a4b7314b24c1bc" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:17:12.752382 kubelet[3335]: E0325 01:17:12.752173 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.752382 kubelet[3335]: W0325 01:17:12.752211 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.752382 kubelet[3335]: E0325 01:17:12.752277 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.755889 kubelet[3335]: E0325 01:17:12.755637 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.755889 kubelet[3335]: W0325 01:17:12.755676 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.755889 kubelet[3335]: E0325 01:17:12.755719 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.758374 kubelet[3335]: E0325 01:17:12.758316 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.758642 kubelet[3335]: W0325 01:17:12.758584 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.758780 kubelet[3335]: E0325 01:17:12.758734 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.760666 kubelet[3335]: E0325 01:17:12.760617 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.760666 kubelet[3335]: W0325 01:17:12.760655 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.760985 kubelet[3335]: E0325 01:17:12.760941 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:12.762909 kubelet[3335]: E0325 01:17:12.762843 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.762909 kubelet[3335]: W0325 01:17:12.762895 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.763615 kubelet[3335]: E0325 01:17:12.763198 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.765420 kubelet[3335]: E0325 01:17:12.765372 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.765420 kubelet[3335]: W0325 01:17:12.765415 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.765849 kubelet[3335]: E0325 01:17:12.765504 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.766204 kubelet[3335]: E0325 01:17:12.766007 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.766204 kubelet[3335]: W0325 01:17:12.766036 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.766204 kubelet[3335]: E0325 01:17:12.766115 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.767791 kubelet[3335]: E0325 01:17:12.767736 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.767791 kubelet[3335]: W0325 01:17:12.767772 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.767791 kubelet[3335]: E0325 01:17:12.767841 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.770319 kubelet[3335]: E0325 01:17:12.769604 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.770319 kubelet[3335]: W0325 01:17:12.769640 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.770710 kubelet[3335]: E0325 01:17:12.770521 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:12.770710 kubelet[3335]: E0325 01:17:12.770584 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.770710 kubelet[3335]: W0325 01:17:12.770605 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.771991 kubelet[3335]: E0325 01:17:12.771307 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.772464 kubelet[3335]: E0325 01:17:12.772417 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.772464 kubelet[3335]: W0325 01:17:12.772454 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.772844 kubelet[3335]: E0325 01:17:12.772618 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.775659 kubelet[3335]: E0325 01:17:12.775596 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.775659 kubelet[3335]: W0325 01:17:12.775633 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.778500 kubelet[3335]: E0325 01:17:12.776713 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.778500 kubelet[3335]: W0325 01:17:12.776741 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.778500 kubelet[3335]: E0325 01:17:12.777923 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.778500 kubelet[3335]: W0325 01:17:12.777948 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.779190 kubelet[3335]: E0325 01:17:12.779005 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.779190 kubelet[3335]: E0325 01:17:12.779059 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.779190 kubelet[3335]: E0325 01:17:12.779106 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:12.779417 kubelet[3335]: E0325 01:17:12.779367 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.779417 kubelet[3335]: W0325 01:17:12.779386 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.781399 kubelet[3335]: E0325 01:17:12.781347 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.781399 kubelet[3335]: W0325 01:17:12.781385 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.782583 systemd[1]: Started cri-containerd-9858a009795816469fbf779e9de43b6286d0a5f5061567309a6e95cdd32bc12b.scope - libcontainer container 9858a009795816469fbf779e9de43b6286d0a5f5061567309a6e95cdd32bc12b. Mar 25 01:17:12.784401 kubelet[3335]: E0325 01:17:12.784338 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.784401 kubelet[3335]: W0325 01:17:12.784385 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.786815 kubelet[3335]: E0325 01:17:12.786764 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.786815 kubelet[3335]: W0325 01:17:12.786802 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.788215 kubelet[3335]: E0325 01:17:12.788166 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.788215 kubelet[3335]: W0325 01:17:12.788204 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.788215 kubelet[3335]: E0325 01:17:12.788276 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.788623 kubelet[3335]: E0325 01:17:12.788341 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.789867 kubelet[3335]: E0325 01:17:12.789633 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.789867 kubelet[3335]: E0325 01:17:12.789771 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:12.789867 kubelet[3335]: E0325 01:17:12.789798 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.790341 kubelet[3335]: E0325 01:17:12.789933 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.790341 kubelet[3335]: W0325 01:17:12.789953 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.791584 kubelet[3335]: E0325 01:17:12.791298 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.791929 kubelet[3335]: E0325 01:17:12.791856 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.791929 kubelet[3335]: W0325 01:17:12.791880 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.791929 kubelet[3335]: E0325 01:17:12.791915 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.792521 kubelet[3335]: E0325 01:17:12.792478 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.792521 kubelet[3335]: W0325 01:17:12.792509 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.792758 kubelet[3335]: E0325 01:17:12.792548 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.796160 kubelet[3335]: E0325 01:17:12.796088 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.796160 kubelet[3335]: W0325 01:17:12.796124 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.797446 kubelet[3335]: E0325 01:17:12.797260 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:12.799654 kubelet[3335]: E0325 01:17:12.799605 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.799654 kubelet[3335]: W0325 01:17:12.799641 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.800070 kubelet[3335]: E0325 01:17:12.799676 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.802871 kubelet[3335]: E0325 01:17:12.802822 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.802871 kubelet[3335]: W0325 01:17:12.802858 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.803088 kubelet[3335]: E0325 01:17:12.802891 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:12.835016 kubelet[3335]: E0325 01:17:12.834885 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:12.835016 kubelet[3335]: W0325 01:17:12.834919 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:12.835016 kubelet[3335]: E0325 01:17:12.834948 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
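The driver-call failures repeated above all describe the same probe: kubelet execs /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the single argument init and tries to decode the driver's stdout as JSON. Because that executable is not present on this node, the call fails, stdout is empty, and the decode reports "unexpected end of JSON input". A minimal Go sketch of that call-and-decode step follows; the DriverStatus shape is a simplified assumption for illustration, not kubelet's exact type.

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// DriverStatus is a simplified stand-in for the JSON object a FlexVolume
// driver is expected to print in response to "init".
type DriverStatus struct {
	Status  string `json:"status"` // e.g. "Success"
	Message string `json:"message,omitempty"`
}

func main() {
	// Driver path copied from the log entries above.
	driver := "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"

	out, err := exec.Command(driver, "init").Output()
	if err != nil {
		// First half of each log pair: the driver executable is missing.
		fmt.Println("driver call failed:", err)
	}

	var st DriverStatus
	if err := json.Unmarshal(out, &st); err != nil {
		// Second half: decoding empty output yields "unexpected end of JSON input".
		fmt.Println("unmarshal failed:", err)
		return
	}
	fmt.Printf("driver init status: %+v\n", st)
}

Running this on a host without the driver reproduces both halves of the logged pair: a failed exec followed by the empty-output unmarshal error.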
Error: unexpected end of JSON input" Mar 25 01:17:12.894951 containerd[1968]: time="2025-03-25T01:17:12.894887259Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-68bdb8667f-pvjmf,Uid:ad101308-e1b4-412b-b285-dc6185f35bf4,Namespace:calico-system,Attempt:0,} returns sandbox id \"cb29f4832c965db67618c57acbeb6fee37035c04e0df8cc1f19eea814042bfab\"" Mar 25 01:17:12.908847 containerd[1968]: time="2025-03-25T01:17:12.908487292Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 25 01:17:12.928095 containerd[1968]: time="2025-03-25T01:17:12.927941236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-z69gs,Uid:3ba8e38e-1fc4-47b0-af6a-4170645f2206,Namespace:calico-system,Attempt:0,} returns sandbox id \"9858a009795816469fbf779e9de43b6286d0a5f5061567309a6e95cdd32bc12b\"" Mar 25 01:17:14.071493 kubelet[3335]: E0325 01:17:14.065475 3335 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rcpm6" podUID="292343f5-324d-4913-9d89-7e7c3530b2f1" Mar 25 01:17:15.005121 containerd[1968]: time="2025-03-25T01:17:15.004796148Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:15.006299 containerd[1968]: time="2025-03-25T01:17:15.006186588Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=28363957" Mar 25 01:17:15.007433 containerd[1968]: time="2025-03-25T01:17:15.007344753Z" level=info msg="ImageCreate event name:\"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:15.010952 containerd[1968]: time="2025-03-25T01:17:15.010835319Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:15.012477 containerd[1968]: time="2025-03-25T01:17:15.012168512Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"29733706\" in 2.103615661s" Mar 25 01:17:15.012477 containerd[1968]: time="2025-03-25T01:17:15.012297387Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\"" Mar 25 01:17:15.015351 containerd[1968]: time="2025-03-25T01:17:15.014986910Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 25 01:17:15.043177 containerd[1968]: time="2025-03-25T01:17:15.043090387Z" level=info msg="CreateContainer within sandbox \"cb29f4832c965db67618c57acbeb6fee37035c04e0df8cc1f19eea814042bfab\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 25 01:17:15.062944 containerd[1968]: time="2025-03-25T01:17:15.059591181Z" level=info msg="Container 745ed8c2c5f48a2808110b13206f6aa2c7c4740df34327d1a61d20bed11bc375: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:17:15.080058 containerd[1968]: 
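The typha image pull bracketed above starts at the PullImage entry (01:17:12.908) and completes at the Pulled entry (01:17:15.012); containerd reports the pull as taking 2.103615661s, which matches the gap between those two records to within a few milliseconds. A small sketch that recomputes the gap from the logged timestamps:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the PullImage and Pulled entries above.
	start, _ := time.Parse(time.RFC3339Nano, "2025-03-25T01:17:12.908487292Z")
	done, _ := time.Parse(time.RFC3339Nano, "2025-03-25T01:17:15.012168512Z")
	// Prints ~2.1037s, within a few milliseconds of the logged "in 2.103615661s".
	fmt.Println(done.Sub(start))
}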
time="2025-03-25T01:17:15.079947685Z" level=info msg="CreateContainer within sandbox \"cb29f4832c965db67618c57acbeb6fee37035c04e0df8cc1f19eea814042bfab\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"745ed8c2c5f48a2808110b13206f6aa2c7c4740df34327d1a61d20bed11bc375\"" Mar 25 01:17:15.082682 containerd[1968]: time="2025-03-25T01:17:15.080953706Z" level=info msg="StartContainer for \"745ed8c2c5f48a2808110b13206f6aa2c7c4740df34327d1a61d20bed11bc375\"" Mar 25 01:17:15.084863 containerd[1968]: time="2025-03-25T01:17:15.084730880Z" level=info msg="connecting to shim 745ed8c2c5f48a2808110b13206f6aa2c7c4740df34327d1a61d20bed11bc375" address="unix:///run/containerd/s/601c602584eb6cd8ec870bc2a52d2e102aadc77dab70d66d8f13211717fe1bbc" protocol=ttrpc version=3 Mar 25 01:17:15.134520 systemd[1]: Started cri-containerd-745ed8c2c5f48a2808110b13206f6aa2c7c4740df34327d1a61d20bed11bc375.scope - libcontainer container 745ed8c2c5f48a2808110b13206f6aa2c7c4740df34327d1a61d20bed11bc375. Mar 25 01:17:15.230951 containerd[1968]: time="2025-03-25T01:17:15.230845459Z" level=info msg="StartContainer for \"745ed8c2c5f48a2808110b13206f6aa2c7c4740df34327d1a61d20bed11bc375\" returns successfully" Mar 25 01:17:15.346146 kubelet[3335]: E0325 01:17:15.345989 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.346146 kubelet[3335]: W0325 01:17:15.346031 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.346146 kubelet[3335]: E0325 01:17:15.346098 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.349142 kubelet[3335]: E0325 01:17:15.347624 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.349142 kubelet[3335]: W0325 01:17:15.347651 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.349142 kubelet[3335]: E0325 01:17:15.347709 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.349142 kubelet[3335]: E0325 01:17:15.349058 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.349142 kubelet[3335]: W0325 01:17:15.349084 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.349142 kubelet[3335]: E0325 01:17:15.349113 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:15.350435 kubelet[3335]: E0325 01:17:15.350386 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.350567 kubelet[3335]: W0325 01:17:15.350469 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.350567 kubelet[3335]: E0325 01:17:15.350501 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.352474 kubelet[3335]: E0325 01:17:15.351952 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.352474 kubelet[3335]: W0325 01:17:15.351988 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.352474 kubelet[3335]: E0325 01:17:15.352022 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.352474 kubelet[3335]: E0325 01:17:15.353335 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.352474 kubelet[3335]: W0325 01:17:15.353364 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.352474 kubelet[3335]: E0325 01:17:15.353440 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.352474 kubelet[3335]: E0325 01:17:15.353995 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.352474 kubelet[3335]: W0325 01:17:15.354060 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.352474 kubelet[3335]: E0325 01:17:15.354168 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.357405 kubelet[3335]: E0325 01:17:15.354901 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.357405 kubelet[3335]: W0325 01:17:15.354961 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.357405 kubelet[3335]: E0325 01:17:15.354988 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:15.357405 kubelet[3335]: E0325 01:17:15.355480 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.357405 kubelet[3335]: W0325 01:17:15.355498 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.357405 kubelet[3335]: E0325 01:17:15.355542 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.357405 kubelet[3335]: E0325 01:17:15.355888 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.357405 kubelet[3335]: W0325 01:17:15.355906 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.357405 kubelet[3335]: E0325 01:17:15.355924 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.357405 kubelet[3335]: E0325 01:17:15.356299 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.357948 kubelet[3335]: W0325 01:17:15.356315 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.357948 kubelet[3335]: E0325 01:17:15.356334 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.357948 kubelet[3335]: E0325 01:17:15.356709 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.357948 kubelet[3335]: W0325 01:17:15.356727 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.357948 kubelet[3335]: E0325 01:17:15.356745 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.357948 kubelet[3335]: E0325 01:17:15.357088 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.357948 kubelet[3335]: W0325 01:17:15.357104 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.357948 kubelet[3335]: E0325 01:17:15.357123 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:15.357948 kubelet[3335]: E0325 01:17:15.357641 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.357948 kubelet[3335]: W0325 01:17:15.357662 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.360839 kubelet[3335]: E0325 01:17:15.357710 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.362135 kubelet[3335]: E0325 01:17:15.362084 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.362135 kubelet[3335]: W0325 01:17:15.362121 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.362338 kubelet[3335]: E0325 01:17:15.362154 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.393459 kubelet[3335]: E0325 01:17:15.393405 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.394346 kubelet[3335]: W0325 01:17:15.394281 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.394470 kubelet[3335]: E0325 01:17:15.394367 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.395211 kubelet[3335]: E0325 01:17:15.395090 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.395211 kubelet[3335]: W0325 01:17:15.395154 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.395211 kubelet[3335]: E0325 01:17:15.395200 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.396252 kubelet[3335]: E0325 01:17:15.395972 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.396252 kubelet[3335]: W0325 01:17:15.396009 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.396478 kubelet[3335]: E0325 01:17:15.396300 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:15.397640 kubelet[3335]: E0325 01:17:15.397589 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.397640 kubelet[3335]: W0325 01:17:15.397627 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.398055 kubelet[3335]: E0325 01:17:15.398013 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.398521 kubelet[3335]: E0325 01:17:15.398478 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.399754 kubelet[3335]: W0325 01:17:15.398511 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.399754 kubelet[3335]: E0325 01:17:15.399406 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.400377 kubelet[3335]: E0325 01:17:15.400323 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.400377 kubelet[3335]: W0325 01:17:15.400361 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.400672 kubelet[3335]: E0325 01:17:15.400602 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.402335 kubelet[3335]: E0325 01:17:15.401660 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.402335 kubelet[3335]: W0325 01:17:15.402327 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.403313 kubelet[3335]: E0325 01:17:15.402589 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.403746 kubelet[3335]: E0325 01:17:15.403700 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.403746 kubelet[3335]: W0325 01:17:15.403736 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.404798 kubelet[3335]: E0325 01:17:15.404332 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:15.405274 kubelet[3335]: E0325 01:17:15.404990 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.405274 kubelet[3335]: W0325 01:17:15.405025 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.405274 kubelet[3335]: E0325 01:17:15.405066 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.405677 kubelet[3335]: E0325 01:17:15.405634 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.405677 kubelet[3335]: W0325 01:17:15.405665 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.406406 kubelet[3335]: E0325 01:17:15.406334 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.406525 kubelet[3335]: E0325 01:17:15.406491 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.406631 kubelet[3335]: W0325 01:17:15.406519 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.407283 kubelet[3335]: E0325 01:17:15.406811 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.407283 kubelet[3335]: E0325 01:17:15.407276 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.407470 kubelet[3335]: W0325 01:17:15.407299 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.407854 kubelet[3335]: E0325 01:17:15.407818 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.409642 kubelet[3335]: E0325 01:17:15.409592 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.409642 kubelet[3335]: W0325 01:17:15.409629 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.409875 kubelet[3335]: E0325 01:17:15.409671 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:15.410157 kubelet[3335]: E0325 01:17:15.410121 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.410157 kubelet[3335]: W0325 01:17:15.410150 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.410309 kubelet[3335]: E0325 01:17:15.410188 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.410716 kubelet[3335]: E0325 01:17:15.410690 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.410956 kubelet[3335]: W0325 01:17:15.410841 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.410956 kubelet[3335]: E0325 01:17:15.410889 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.411362 kubelet[3335]: E0325 01:17:15.411321 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.411362 kubelet[3335]: W0325 01:17:15.411354 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.411502 kubelet[3335]: E0325 01:17:15.411393 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.412636 kubelet[3335]: E0325 01:17:15.412585 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.412636 kubelet[3335]: W0325 01:17:15.412637 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.412850 kubelet[3335]: E0325 01:17:15.412671 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:15.417494 kubelet[3335]: E0325 01:17:15.417411 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:15.417494 kubelet[3335]: W0325 01:17:15.417470 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:15.417826 kubelet[3335]: E0325 01:17:15.417504 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:16.064533 kubelet[3335]: E0325 01:17:16.064433 3335 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rcpm6" podUID="292343f5-324d-4913-9d89-7e7c3530b2f1" Mar 25 01:17:16.267705 kubelet[3335]: E0325 01:17:16.267588 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.267705 kubelet[3335]: W0325 01:17:16.267623 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.267705 kubelet[3335]: E0325 01:17:16.267654 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.268041 kubelet[3335]: E0325 01:17:16.268011 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.268164 kubelet[3335]: W0325 01:17:16.268039 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.268164 kubelet[3335]: E0325 01:17:16.268063 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.268400 kubelet[3335]: E0325 01:17:16.268377 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.268505 kubelet[3335]: W0325 01:17:16.268398 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.268505 kubelet[3335]: E0325 01:17:16.268418 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.268841 kubelet[3335]: E0325 01:17:16.268805 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.268841 kubelet[3335]: W0325 01:17:16.268831 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.269004 kubelet[3335]: E0325 01:17:16.268854 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
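The csi-node-driver-rcpm6 pod keeps being skipped with "cni plugin not initialized" because the container runtime has not yet loaded any CNI network configuration; on a Calico install this typically clears once calico-node's install steps write a conflist into the runtime's CNI config directory. The conventional location is /etc/cni/net.d, though the actual path is whatever containerd's configuration points at; a small illustrative check under that assumption:

package main

import (
	"fmt"
	"os"
)

func main() {
	// /etc/cni/net.d is the conventional CNI config directory; the path actually
	// used by containerd is set in its config and may differ on a given host.
	entries, err := os.ReadDir("/etc/cni/net.d")
	if err != nil {
		fmt.Println("no CNI config dir yet:", err)
		return
	}
	if len(entries) == 0 {
		fmt.Println("CNI config dir is empty: the runtime will report NetworkPluginNotReady")
		return
	}
	for _, e := range entries {
		fmt.Println("found CNI config:", e.Name())
	}
}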
Error: unexpected end of JSON input" Mar 25 01:17:16.269179 kubelet[3335]: E0325 01:17:16.269153 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.269274 kubelet[3335]: W0325 01:17:16.269178 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.269274 kubelet[3335]: E0325 01:17:16.269199 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.269520 kubelet[3335]: E0325 01:17:16.269495 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.269594 kubelet[3335]: W0325 01:17:16.269519 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.269594 kubelet[3335]: E0325 01:17:16.269540 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.269883 kubelet[3335]: E0325 01:17:16.269852 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.269945 kubelet[3335]: W0325 01:17:16.269877 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.270033 kubelet[3335]: E0325 01:17:16.269993 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.271413 kubelet[3335]: E0325 01:17:16.271306 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.271413 kubelet[3335]: W0325 01:17:16.271343 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.271413 kubelet[3335]: E0325 01:17:16.271376 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:16.274967 kubelet[3335]: I0325 01:17:16.272594 3335 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-68bdb8667f-pvjmf" podStartSLOduration=2.162223006 podStartE2EDuration="4.272571331s" podCreationTimestamp="2025-03-25 01:17:12 +0000 UTC" firstStartedPulling="2025-03-25 01:17:12.90353311 +0000 UTC m=+28.084963762" lastFinishedPulling="2025-03-25 01:17:15.013881423 +0000 UTC m=+30.195312087" observedRunningTime="2025-03-25 01:17:15.281808433 +0000 UTC m=+30.463239109" watchObservedRunningTime="2025-03-25 01:17:16.272571331 +0000 UTC m=+31.454002007" Mar 25 01:17:16.274967 kubelet[3335]: E0325 01:17:16.273099 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.274967 kubelet[3335]: W0325 01:17:16.273122 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.274967 kubelet[3335]: E0325 01:17:16.273148 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.274967 kubelet[3335]: E0325 01:17:16.273610 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.274967 kubelet[3335]: W0325 01:17:16.273631 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.274967 kubelet[3335]: E0325 01:17:16.273655 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.274967 kubelet[3335]: E0325 01:17:16.274462 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.275590 kubelet[3335]: W0325 01:17:16.274489 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.275590 kubelet[3335]: E0325 01:17:16.274520 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.275590 kubelet[3335]: E0325 01:17:16.274918 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.275590 kubelet[3335]: W0325 01:17:16.274941 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.275590 kubelet[3335]: E0325 01:17:16.274965 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
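The pod_startup_latency_tracker entry can be reproduced from its own fields: podStartE2EDuration (4.272571331s) here equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration (2.162223006) works out to that figure minus the image-pulling window, lastFinishedPulling − firstStartedPulling taken from the monotonic m=+ offsets. A short sketch with the numbers copied from the entry:

package main

import "fmt"

func main() {
	// Values copied from the pod_startup_latency_tracker entry above.
	const (
		e2e                 = 4.272571331  // podStartE2EDuration, seconds
		firstStartedPulling = 28.084963762 // m=+ offset, seconds
		lastFinishedPulling = 30.195312087 // m=+ offset, seconds
	)
	pulling := lastFinishedPulling - firstStartedPulling
	fmt.Printf("image pulling window:  %.9fs\n", pulling)     // 2.110348325s
	fmt.Printf("podStartSLOduration:   %.9fs\n", e2e-pulling) // 2.162223006s, as logged
}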
Error: unexpected end of JSON input" Mar 25 01:17:16.275590 kubelet[3335]: E0325 01:17:16.275313 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.275590 kubelet[3335]: W0325 01:17:16.275331 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.275590 kubelet[3335]: E0325 01:17:16.275351 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.275956 kubelet[3335]: E0325 01:17:16.275727 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.275956 kubelet[3335]: W0325 01:17:16.275743 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.275956 kubelet[3335]: E0325 01:17:16.275765 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.276100 kubelet[3335]: E0325 01:17:16.276056 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.276100 kubelet[3335]: W0325 01:17:16.276071 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.276100 kubelet[3335]: E0325 01:17:16.276090 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.308806 kubelet[3335]: E0325 01:17:16.308557 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.308806 kubelet[3335]: W0325 01:17:16.308591 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.308806 kubelet[3335]: E0325 01:17:16.308619 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.310115 kubelet[3335]: E0325 01:17:16.309937 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.310115 kubelet[3335]: W0325 01:17:16.309968 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.310115 kubelet[3335]: E0325 01:17:16.310018 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:16.311886 kubelet[3335]: E0325 01:17:16.311852 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.312307 kubelet[3335]: W0325 01:17:16.312211 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.312901 kubelet[3335]: E0325 01:17:16.312852 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.314038 kubelet[3335]: E0325 01:17:16.313929 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.314038 kubelet[3335]: W0325 01:17:16.313958 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.314038 kubelet[3335]: E0325 01:17:16.313995 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.315212 kubelet[3335]: E0325 01:17:16.314793 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.315212 kubelet[3335]: W0325 01:17:16.314827 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.315212 kubelet[3335]: E0325 01:17:16.314909 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.317403 kubelet[3335]: E0325 01:17:16.315371 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.317403 kubelet[3335]: W0325 01:17:16.315423 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.318447 kubelet[3335]: E0325 01:17:16.317200 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.318447 kubelet[3335]: E0325 01:17:16.318050 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.318447 kubelet[3335]: W0325 01:17:16.318214 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.318841 kubelet[3335]: E0325 01:17:16.318776 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:16.321569 kubelet[3335]: E0325 01:17:16.321407 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.321569 kubelet[3335]: W0325 01:17:16.321441 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.321569 kubelet[3335]: E0325 01:17:16.321557 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.322450 kubelet[3335]: E0325 01:17:16.322274 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.322450 kubelet[3335]: W0325 01:17:16.322301 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.322751 kubelet[3335]: E0325 01:17:16.322518 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.323632 kubelet[3335]: E0325 01:17:16.323002 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.323632 kubelet[3335]: W0325 01:17:16.323106 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.323632 kubelet[3335]: E0325 01:17:16.323489 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.325674 kubelet[3335]: E0325 01:17:16.325301 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.325674 kubelet[3335]: W0325 01:17:16.325496 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.326286 kubelet[3335]: E0325 01:17:16.325629 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.327043 kubelet[3335]: E0325 01:17:16.326767 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.327043 kubelet[3335]: W0325 01:17:16.326799 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.327043 kubelet[3335]: E0325 01:17:16.326897 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:16.327385 kubelet[3335]: E0325 01:17:16.327353 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.327470 kubelet[3335]: W0325 01:17:16.327383 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.327773 kubelet[3335]: E0325 01:17:16.327592 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.328133 kubelet[3335]: E0325 01:17:16.327848 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.328133 kubelet[3335]: W0325 01:17:16.327886 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.328553 kubelet[3335]: E0325 01:17:16.328507 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.328637 kubelet[3335]: W0325 01:17:16.328567 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.328637 kubelet[3335]: E0325 01:17:16.328600 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.329376 kubelet[3335]: E0325 01:17:16.328979 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.329727 kubelet[3335]: E0325 01:17:16.329678 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.329727 kubelet[3335]: W0325 01:17:16.329711 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.329850 kubelet[3335]: E0325 01:17:16.329748 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.330145 kubelet[3335]: E0325 01:17:16.330119 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.330811 kubelet[3335]: W0325 01:17:16.330145 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.330811 kubelet[3335]: E0325 01:17:16.330380 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:17:16.331316 kubelet[3335]: E0325 01:17:16.331273 3335 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:17:16.331316 kubelet[3335]: W0325 01:17:16.331307 3335 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:17:16.331453 kubelet[3335]: E0325 01:17:16.331360 3335 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:17:16.551571 containerd[1968]: time="2025-03-25T01:17:16.551468233Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:16.553280 containerd[1968]: time="2025-03-25T01:17:16.553170014Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5120152" Mar 25 01:17:16.555397 containerd[1968]: time="2025-03-25T01:17:16.555322025Z" level=info msg="ImageCreate event name:\"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:16.559731 containerd[1968]: time="2025-03-25T01:17:16.559632249Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:16.561314 containerd[1968]: time="2025-03-25T01:17:16.561081124Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6489869\" in 1.54603217s" Mar 25 01:17:16.561314 containerd[1968]: time="2025-03-25T01:17:16.561134018Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\"" Mar 25 01:17:16.566482 containerd[1968]: time="2025-03-25T01:17:16.566297843Z" level=info msg="CreateContainer within sandbox \"9858a009795816469fbf779e9de43b6286d0a5f5061567309a6e95cdd32bc12b\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 25 01:17:16.587667 containerd[1968]: time="2025-03-25T01:17:16.586469892Z" level=info msg="Container f8ba28601943d5a8e75882d222e36d44d6cd03341d219fac2ab9303b5f1587e1: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:17:16.610265 containerd[1968]: time="2025-03-25T01:17:16.609332111Z" level=info msg="CreateContainer within sandbox \"9858a009795816469fbf779e9de43b6286d0a5f5061567309a6e95cdd32bc12b\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"f8ba28601943d5a8e75882d222e36d44d6cd03341d219fac2ab9303b5f1587e1\"" Mar 25 01:17:16.612207 containerd[1968]: time="2025-03-25T01:17:16.612142893Z" level=info msg="StartContainer for \"f8ba28601943d5a8e75882d222e36d44d6cd03341d219fac2ab9303b5f1587e1\"" Mar 25 01:17:16.615094 containerd[1968]: time="2025-03-25T01:17:16.615013801Z" level=info msg="connecting to 
shim f8ba28601943d5a8e75882d222e36d44d6cd03341d219fac2ab9303b5f1587e1" address="unix:///run/containerd/s/53ba7a4ddaaacb4726b3bdbb39274c1d125b5bbf8783f466c0a4b7314b24c1bc" protocol=ttrpc version=3 Mar 25 01:17:16.658570 systemd[1]: Started cri-containerd-f8ba28601943d5a8e75882d222e36d44d6cd03341d219fac2ab9303b5f1587e1.scope - libcontainer container f8ba28601943d5a8e75882d222e36d44d6cd03341d219fac2ab9303b5f1587e1. Mar 25 01:17:16.741871 containerd[1968]: time="2025-03-25T01:17:16.741788858Z" level=info msg="StartContainer for \"f8ba28601943d5a8e75882d222e36d44d6cd03341d219fac2ab9303b5f1587e1\" returns successfully" Mar 25 01:17:16.764745 systemd[1]: cri-containerd-f8ba28601943d5a8e75882d222e36d44d6cd03341d219fac2ab9303b5f1587e1.scope: Deactivated successfully. Mar 25 01:17:16.772292 containerd[1968]: time="2025-03-25T01:17:16.772118590Z" level=info msg="received exit event container_id:\"f8ba28601943d5a8e75882d222e36d44d6cd03341d219fac2ab9303b5f1587e1\" id:\"f8ba28601943d5a8e75882d222e36d44d6cd03341d219fac2ab9303b5f1587e1\" pid:4202 exited_at:{seconds:1742865436 nanos:771621414}" Mar 25 01:17:16.772292 containerd[1968]: time="2025-03-25T01:17:16.772184880Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f8ba28601943d5a8e75882d222e36d44d6cd03341d219fac2ab9303b5f1587e1\" id:\"f8ba28601943d5a8e75882d222e36d44d6cd03341d219fac2ab9303b5f1587e1\" pid:4202 exited_at:{seconds:1742865436 nanos:771621414}" Mar 25 01:17:16.810727 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f8ba28601943d5a8e75882d222e36d44d6cd03341d219fac2ab9303b5f1587e1-rootfs.mount: Deactivated successfully. Mar 25 01:17:17.264346 containerd[1968]: time="2025-03-25T01:17:17.263804153Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 25 01:17:18.065535 kubelet[3335]: E0325 01:17:18.064389 3335 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rcpm6" podUID="292343f5-324d-4913-9d89-7e7c3530b2f1" Mar 25 01:17:20.064052 kubelet[3335]: E0325 01:17:20.063966 3335 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rcpm6" podUID="292343f5-324d-4913-9d89-7e7c3530b2f1" Mar 25 01:17:20.829063 containerd[1968]: time="2025-03-25T01:17:20.829009090Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:20.830863 containerd[1968]: time="2025-03-25T01:17:20.830788496Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=91227396" Mar 25 01:17:20.831190 containerd[1968]: time="2025-03-25T01:17:20.830996663Z" level=info msg="ImageCreate event name:\"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:20.836079 containerd[1968]: time="2025-03-25T01:17:20.835921532Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:20.837145 containerd[1968]: 
time="2025-03-25T01:17:20.836743565Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"92597153\" in 3.572876934s" Mar 25 01:17:20.837145 containerd[1968]: time="2025-03-25T01:17:20.836796458Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\"" Mar 25 01:17:20.841184 containerd[1968]: time="2025-03-25T01:17:20.840870328Z" level=info msg="CreateContainer within sandbox \"9858a009795816469fbf779e9de43b6286d0a5f5061567309a6e95cdd32bc12b\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 25 01:17:20.854265 containerd[1968]: time="2025-03-25T01:17:20.852791669Z" level=info msg="Container 3b348e4b6eefe57ab866a8a9c1766c12a299a2a18224b2b35fd27a9dda996bd0: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:17:20.870010 containerd[1968]: time="2025-03-25T01:17:20.869879700Z" level=info msg="CreateContainer within sandbox \"9858a009795816469fbf779e9de43b6286d0a5f5061567309a6e95cdd32bc12b\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"3b348e4b6eefe57ab866a8a9c1766c12a299a2a18224b2b35fd27a9dda996bd0\"" Mar 25 01:17:20.870813 containerd[1968]: time="2025-03-25T01:17:20.870668162Z" level=info msg="StartContainer for \"3b348e4b6eefe57ab866a8a9c1766c12a299a2a18224b2b35fd27a9dda996bd0\"" Mar 25 01:17:20.875133 containerd[1968]: time="2025-03-25T01:17:20.874481882Z" level=info msg="connecting to shim 3b348e4b6eefe57ab866a8a9c1766c12a299a2a18224b2b35fd27a9dda996bd0" address="unix:///run/containerd/s/53ba7a4ddaaacb4726b3bdbb39274c1d125b5bbf8783f466c0a4b7314b24c1bc" protocol=ttrpc version=3 Mar 25 01:17:20.919529 systemd[1]: Started cri-containerd-3b348e4b6eefe57ab866a8a9c1766c12a299a2a18224b2b35fd27a9dda996bd0.scope - libcontainer container 3b348e4b6eefe57ab866a8a9c1766c12a299a2a18224b2b35fd27a9dda996bd0. Mar 25 01:17:21.010850 containerd[1968]: time="2025-03-25T01:17:21.010155360Z" level=info msg="StartContainer for \"3b348e4b6eefe57ab866a8a9c1766c12a299a2a18224b2b35fd27a9dda996bd0\" returns successfully" Mar 25 01:17:21.886650 containerd[1968]: time="2025-03-25T01:17:21.886572295Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 25 01:17:21.891178 systemd[1]: cri-containerd-3b348e4b6eefe57ab866a8a9c1766c12a299a2a18224b2b35fd27a9dda996bd0.scope: Deactivated successfully. Mar 25 01:17:21.893821 systemd[1]: cri-containerd-3b348e4b6eefe57ab866a8a9c1766c12a299a2a18224b2b35fd27a9dda996bd0.scope: Consumed 872ms CPU time, 167.7M memory peak, 150.3M written to disk. 
Mar 25 01:17:21.894303 kubelet[3335]: I0325 01:17:21.894260 3335 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Mar 25 01:17:21.903310 containerd[1968]: time="2025-03-25T01:17:21.901514600Z" level=info msg="received exit event container_id:\"3b348e4b6eefe57ab866a8a9c1766c12a299a2a18224b2b35fd27a9dda996bd0\" id:\"3b348e4b6eefe57ab866a8a9c1766c12a299a2a18224b2b35fd27a9dda996bd0\" pid:4261 exited_at:{seconds:1742865441 nanos:900148051}" Mar 25 01:17:21.903310 containerd[1968]: time="2025-03-25T01:17:21.901864237Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b348e4b6eefe57ab866a8a9c1766c12a299a2a18224b2b35fd27a9dda996bd0\" id:\"3b348e4b6eefe57ab866a8a9c1766c12a299a2a18224b2b35fd27a9dda996bd0\" pid:4261 exited_at:{seconds:1742865441 nanos:900148051}" Mar 25 01:17:21.963791 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3b348e4b6eefe57ab866a8a9c1766c12a299a2a18224b2b35fd27a9dda996bd0-rootfs.mount: Deactivated successfully. Mar 25 01:17:21.982262 kubelet[3335]: I0325 01:17:21.980619 3335 topology_manager.go:215] "Topology Admit Handler" podUID="57a1d8fb-50e7-4d25-911d-c21ea0dc269c" podNamespace="calico-system" podName="calico-kube-controllers-564f5997bc-jhx5b" Mar 25 01:17:21.996497 kubelet[3335]: I0325 01:17:21.995211 3335 topology_manager.go:215] "Topology Admit Handler" podUID="d36c084d-ec62-4d75-89f3-585719967708" podNamespace="calico-apiserver" podName="calico-apiserver-5f598cd48-7hrsb" Mar 25 01:17:22.008007 kubelet[3335]: I0325 01:17:22.007763 3335 topology_manager.go:215] "Topology Admit Handler" podUID="3d0614c7-b26c-46a0-939e-ae9b1979b871" podNamespace="kube-system" podName="coredns-7db6d8ff4d-l2gdf" Mar 25 01:17:22.011179 kubelet[3335]: I0325 01:17:22.011110 3335 topology_manager.go:215] "Topology Admit Handler" podUID="12da84b8-1858-49eb-a378-e380d817f7ad" podNamespace="calico-apiserver" podName="calico-apiserver-5f598cd48-9mcjt" Mar 25 01:17:22.016568 kubelet[3335]: I0325 01:17:22.016491 3335 topology_manager.go:215] "Topology Admit Handler" podUID="3f4fe771-5062-4ca9-b1b0-866a84094468" podNamespace="kube-system" podName="coredns-7db6d8ff4d-jnw4t" Mar 25 01:17:22.028859 systemd[1]: Created slice kubepods-besteffort-pod57a1d8fb_50e7_4d25_911d_c21ea0dc269c.slice - libcontainer container kubepods-besteffort-pod57a1d8fb_50e7_4d25_911d_c21ea0dc269c.slice. 
Mar 25 01:17:22.062728 kubelet[3335]: I0325 01:17:22.057603 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxjls\" (UniqueName: \"kubernetes.io/projected/d36c084d-ec62-4d75-89f3-585719967708-kube-api-access-jxjls\") pod \"calico-apiserver-5f598cd48-7hrsb\" (UID: \"d36c084d-ec62-4d75-89f3-585719967708\") " pod="calico-apiserver/calico-apiserver-5f598cd48-7hrsb" Mar 25 01:17:22.062728 kubelet[3335]: I0325 01:17:22.057673 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d36c084d-ec62-4d75-89f3-585719967708-calico-apiserver-certs\") pod \"calico-apiserver-5f598cd48-7hrsb\" (UID: \"d36c084d-ec62-4d75-89f3-585719967708\") " pod="calico-apiserver/calico-apiserver-5f598cd48-7hrsb" Mar 25 01:17:22.062728 kubelet[3335]: I0325 01:17:22.057719 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/12da84b8-1858-49eb-a378-e380d817f7ad-calico-apiserver-certs\") pod \"calico-apiserver-5f598cd48-9mcjt\" (UID: \"12da84b8-1858-49eb-a378-e380d817f7ad\") " pod="calico-apiserver/calico-apiserver-5f598cd48-9mcjt" Mar 25 01:17:22.062728 kubelet[3335]: I0325 01:17:22.057755 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkx89\" (UniqueName: \"kubernetes.io/projected/12da84b8-1858-49eb-a378-e380d817f7ad-kube-api-access-rkx89\") pod \"calico-apiserver-5f598cd48-9mcjt\" (UID: \"12da84b8-1858-49eb-a378-e380d817f7ad\") " pod="calico-apiserver/calico-apiserver-5f598cd48-9mcjt" Mar 25 01:17:22.062728 kubelet[3335]: I0325 01:17:22.057793 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvsxr\" (UniqueName: \"kubernetes.io/projected/57a1d8fb-50e7-4d25-911d-c21ea0dc269c-kube-api-access-wvsxr\") pod \"calico-kube-controllers-564f5997bc-jhx5b\" (UID: \"57a1d8fb-50e7-4d25-911d-c21ea0dc269c\") " pod="calico-system/calico-kube-controllers-564f5997bc-jhx5b" Mar 25 01:17:22.060654 systemd[1]: Created slice kubepods-besteffort-podd36c084d_ec62_4d75_89f3_585719967708.slice - libcontainer container kubepods-besteffort-podd36c084d_ec62_4d75_89f3_585719967708.slice. 
Mar 25 01:17:22.063217 kubelet[3335]: I0325 01:17:22.057835 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f4fe771-5062-4ca9-b1b0-866a84094468-config-volume\") pod \"coredns-7db6d8ff4d-jnw4t\" (UID: \"3f4fe771-5062-4ca9-b1b0-866a84094468\") " pod="kube-system/coredns-7db6d8ff4d-jnw4t" Mar 25 01:17:22.063217 kubelet[3335]: I0325 01:17:22.057876 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57a1d8fb-50e7-4d25-911d-c21ea0dc269c-tigera-ca-bundle\") pod \"calico-kube-controllers-564f5997bc-jhx5b\" (UID: \"57a1d8fb-50e7-4d25-911d-c21ea0dc269c\") " pod="calico-system/calico-kube-controllers-564f5997bc-jhx5b" Mar 25 01:17:22.063217 kubelet[3335]: I0325 01:17:22.057917 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kkwt\" (UniqueName: \"kubernetes.io/projected/3f4fe771-5062-4ca9-b1b0-866a84094468-kube-api-access-6kkwt\") pod \"coredns-7db6d8ff4d-jnw4t\" (UID: \"3f4fe771-5062-4ca9-b1b0-866a84094468\") " pod="kube-system/coredns-7db6d8ff4d-jnw4t" Mar 25 01:17:22.063217 kubelet[3335]: I0325 01:17:22.057982 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d0614c7-b26c-46a0-939e-ae9b1979b871-config-volume\") pod \"coredns-7db6d8ff4d-l2gdf\" (UID: \"3d0614c7-b26c-46a0-939e-ae9b1979b871\") " pod="kube-system/coredns-7db6d8ff4d-l2gdf" Mar 25 01:17:22.063217 kubelet[3335]: I0325 01:17:22.058023 3335 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22fvl\" (UniqueName: \"kubernetes.io/projected/3d0614c7-b26c-46a0-939e-ae9b1979b871-kube-api-access-22fvl\") pod \"coredns-7db6d8ff4d-l2gdf\" (UID: \"3d0614c7-b26c-46a0-939e-ae9b1979b871\") " pod="kube-system/coredns-7db6d8ff4d-l2gdf" Mar 25 01:17:22.091601 systemd[1]: Created slice kubepods-burstable-pod3d0614c7_b26c_46a0_939e_ae9b1979b871.slice - libcontainer container kubepods-burstable-pod3d0614c7_b26c_46a0_939e_ae9b1979b871.slice. Mar 25 01:17:22.126902 systemd[1]: Created slice kubepods-besteffort-pod12da84b8_1858_49eb_a378_e380d817f7ad.slice - libcontainer container kubepods-besteffort-pod12da84b8_1858_49eb_a378_e380d817f7ad.slice. Mar 25 01:17:22.131461 systemd[1]: Created slice kubepods-burstable-pod3f4fe771_5062_4ca9_b1b0_866a84094468.slice - libcontainer container kubepods-burstable-pod3f4fe771_5062_4ca9_b1b0_866a84094468.slice. Mar 25 01:17:22.153632 systemd[1]: Created slice kubepods-besteffort-pod292343f5_324d_4913_9d89_7e7c3530b2f1.slice - libcontainer container kubepods-besteffort-pod292343f5_324d_4913_9d89_7e7c3530b2f1.slice. 
Mar 25 01:17:22.164810 containerd[1968]: time="2025-03-25T01:17:22.163511525Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rcpm6,Uid:292343f5-324d-4913-9d89-7e7c3530b2f1,Namespace:calico-system,Attempt:0,}" Mar 25 01:17:22.346729 containerd[1968]: time="2025-03-25T01:17:22.346628601Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-564f5997bc-jhx5b,Uid:57a1d8fb-50e7-4d25-911d-c21ea0dc269c,Namespace:calico-system,Attempt:0,}" Mar 25 01:17:22.383877 containerd[1968]: time="2025-03-25T01:17:22.383811164Z" level=error msg="Failed to destroy network for sandbox \"94f62ca9e98252e0684c7cbf7ba68bca741727a76ab23dd782206315c23b99cc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:22.384628 containerd[1968]: time="2025-03-25T01:17:22.384579248Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f598cd48-7hrsb,Uid:d36c084d-ec62-4d75-89f3-585719967708,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:17:22.404267 containerd[1968]: time="2025-03-25T01:17:22.403911345Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-l2gdf,Uid:3d0614c7-b26c-46a0-939e-ae9b1979b871,Namespace:kube-system,Attempt:0,}" Mar 25 01:17:22.446140 containerd[1968]: time="2025-03-25T01:17:22.446024906Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f598cd48-9mcjt,Uid:12da84b8-1858-49eb-a378-e380d817f7ad,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:17:22.450036 containerd[1968]: time="2025-03-25T01:17:22.449983106Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnw4t,Uid:3f4fe771-5062-4ca9-b1b0-866a84094468,Namespace:kube-system,Attempt:0,}" Mar 25 01:17:22.469603 containerd[1968]: time="2025-03-25T01:17:22.469502897Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rcpm6,Uid:292343f5-324d-4913-9d89-7e7c3530b2f1,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"94f62ca9e98252e0684c7cbf7ba68bca741727a76ab23dd782206315c23b99cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:22.470403 kubelet[3335]: E0325 01:17:22.470141 3335 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94f62ca9e98252e0684c7cbf7ba68bca741727a76ab23dd782206315c23b99cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:22.470403 kubelet[3335]: E0325 01:17:22.470272 3335 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94f62ca9e98252e0684c7cbf7ba68bca741727a76ab23dd782206315c23b99cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rcpm6" Mar 25 01:17:22.470403 kubelet[3335]: E0325 01:17:22.470310 3335 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"94f62ca9e98252e0684c7cbf7ba68bca741727a76ab23dd782206315c23b99cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rcpm6" Mar 25 01:17:22.471212 kubelet[3335]: E0325 01:17:22.470380 3335 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-rcpm6_calico-system(292343f5-324d-4913-9d89-7e7c3530b2f1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-rcpm6_calico-system(292343f5-324d-4913-9d89-7e7c3530b2f1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"94f62ca9e98252e0684c7cbf7ba68bca741727a76ab23dd782206315c23b99cc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-rcpm6" podUID="292343f5-324d-4913-9d89-7e7c3530b2f1" Mar 25 01:17:22.755061 containerd[1968]: time="2025-03-25T01:17:22.754772860Z" level=error msg="Failed to destroy network for sandbox \"00259cbb656d198619c2855cb187109794eef7db04ccb9a33de6e2e541a086f3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:22.905988 containerd[1968]: time="2025-03-25T01:17:22.905827371Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-564f5997bc-jhx5b,Uid:57a1d8fb-50e7-4d25-911d-c21ea0dc269c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"00259cbb656d198619c2855cb187109794eef7db04ccb9a33de6e2e541a086f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:22.907661 kubelet[3335]: E0325 01:17:22.907085 3335 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00259cbb656d198619c2855cb187109794eef7db04ccb9a33de6e2e541a086f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:22.907661 kubelet[3335]: E0325 01:17:22.907179 3335 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00259cbb656d198619c2855cb187109794eef7db04ccb9a33de6e2e541a086f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-564f5997bc-jhx5b" Mar 25 01:17:22.907661 kubelet[3335]: E0325 01:17:22.907244 3335 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00259cbb656d198619c2855cb187109794eef7db04ccb9a33de6e2e541a086f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-564f5997bc-jhx5b" Mar 25 01:17:22.908747 kubelet[3335]: E0325 01:17:22.907471 3335 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-564f5997bc-jhx5b_calico-system(57a1d8fb-50e7-4d25-911d-c21ea0dc269c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-564f5997bc-jhx5b_calico-system(57a1d8fb-50e7-4d25-911d-c21ea0dc269c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"00259cbb656d198619c2855cb187109794eef7db04ccb9a33de6e2e541a086f3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-564f5997bc-jhx5b" podUID="57a1d8fb-50e7-4d25-911d-c21ea0dc269c" Mar 25 01:17:22.978171 systemd[1]: run-netns-cni\x2db4bc5a4b\x2dbd62\x2d2c92\x2dd42f\x2d678ef23fd8e3.mount: Deactivated successfully. Mar 25 01:17:23.103904 containerd[1968]: time="2025-03-25T01:17:23.103627202Z" level=error msg="Failed to destroy network for sandbox \"4c0800ad932d31c2ebc28ec6025262654ec27877185102a4a8cf666e275c79a0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:23.109702 systemd[1]: run-netns-cni\x2d592f3f04\x2d00d8\x2d092c\x2dc09f\x2d50927f0e762f.mount: Deactivated successfully. Mar 25 01:17:23.116769 containerd[1968]: time="2025-03-25T01:17:23.116694582Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnw4t,Uid:3f4fe771-5062-4ca9-b1b0-866a84094468,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c0800ad932d31c2ebc28ec6025262654ec27877185102a4a8cf666e275c79a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:23.117768 kubelet[3335]: E0325 01:17:23.117658 3335 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c0800ad932d31c2ebc28ec6025262654ec27877185102a4a8cf666e275c79a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:23.117768 kubelet[3335]: E0325 01:17:23.117744 3335 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c0800ad932d31c2ebc28ec6025262654ec27877185102a4a8cf666e275c79a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jnw4t" Mar 25 01:17:23.117768 kubelet[3335]: E0325 01:17:23.117779 3335 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c0800ad932d31c2ebc28ec6025262654ec27877185102a4a8cf666e275c79a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-7db6d8ff4d-jnw4t" Mar 25 01:17:23.118525 kubelet[3335]: E0325 01:17:23.117857 3335 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-jnw4t_kube-system(3f4fe771-5062-4ca9-b1b0-866a84094468)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-jnw4t_kube-system(3f4fe771-5062-4ca9-b1b0-866a84094468)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4c0800ad932d31c2ebc28ec6025262654ec27877185102a4a8cf666e275c79a0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-jnw4t" podUID="3f4fe771-5062-4ca9-b1b0-866a84094468" Mar 25 01:17:23.145947 containerd[1968]: time="2025-03-25T01:17:23.145886598Z" level=error msg="Failed to destroy network for sandbox \"76ae78306df877e07c6381551efed6491364248ba69f01c4c5bb732934785004\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:23.148638 containerd[1968]: time="2025-03-25T01:17:23.148485470Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-l2gdf,Uid:3d0614c7-b26c-46a0-939e-ae9b1979b871,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"76ae78306df877e07c6381551efed6491364248ba69f01c4c5bb732934785004\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:23.152211 systemd[1]: run-netns-cni\x2df54c78e6\x2d26dd\x2df9a4\x2de07d\x2dd21e56254243.mount: Deactivated successfully. 
Mar 25 01:17:23.153599 kubelet[3335]: E0325 01:17:23.153551 3335 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76ae78306df877e07c6381551efed6491364248ba69f01c4c5bb732934785004\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:23.154944 kubelet[3335]: E0325 01:17:23.153804 3335 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76ae78306df877e07c6381551efed6491364248ba69f01c4c5bb732934785004\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-l2gdf" Mar 25 01:17:23.154944 kubelet[3335]: E0325 01:17:23.154330 3335 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76ae78306df877e07c6381551efed6491364248ba69f01c4c5bb732934785004\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-l2gdf" Mar 25 01:17:23.156302 kubelet[3335]: E0325 01:17:23.154439 3335 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-l2gdf_kube-system(3d0614c7-b26c-46a0-939e-ae9b1979b871)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-l2gdf_kube-system(3d0614c7-b26c-46a0-939e-ae9b1979b871)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"76ae78306df877e07c6381551efed6491364248ba69f01c4c5bb732934785004\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-l2gdf" podUID="3d0614c7-b26c-46a0-939e-ae9b1979b871" Mar 25 01:17:23.172517 containerd[1968]: time="2025-03-25T01:17:23.172437752Z" level=error msg="Failed to destroy network for sandbox \"883eb26d9840d36c80bff9a5e1c8d10d5acd0f436174ce1cea3180537adb1ab5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:23.176320 containerd[1968]: time="2025-03-25T01:17:23.174911647Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f598cd48-7hrsb,Uid:d36c084d-ec62-4d75-89f3-585719967708,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"883eb26d9840d36c80bff9a5e1c8d10d5acd0f436174ce1cea3180537adb1ab5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:23.177138 kubelet[3335]: E0325 01:17:23.176930 3335 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"883eb26d9840d36c80bff9a5e1c8d10d5acd0f436174ce1cea3180537adb1ab5\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:23.177138 kubelet[3335]: E0325 01:17:23.177020 3335 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"883eb26d9840d36c80bff9a5e1c8d10d5acd0f436174ce1cea3180537adb1ab5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5f598cd48-7hrsb" Mar 25 01:17:23.177138 kubelet[3335]: E0325 01:17:23.177060 3335 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"883eb26d9840d36c80bff9a5e1c8d10d5acd0f436174ce1cea3180537adb1ab5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5f598cd48-7hrsb" Mar 25 01:17:23.178427 kubelet[3335]: E0325 01:17:23.177141 3335 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5f598cd48-7hrsb_calico-apiserver(d36c084d-ec62-4d75-89f3-585719967708)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5f598cd48-7hrsb_calico-apiserver(d36c084d-ec62-4d75-89f3-585719967708)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"883eb26d9840d36c80bff9a5e1c8d10d5acd0f436174ce1cea3180537adb1ab5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5f598cd48-7hrsb" podUID="d36c084d-ec62-4d75-89f3-585719967708" Mar 25 01:17:23.180803 systemd[1]: run-netns-cni\x2de6ff1f66\x2db3ec\x2d259d\x2db2ba\x2d99a84235211f.mount: Deactivated successfully. 
Mar 25 01:17:23.185495 containerd[1968]: time="2025-03-25T01:17:23.185428106Z" level=error msg="Failed to destroy network for sandbox \"7c0798ffda2c34008627d7c169497832bb44fbc4489a3add0ce4410d2cae6435\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:23.188434 containerd[1968]: time="2025-03-25T01:17:23.188342612Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f598cd48-9mcjt,Uid:12da84b8-1858-49eb-a378-e380d817f7ad,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c0798ffda2c34008627d7c169497832bb44fbc4489a3add0ce4410d2cae6435\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:23.190554 kubelet[3335]: E0325 01:17:23.188687 3335 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c0798ffda2c34008627d7c169497832bb44fbc4489a3add0ce4410d2cae6435\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:17:23.190554 kubelet[3335]: E0325 01:17:23.188765 3335 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c0798ffda2c34008627d7c169497832bb44fbc4489a3add0ce4410d2cae6435\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5f598cd48-9mcjt" Mar 25 01:17:23.190554 kubelet[3335]: E0325 01:17:23.188800 3335 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c0798ffda2c34008627d7c169497832bb44fbc4489a3add0ce4410d2cae6435\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5f598cd48-9mcjt" Mar 25 01:17:23.190765 kubelet[3335]: E0325 01:17:23.188863 3335 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5f598cd48-9mcjt_calico-apiserver(12da84b8-1858-49eb-a378-e380d817f7ad)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5f598cd48-9mcjt_calico-apiserver(12da84b8-1858-49eb-a378-e380d817f7ad)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7c0798ffda2c34008627d7c169497832bb44fbc4489a3add0ce4410d2cae6435\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5f598cd48-9mcjt" podUID="12da84b8-1858-49eb-a378-e380d817f7ad" Mar 25 01:17:23.191697 systemd[1]: run-netns-cni\x2d32985ec5\x2d2786\x2dd063\x2da6fd\x2d4d497b63dce2.mount: Deactivated successfully. 
Mar 25 01:17:23.295138 containerd[1968]: time="2025-03-25T01:17:23.293644583Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 25 01:17:25.431892 systemd[1]: Started sshd@7-172.31.24.136:22-147.75.109.163:54284.service - OpenSSH per-connection server daemon (147.75.109.163:54284). Mar 25 01:17:25.649837 sshd[4482]: Accepted publickey for core from 147.75.109.163 port 54284 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:17:25.652997 sshd-session[4482]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:17:25.664659 systemd-logind[1939]: New session 8 of user core. Mar 25 01:17:25.671623 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 25 01:17:25.978664 sshd[4484]: Connection closed by 147.75.109.163 port 54284 Mar 25 01:17:25.979025 sshd-session[4482]: pam_unix(sshd:session): session closed for user core Mar 25 01:17:25.988972 systemd[1]: sshd@7-172.31.24.136:22-147.75.109.163:54284.service: Deactivated successfully. Mar 25 01:17:25.995733 systemd[1]: session-8.scope: Deactivated successfully. Mar 25 01:17:25.999434 systemd-logind[1939]: Session 8 logged out. Waiting for processes to exit. Mar 25 01:17:26.002176 systemd-logind[1939]: Removed session 8. Mar 25 01:17:29.252689 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3456907260.mount: Deactivated successfully. Mar 25 01:17:29.312699 containerd[1968]: time="2025-03-25T01:17:29.312547144Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:29.314193 containerd[1968]: time="2025-03-25T01:17:29.314115983Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=137086024" Mar 25 01:17:29.315087 containerd[1968]: time="2025-03-25T01:17:29.314991713Z" level=info msg="ImageCreate event name:\"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:29.318561 containerd[1968]: time="2025-03-25T01:17:29.318457403Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:29.319820 containerd[1968]: time="2025-03-25T01:17:29.319763262Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"137085886\" in 6.025985342s" Mar 25 01:17:29.319820 containerd[1968]: time="2025-03-25T01:17:29.319823916Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\"" Mar 25 01:17:29.348950 containerd[1968]: time="2025-03-25T01:17:29.348704424Z" level=info msg="CreateContainer within sandbox \"9858a009795816469fbf779e9de43b6286d0a5f5061567309a6e95cdd32bc12b\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 25 01:17:29.366253 containerd[1968]: time="2025-03-25T01:17:29.366106087Z" level=info msg="Container 33f7ea9b63729d227bc50bfa69cc02395af443fa4b77c30c48e4fb609d88959a: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:17:29.381791 containerd[1968]: 
time="2025-03-25T01:17:29.381713064Z" level=info msg="CreateContainer within sandbox \"9858a009795816469fbf779e9de43b6286d0a5f5061567309a6e95cdd32bc12b\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"33f7ea9b63729d227bc50bfa69cc02395af443fa4b77c30c48e4fb609d88959a\"" Mar 25 01:17:29.383650 containerd[1968]: time="2025-03-25T01:17:29.383582437Z" level=info msg="StartContainer for \"33f7ea9b63729d227bc50bfa69cc02395af443fa4b77c30c48e4fb609d88959a\"" Mar 25 01:17:29.386764 containerd[1968]: time="2025-03-25T01:17:29.386630412Z" level=info msg="connecting to shim 33f7ea9b63729d227bc50bfa69cc02395af443fa4b77c30c48e4fb609d88959a" address="unix:///run/containerd/s/53ba7a4ddaaacb4726b3bdbb39274c1d125b5bbf8783f466c0a4b7314b24c1bc" protocol=ttrpc version=3 Mar 25 01:17:29.458412 systemd[1]: Started cri-containerd-33f7ea9b63729d227bc50bfa69cc02395af443fa4b77c30c48e4fb609d88959a.scope - libcontainer container 33f7ea9b63729d227bc50bfa69cc02395af443fa4b77c30c48e4fb609d88959a. Mar 25 01:17:29.574156 containerd[1968]: time="2025-03-25T01:17:29.574007277Z" level=info msg="StartContainer for \"33f7ea9b63729d227bc50bfa69cc02395af443fa4b77c30c48e4fb609d88959a\" returns successfully" Mar 25 01:17:29.689623 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 25 01:17:29.689785 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Mar 25 01:17:30.349269 kubelet[3335]: I0325 01:17:30.348353 3335 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-z69gs" podStartSLOduration=1.9572994000000001 podStartE2EDuration="18.348327902s" podCreationTimestamp="2025-03-25 01:17:12 +0000 UTC" firstStartedPulling="2025-03-25 01:17:12.930683029 +0000 UTC m=+28.112113693" lastFinishedPulling="2025-03-25 01:17:29.321711532 +0000 UTC m=+44.503142195" observedRunningTime="2025-03-25 01:17:30.345805276 +0000 UTC m=+45.527235940" watchObservedRunningTime="2025-03-25 01:17:30.348327902 +0000 UTC m=+45.529758566" Mar 25 01:17:31.014918 systemd[1]: Started sshd@8-172.31.24.136:22-147.75.109.163:36860.service - OpenSSH per-connection server daemon (147.75.109.163:36860). Mar 25 01:17:31.226977 sshd[4565]: Accepted publickey for core from 147.75.109.163 port 36860 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:17:31.231092 sshd-session[4565]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:17:31.248475 systemd-logind[1939]: New session 9 of user core. Mar 25 01:17:31.261294 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 25 01:17:31.632752 sshd[4595]: Connection closed by 147.75.109.163 port 36860 Mar 25 01:17:31.638334 sshd-session[4565]: pam_unix(sshd:session): session closed for user core Mar 25 01:17:31.649052 systemd[1]: sshd@8-172.31.24.136:22-147.75.109.163:36860.service: Deactivated successfully. Mar 25 01:17:31.659482 systemd[1]: session-9.scope: Deactivated successfully. Mar 25 01:17:31.667381 systemd-logind[1939]: Session 9 logged out. Waiting for processes to exit. Mar 25 01:17:31.670754 systemd-logind[1939]: Removed session 9. Mar 25 01:17:31.992333 kernel: bpftool[4710]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 25 01:17:32.343520 (udev-worker)[4539]: Network interface NamePolicy= disabled on kernel command line. 
Mar 25 01:17:32.346724 systemd-networkd[1869]: vxlan.calico: Link UP Mar 25 01:17:32.346745 systemd-networkd[1869]: vxlan.calico: Gained carrier Mar 25 01:17:32.387901 (udev-worker)[4543]: Network interface NamePolicy= disabled on kernel command line. Mar 25 01:17:33.507624 systemd-networkd[1869]: vxlan.calico: Gained IPv6LL Mar 25 01:17:34.065095 containerd[1968]: time="2025-03-25T01:17:34.064769910Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rcpm6,Uid:292343f5-324d-4913-9d89-7e7c3530b2f1,Namespace:calico-system,Attempt:0,}" Mar 25 01:17:34.065095 containerd[1968]: time="2025-03-25T01:17:34.064871271Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f598cd48-9mcjt,Uid:12da84b8-1858-49eb-a378-e380d817f7ad,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:17:34.413714 systemd-networkd[1869]: calic2adb87bd1a: Link UP Mar 25 01:17:34.416465 systemd-networkd[1869]: calic2adb87bd1a: Gained carrier Mar 25 01:17:34.439142 containerd[1968]: 2025-03-25 01:17:34.202 [INFO][4779] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--24--136-k8s-csi--node--driver--rcpm6-eth0 csi-node-driver- calico-system 292343f5-324d-4913-9d89-7e7c3530b2f1 642 0 2025-03-25 01:17:12 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:69ddf5d45d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ip-172-31-24-136 csi-node-driver-rcpm6 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calic2adb87bd1a [] []}} ContainerID="7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" Namespace="calico-system" Pod="csi-node-driver-rcpm6" WorkloadEndpoint="ip--172--31--24--136-k8s-csi--node--driver--rcpm6-" Mar 25 01:17:34.439142 containerd[1968]: 2025-03-25 01:17:34.203 [INFO][4779] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" Namespace="calico-system" Pod="csi-node-driver-rcpm6" WorkloadEndpoint="ip--172--31--24--136-k8s-csi--node--driver--rcpm6-eth0" Mar 25 01:17:34.439142 containerd[1968]: 2025-03-25 01:17:34.291 [INFO][4804] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" HandleID="k8s-pod-network.7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" Workload="ip--172--31--24--136-k8s-csi--node--driver--rcpm6-eth0" Mar 25 01:17:34.439483 containerd[1968]: 2025-03-25 01:17:34.312 [INFO][4804] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" HandleID="k8s-pod-network.7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" Workload="ip--172--31--24--136-k8s-csi--node--driver--rcpm6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000319bb0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-24-136", "pod":"csi-node-driver-rcpm6", "timestamp":"2025-03-25 01:17:34.29108706 +0000 UTC"}, Hostname:"ip-172-31-24-136", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:17:34.439483 
containerd[1968]: 2025-03-25 01:17:34.312 [INFO][4804] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:17:34.439483 containerd[1968]: 2025-03-25 01:17:34.313 [INFO][4804] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:17:34.439483 containerd[1968]: 2025-03-25 01:17:34.313 [INFO][4804] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-24-136' Mar 25 01:17:34.439483 containerd[1968]: 2025-03-25 01:17:34.316 [INFO][4804] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" host="ip-172-31-24-136" Mar 25 01:17:34.439483 containerd[1968]: 2025-03-25 01:17:34.325 [INFO][4804] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-24-136" Mar 25 01:17:34.439483 containerd[1968]: 2025-03-25 01:17:34.332 [INFO][4804] ipam/ipam.go 489: Trying affinity for 192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:34.439483 containerd[1968]: 2025-03-25 01:17:34.335 [INFO][4804] ipam/ipam.go 155: Attempting to load block cidr=192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:34.439483 containerd[1968]: 2025-03-25 01:17:34.340 [INFO][4804] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:34.440861 containerd[1968]: 2025-03-25 01:17:34.340 [INFO][4804] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.105.64/26 handle="k8s-pod-network.7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" host="ip-172-31-24-136" Mar 25 01:17:34.440861 containerd[1968]: 2025-03-25 01:17:34.342 [INFO][4804] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c Mar 25 01:17:34.440861 containerd[1968]: 2025-03-25 01:17:34.350 [INFO][4804] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.105.64/26 handle="k8s-pod-network.7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" host="ip-172-31-24-136" Mar 25 01:17:34.440861 containerd[1968]: 2025-03-25 01:17:34.362 [INFO][4804] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.105.65/26] block=192.168.105.64/26 handle="k8s-pod-network.7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" host="ip-172-31-24-136" Mar 25 01:17:34.440861 containerd[1968]: 2025-03-25 01:17:34.362 [INFO][4804] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.105.65/26] handle="k8s-pod-network.7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" host="ip-172-31-24-136" Mar 25 01:17:34.440861 containerd[1968]: 2025-03-25 01:17:34.362 [INFO][4804] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:17:34.440861 containerd[1968]: 2025-03-25 01:17:34.362 [INFO][4804] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.65/26] IPv6=[] ContainerID="7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" HandleID="k8s-pod-network.7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" Workload="ip--172--31--24--136-k8s-csi--node--driver--rcpm6-eth0" Mar 25 01:17:34.441683 containerd[1968]: 2025-03-25 01:17:34.372 [INFO][4779] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" Namespace="calico-system" Pod="csi-node-driver-rcpm6" WorkloadEndpoint="ip--172--31--24--136-k8s-csi--node--driver--rcpm6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--24--136-k8s-csi--node--driver--rcpm6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"292343f5-324d-4913-9d89-7e7c3530b2f1", ResourceVersion:"642", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 17, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-24-136", ContainerID:"", Pod:"csi-node-driver-rcpm6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.105.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic2adb87bd1a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:17:34.441841 containerd[1968]: 2025-03-25 01:17:34.372 [INFO][4779] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.105.65/32] ContainerID="7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" Namespace="calico-system" Pod="csi-node-driver-rcpm6" WorkloadEndpoint="ip--172--31--24--136-k8s-csi--node--driver--rcpm6-eth0" Mar 25 01:17:34.441841 containerd[1968]: 2025-03-25 01:17:34.372 [INFO][4779] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic2adb87bd1a ContainerID="7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" Namespace="calico-system" Pod="csi-node-driver-rcpm6" WorkloadEndpoint="ip--172--31--24--136-k8s-csi--node--driver--rcpm6-eth0" Mar 25 01:17:34.441841 containerd[1968]: 2025-03-25 01:17:34.405 [INFO][4779] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" Namespace="calico-system" Pod="csi-node-driver-rcpm6" WorkloadEndpoint="ip--172--31--24--136-k8s-csi--node--driver--rcpm6-eth0" Mar 25 01:17:34.443318 containerd[1968]: 2025-03-25 01:17:34.408 [INFO][4779] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" Namespace="calico-system" 
Pod="csi-node-driver-rcpm6" WorkloadEndpoint="ip--172--31--24--136-k8s-csi--node--driver--rcpm6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--24--136-k8s-csi--node--driver--rcpm6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"292343f5-324d-4913-9d89-7e7c3530b2f1", ResourceVersion:"642", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 17, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-24-136", ContainerID:"7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c", Pod:"csi-node-driver-rcpm6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.105.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic2adb87bd1a", MAC:"7e:c6:c3:71:c8:73", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:17:34.443567 containerd[1968]: 2025-03-25 01:17:34.431 [INFO][4779] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" Namespace="calico-system" Pod="csi-node-driver-rcpm6" WorkloadEndpoint="ip--172--31--24--136-k8s-csi--node--driver--rcpm6-eth0" Mar 25 01:17:34.486369 systemd-networkd[1869]: cali730be3dbcb0: Link UP Mar 25 01:17:34.488994 systemd-networkd[1869]: cali730be3dbcb0: Gained carrier Mar 25 01:17:34.552106 containerd[1968]: 2025-03-25 01:17:34.202 [INFO][4783] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--9mcjt-eth0 calico-apiserver-5f598cd48- calico-apiserver 12da84b8-1858-49eb-a378-e380d817f7ad 728 0 2025-03-25 01:17:10 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5f598cd48 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-24-136 calico-apiserver-5f598cd48-9mcjt eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali730be3dbcb0 [] []}} ContainerID="7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" Namespace="calico-apiserver" Pod="calico-apiserver-5f598cd48-9mcjt" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--9mcjt-" Mar 25 01:17:34.552106 containerd[1968]: 2025-03-25 01:17:34.202 [INFO][4783] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" Namespace="calico-apiserver" Pod="calico-apiserver-5f598cd48-9mcjt" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--9mcjt-eth0" Mar 25 01:17:34.552106 
containerd[1968]: 2025-03-25 01:17:34.288 [INFO][4802] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" HandleID="k8s-pod-network.7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" Workload="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--9mcjt-eth0" Mar 25 01:17:34.552481 containerd[1968]: 2025-03-25 01:17:34.319 [INFO][4802] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" HandleID="k8s-pod-network.7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" Workload="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--9mcjt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400046ea80), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-24-136", "pod":"calico-apiserver-5f598cd48-9mcjt", "timestamp":"2025-03-25 01:17:34.288946646 +0000 UTC"}, Hostname:"ip-172-31-24-136", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:17:34.552481 containerd[1968]: 2025-03-25 01:17:34.319 [INFO][4802] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:17:34.552481 containerd[1968]: 2025-03-25 01:17:34.365 [INFO][4802] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:17:34.552481 containerd[1968]: 2025-03-25 01:17:34.365 [INFO][4802] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-24-136' Mar 25 01:17:34.552481 containerd[1968]: 2025-03-25 01:17:34.369 [INFO][4802] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" host="ip-172-31-24-136" Mar 25 01:17:34.552481 containerd[1968]: 2025-03-25 01:17:34.393 [INFO][4802] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-24-136" Mar 25 01:17:34.552481 containerd[1968]: 2025-03-25 01:17:34.408 [INFO][4802] ipam/ipam.go 489: Trying affinity for 192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:34.552481 containerd[1968]: 2025-03-25 01:17:34.420 [INFO][4802] ipam/ipam.go 155: Attempting to load block cidr=192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:34.552481 containerd[1968]: 2025-03-25 01:17:34.433 [INFO][4802] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:34.554585 containerd[1968]: 2025-03-25 01:17:34.433 [INFO][4802] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.105.64/26 handle="k8s-pod-network.7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" host="ip-172-31-24-136" Mar 25 01:17:34.554585 containerd[1968]: 2025-03-25 01:17:34.444 [INFO][4802] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998 Mar 25 01:17:34.554585 containerd[1968]: 2025-03-25 01:17:34.455 [INFO][4802] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.105.64/26 handle="k8s-pod-network.7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" host="ip-172-31-24-136" Mar 25 01:17:34.554585 containerd[1968]: 2025-03-25 01:17:34.472 [INFO][4802] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.105.66/26] block=192.168.105.64/26 
handle="k8s-pod-network.7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" host="ip-172-31-24-136" Mar 25 01:17:34.554585 containerd[1968]: 2025-03-25 01:17:34.472 [INFO][4802] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.105.66/26] handle="k8s-pod-network.7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" host="ip-172-31-24-136" Mar 25 01:17:34.554585 containerd[1968]: 2025-03-25 01:17:34.472 [INFO][4802] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:17:34.554585 containerd[1968]: 2025-03-25 01:17:34.472 [INFO][4802] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.66/26] IPv6=[] ContainerID="7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" HandleID="k8s-pod-network.7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" Workload="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--9mcjt-eth0" Mar 25 01:17:34.554939 containerd[1968]: 2025-03-25 01:17:34.476 [INFO][4783] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" Namespace="calico-apiserver" Pod="calico-apiserver-5f598cd48-9mcjt" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--9mcjt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--9mcjt-eth0", GenerateName:"calico-apiserver-5f598cd48-", Namespace:"calico-apiserver", SelfLink:"", UID:"12da84b8-1858-49eb-a378-e380d817f7ad", ResourceVersion:"728", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 17, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f598cd48", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-24-136", ContainerID:"", Pod:"calico-apiserver-5f598cd48-9mcjt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.105.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali730be3dbcb0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:17:34.555099 containerd[1968]: 2025-03-25 01:17:34.476 [INFO][4783] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.105.66/32] ContainerID="7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" Namespace="calico-apiserver" Pod="calico-apiserver-5f598cd48-9mcjt" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--9mcjt-eth0" Mar 25 01:17:34.555099 containerd[1968]: 2025-03-25 01:17:34.476 [INFO][4783] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali730be3dbcb0 ContainerID="7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" Namespace="calico-apiserver" Pod="calico-apiserver-5f598cd48-9mcjt" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--9mcjt-eth0" Mar 25 01:17:34.555099 
containerd[1968]: 2025-03-25 01:17:34.490 [INFO][4783] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" Namespace="calico-apiserver" Pod="calico-apiserver-5f598cd48-9mcjt" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--9mcjt-eth0" Mar 25 01:17:34.555308 containerd[1968]: 2025-03-25 01:17:34.493 [INFO][4783] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" Namespace="calico-apiserver" Pod="calico-apiserver-5f598cd48-9mcjt" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--9mcjt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--9mcjt-eth0", GenerateName:"calico-apiserver-5f598cd48-", Namespace:"calico-apiserver", SelfLink:"", UID:"12da84b8-1858-49eb-a378-e380d817f7ad", ResourceVersion:"728", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 17, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f598cd48", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-24-136", ContainerID:"7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998", Pod:"calico-apiserver-5f598cd48-9mcjt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.105.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali730be3dbcb0", MAC:"22:33:39:cc:80:aa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:17:34.555443 containerd[1968]: 2025-03-25 01:17:34.546 [INFO][4783] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" Namespace="calico-apiserver" Pod="calico-apiserver-5f598cd48-9mcjt" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--9mcjt-eth0" Mar 25 01:17:34.613688 containerd[1968]: time="2025-03-25T01:17:34.613583076Z" level=info msg="connecting to shim 7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c" address="unix:///run/containerd/s/79fcdcd104765b9ebc545f07f6cd785a05c20f8909d9b0e0bcb54500d979ad7a" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:17:34.624735 containerd[1968]: time="2025-03-25T01:17:34.624643535Z" level=info msg="connecting to shim 7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998" address="unix:///run/containerd/s/737d753615999a91fa029fb578ae0d6d902f315b5323a1a374023ddcf498a934" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:17:34.683557 systemd[1]: Started cri-containerd-7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998.scope - libcontainer container 7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998. 
Mar 25 01:17:34.693341 systemd[1]: Started cri-containerd-7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c.scope - libcontainer container 7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c. Mar 25 01:17:34.774284 containerd[1968]: time="2025-03-25T01:17:34.774145807Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rcpm6,Uid:292343f5-324d-4913-9d89-7e7c3530b2f1,Namespace:calico-system,Attempt:0,} returns sandbox id \"7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c\"" Mar 25 01:17:34.788895 containerd[1968]: time="2025-03-25T01:17:34.788390553Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f598cd48-9mcjt,Uid:12da84b8-1858-49eb-a378-e380d817f7ad,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998\"" Mar 25 01:17:34.798829 containerd[1968]: time="2025-03-25T01:17:34.798766418Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 25 01:17:35.491800 systemd-networkd[1869]: calic2adb87bd1a: Gained IPv6LL Mar 25 01:17:36.065263 containerd[1968]: time="2025-03-25T01:17:36.065169440Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-564f5997bc-jhx5b,Uid:57a1d8fb-50e7-4d25-911d-c21ea0dc269c,Namespace:calico-system,Attempt:0,}" Mar 25 01:17:36.247024 containerd[1968]: time="2025-03-25T01:17:36.246775548Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:36.250170 containerd[1968]: time="2025-03-25T01:17:36.249988501Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7473801" Mar 25 01:17:36.253569 containerd[1968]: time="2025-03-25T01:17:36.253409285Z" level=info msg="ImageCreate event name:\"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:36.260903 containerd[1968]: time="2025-03-25T01:17:36.260316306Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:36.262458 containerd[1968]: time="2025-03-25T01:17:36.262389829Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"8843558\" in 1.463557612s" Mar 25 01:17:36.262458 containerd[1968]: time="2025-03-25T01:17:36.262453241Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\"" Mar 25 01:17:36.279337 containerd[1968]: time="2025-03-25T01:17:36.279279744Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 01:17:36.285603 containerd[1968]: time="2025-03-25T01:17:36.284016618Z" level=info msg="CreateContainer within sandbox \"7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 25 01:17:36.333615 containerd[1968]: time="2025-03-25T01:17:36.330531209Z" level=info msg="Container 
6f9cf5bc87bdd18236543be564f0167bfcd41d4aa8c76cf96982f540df486a34: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:17:36.370075 containerd[1968]: time="2025-03-25T01:17:36.369965059Z" level=info msg="CreateContainer within sandbox \"7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"6f9cf5bc87bdd18236543be564f0167bfcd41d4aa8c76cf96982f540df486a34\"" Mar 25 01:17:36.372665 containerd[1968]: time="2025-03-25T01:17:36.372594995Z" level=info msg="StartContainer for \"6f9cf5bc87bdd18236543be564f0167bfcd41d4aa8c76cf96982f540df486a34\"" Mar 25 01:17:36.377995 containerd[1968]: time="2025-03-25T01:17:36.377921842Z" level=info msg="connecting to shim 6f9cf5bc87bdd18236543be564f0167bfcd41d4aa8c76cf96982f540df486a34" address="unix:///run/containerd/s/79fcdcd104765b9ebc545f07f6cd785a05c20f8909d9b0e0bcb54500d979ad7a" protocol=ttrpc version=3 Mar 25 01:17:36.416855 systemd-networkd[1869]: cali76c03659801: Link UP Mar 25 01:17:36.419829 systemd-networkd[1869]: cali76c03659801: Gained carrier Mar 25 01:17:36.443639 systemd[1]: Started cri-containerd-6f9cf5bc87bdd18236543be564f0167bfcd41d4aa8c76cf96982f540df486a34.scope - libcontainer container 6f9cf5bc87bdd18236543be564f0167bfcd41d4aa8c76cf96982f540df486a34. Mar 25 01:17:36.456642 kubelet[3335]: I0325 01:17:36.451954 3335 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:17:36.470517 containerd[1968]: 2025-03-25 01:17:36.178 [INFO][4938] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--24--136-k8s-calico--kube--controllers--564f5997bc--jhx5b-eth0 calico-kube-controllers-564f5997bc- calico-system 57a1d8fb-50e7-4d25-911d-c21ea0dc269c 721 0 2025-03-25 01:17:12 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:564f5997bc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ip-172-31-24-136 calico-kube-controllers-564f5997bc-jhx5b eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali76c03659801 [] []}} ContainerID="3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" Namespace="calico-system" Pod="calico-kube-controllers-564f5997bc-jhx5b" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--kube--controllers--564f5997bc--jhx5b-" Mar 25 01:17:36.470517 containerd[1968]: 2025-03-25 01:17:36.178 [INFO][4938] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" Namespace="calico-system" Pod="calico-kube-controllers-564f5997bc-jhx5b" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--kube--controllers--564f5997bc--jhx5b-eth0" Mar 25 01:17:36.470517 containerd[1968]: 2025-03-25 01:17:36.271 [INFO][4950] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" HandleID="k8s-pod-network.3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" Workload="ip--172--31--24--136-k8s-calico--kube--controllers--564f5997bc--jhx5b-eth0" Mar 25 01:17:36.471314 containerd[1968]: 2025-03-25 01:17:36.305 [INFO][4950] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" 
HandleID="k8s-pod-network.3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" Workload="ip--172--31--24--136-k8s-calico--kube--controllers--564f5997bc--jhx5b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000262110), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-24-136", "pod":"calico-kube-controllers-564f5997bc-jhx5b", "timestamp":"2025-03-25 01:17:36.271599566 +0000 UTC"}, Hostname:"ip-172-31-24-136", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:17:36.471314 containerd[1968]: 2025-03-25 01:17:36.305 [INFO][4950] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:17:36.471314 containerd[1968]: 2025-03-25 01:17:36.305 [INFO][4950] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:17:36.471314 containerd[1968]: 2025-03-25 01:17:36.306 [INFO][4950] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-24-136' Mar 25 01:17:36.471314 containerd[1968]: 2025-03-25 01:17:36.313 [INFO][4950] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" host="ip-172-31-24-136" Mar 25 01:17:36.471314 containerd[1968]: 2025-03-25 01:17:36.326 [INFO][4950] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-24-136" Mar 25 01:17:36.471314 containerd[1968]: 2025-03-25 01:17:36.347 [INFO][4950] ipam/ipam.go 489: Trying affinity for 192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:36.471314 containerd[1968]: 2025-03-25 01:17:36.352 [INFO][4950] ipam/ipam.go 155: Attempting to load block cidr=192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:36.471314 containerd[1968]: 2025-03-25 01:17:36.362 [INFO][4950] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:36.473993 containerd[1968]: 2025-03-25 01:17:36.362 [INFO][4950] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.105.64/26 handle="k8s-pod-network.3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" host="ip-172-31-24-136" Mar 25 01:17:36.473993 containerd[1968]: 2025-03-25 01:17:36.366 [INFO][4950] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9 Mar 25 01:17:36.473993 containerd[1968]: 2025-03-25 01:17:36.377 [INFO][4950] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.105.64/26 handle="k8s-pod-network.3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" host="ip-172-31-24-136" Mar 25 01:17:36.473993 containerd[1968]: 2025-03-25 01:17:36.403 [INFO][4950] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.105.67/26] block=192.168.105.64/26 handle="k8s-pod-network.3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" host="ip-172-31-24-136" Mar 25 01:17:36.473993 containerd[1968]: 2025-03-25 01:17:36.404 [INFO][4950] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.105.67/26] handle="k8s-pod-network.3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" host="ip-172-31-24-136" Mar 25 01:17:36.473993 containerd[1968]: 2025-03-25 01:17:36.404 [INFO][4950] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:17:36.473993 containerd[1968]: 2025-03-25 01:17:36.404 [INFO][4950] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.67/26] IPv6=[] ContainerID="3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" HandleID="k8s-pod-network.3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" Workload="ip--172--31--24--136-k8s-calico--kube--controllers--564f5997bc--jhx5b-eth0" Mar 25 01:17:36.475384 containerd[1968]: 2025-03-25 01:17:36.409 [INFO][4938] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" Namespace="calico-system" Pod="calico-kube-controllers-564f5997bc-jhx5b" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--kube--controllers--564f5997bc--jhx5b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--24--136-k8s-calico--kube--controllers--564f5997bc--jhx5b-eth0", GenerateName:"calico-kube-controllers-564f5997bc-", Namespace:"calico-system", SelfLink:"", UID:"57a1d8fb-50e7-4d25-911d-c21ea0dc269c", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 17, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"564f5997bc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-24-136", ContainerID:"", Pod:"calico-kube-controllers-564f5997bc-jhx5b", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.105.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali76c03659801", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:17:36.476089 containerd[1968]: 2025-03-25 01:17:36.409 [INFO][4938] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.105.67/32] ContainerID="3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" Namespace="calico-system" Pod="calico-kube-controllers-564f5997bc-jhx5b" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--kube--controllers--564f5997bc--jhx5b-eth0" Mar 25 01:17:36.476089 containerd[1968]: 2025-03-25 01:17:36.409 [INFO][4938] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali76c03659801 ContainerID="3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" Namespace="calico-system" Pod="calico-kube-controllers-564f5997bc-jhx5b" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--kube--controllers--564f5997bc--jhx5b-eth0" Mar 25 01:17:36.476089 containerd[1968]: 2025-03-25 01:17:36.415 [INFO][4938] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" Namespace="calico-system" Pod="calico-kube-controllers-564f5997bc-jhx5b" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--kube--controllers--564f5997bc--jhx5b-eth0" Mar 25 01:17:36.477046 containerd[1968]: 2025-03-25 01:17:36.417 [INFO][4938] 
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" Namespace="calico-system" Pod="calico-kube-controllers-564f5997bc-jhx5b" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--kube--controllers--564f5997bc--jhx5b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--24--136-k8s-calico--kube--controllers--564f5997bc--jhx5b-eth0", GenerateName:"calico-kube-controllers-564f5997bc-", Namespace:"calico-system", SelfLink:"", UID:"57a1d8fb-50e7-4d25-911d-c21ea0dc269c", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 17, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"564f5997bc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-24-136", ContainerID:"3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9", Pod:"calico-kube-controllers-564f5997bc-jhx5b", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.105.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali76c03659801", MAC:"26:82:89:0f:d0:56", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:17:36.477202 containerd[1968]: 2025-03-25 01:17:36.440 [INFO][4938] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" Namespace="calico-system" Pod="calico-kube-controllers-564f5997bc-jhx5b" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--kube--controllers--564f5997bc--jhx5b-eth0" Mar 25 01:17:36.517097 systemd-networkd[1869]: cali730be3dbcb0: Gained IPv6LL Mar 25 01:17:36.567832 containerd[1968]: time="2025-03-25T01:17:36.567749166Z" level=info msg="connecting to shim 3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9" address="unix:///run/containerd/s/7559c58145f607100b57a1c0aaf1087b3c4b1f03ad9e824e9c6bc88ebf1e09c0" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:17:36.639345 systemd[1]: Started cri-containerd-3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9.scope - libcontainer container 3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9. Mar 25 01:17:36.671417 systemd[1]: Started sshd@9-172.31.24.136:22-147.75.109.163:36868.service - OpenSSH per-connection server daemon (147.75.109.163:36868). 
Mar 25 01:17:36.801283 containerd[1968]: time="2025-03-25T01:17:36.801101360Z" level=info msg="StartContainer for \"6f9cf5bc87bdd18236543be564f0167bfcd41d4aa8c76cf96982f540df486a34\" returns successfully" Mar 25 01:17:36.831469 containerd[1968]: time="2025-03-25T01:17:36.830828573Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33f7ea9b63729d227bc50bfa69cc02395af443fa4b77c30c48e4fb609d88959a\" id:\"37d0d83a5d5add004f0a3305ff3a11aef7d7f31f9de51d5f258e323ccdf81811\" pid:4998 exited_at:{seconds:1742865456 nanos:830328807}" Mar 25 01:17:36.869952 containerd[1968]: time="2025-03-25T01:17:36.868928330Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-564f5997bc-jhx5b,Uid:57a1d8fb-50e7-4d25-911d-c21ea0dc269c,Namespace:calico-system,Attempt:0,} returns sandbox id \"3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9\"" Mar 25 01:17:36.926364 sshd[5044]: Accepted publickey for core from 147.75.109.163 port 36868 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:17:36.929773 sshd-session[5044]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:17:36.940571 systemd-logind[1939]: New session 10 of user core. Mar 25 01:17:36.947538 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 25 01:17:36.991660 containerd[1968]: time="2025-03-25T01:17:36.991533721Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33f7ea9b63729d227bc50bfa69cc02395af443fa4b77c30c48e4fb609d88959a\" id:\"5904e11a49ae06f5bc76357702c5a7e0e7e16a30defa66a25038d22b829f0d7c\" pid:5086 exited_at:{seconds:1742865456 nanos:991162159}" Mar 25 01:17:37.067353 containerd[1968]: time="2025-03-25T01:17:37.066596167Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f598cd48-7hrsb,Uid:d36c084d-ec62-4d75-89f3-585719967708,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:17:37.070064 containerd[1968]: time="2025-03-25T01:17:37.069715974Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-l2gdf,Uid:3d0614c7-b26c-46a0-939e-ae9b1979b871,Namespace:kube-system,Attempt:0,}" Mar 25 01:17:37.319462 sshd[5096]: Connection closed by 147.75.109.163 port 36868 Mar 25 01:17:37.322243 sshd-session[5044]: pam_unix(sshd:session): session closed for user core Mar 25 01:17:37.332405 systemd[1]: sshd@9-172.31.24.136:22-147.75.109.163:36868.service: Deactivated successfully. Mar 25 01:17:37.343435 systemd[1]: session-10.scope: Deactivated successfully. Mar 25 01:17:37.348018 systemd-logind[1939]: Session 10 logged out. Waiting for processes to exit. Mar 25 01:17:37.375932 systemd[1]: Started sshd@10-172.31.24.136:22-147.75.109.163:36874.service - OpenSSH per-connection server daemon (147.75.109.163:36874). Mar 25 01:17:37.378197 systemd-logind[1939]: Removed session 10. 
Mar 25 01:17:37.483156 systemd-networkd[1869]: cali4f79511a805: Link UP Mar 25 01:17:37.486652 systemd-networkd[1869]: cali4f79511a805: Gained carrier Mar 25 01:17:37.546503 containerd[1968]: 2025-03-25 01:17:37.245 [INFO][5110] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--24--136-k8s-coredns--7db6d8ff4d--l2gdf-eth0 coredns-7db6d8ff4d- kube-system 3d0614c7-b26c-46a0-939e-ae9b1979b871 729 0 2025-03-25 01:17:00 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-24-136 coredns-7db6d8ff4d-l2gdf eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4f79511a805 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-l2gdf" WorkloadEndpoint="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--l2gdf-" Mar 25 01:17:37.546503 containerd[1968]: 2025-03-25 01:17:37.247 [INFO][5110] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-l2gdf" WorkloadEndpoint="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--l2gdf-eth0" Mar 25 01:17:37.546503 containerd[1968]: 2025-03-25 01:17:37.386 [INFO][5134] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" HandleID="k8s-pod-network.3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" Workload="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--l2gdf-eth0" Mar 25 01:17:37.546942 containerd[1968]: 2025-03-25 01:17:37.410 [INFO][5134] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" HandleID="k8s-pod-network.3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" Workload="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--l2gdf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000281df0), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-24-136", "pod":"coredns-7db6d8ff4d-l2gdf", "timestamp":"2025-03-25 01:17:37.386111362 +0000 UTC"}, Hostname:"ip-172-31-24-136", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:17:37.546942 containerd[1968]: 2025-03-25 01:17:37.410 [INFO][5134] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:17:37.546942 containerd[1968]: 2025-03-25 01:17:37.410 [INFO][5134] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:17:37.546942 containerd[1968]: 2025-03-25 01:17:37.410 [INFO][5134] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-24-136' Mar 25 01:17:37.546942 containerd[1968]: 2025-03-25 01:17:37.412 [INFO][5134] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" host="ip-172-31-24-136" Mar 25 01:17:37.546942 containerd[1968]: 2025-03-25 01:17:37.421 [INFO][5134] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-24-136" Mar 25 01:17:37.546942 containerd[1968]: 2025-03-25 01:17:37.434 [INFO][5134] ipam/ipam.go 489: Trying affinity for 192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:37.546942 containerd[1968]: 2025-03-25 01:17:37.439 [INFO][5134] ipam/ipam.go 155: Attempting to load block cidr=192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:37.546942 containerd[1968]: 2025-03-25 01:17:37.445 [INFO][5134] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:37.550391 containerd[1968]: 2025-03-25 01:17:37.445 [INFO][5134] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.105.64/26 handle="k8s-pod-network.3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" host="ip-172-31-24-136" Mar 25 01:17:37.550391 containerd[1968]: 2025-03-25 01:17:37.453 [INFO][5134] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d Mar 25 01:17:37.550391 containerd[1968]: 2025-03-25 01:17:37.463 [INFO][5134] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.105.64/26 handle="k8s-pod-network.3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" host="ip-172-31-24-136" Mar 25 01:17:37.550391 containerd[1968]: 2025-03-25 01:17:37.472 [INFO][5134] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.105.68/26] block=192.168.105.64/26 handle="k8s-pod-network.3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" host="ip-172-31-24-136" Mar 25 01:17:37.550391 containerd[1968]: 2025-03-25 01:17:37.472 [INFO][5134] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.105.68/26] handle="k8s-pod-network.3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" host="ip-172-31-24-136" Mar 25 01:17:37.550391 containerd[1968]: 2025-03-25 01:17:37.472 [INFO][5134] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:17:37.550391 containerd[1968]: 2025-03-25 01:17:37.472 [INFO][5134] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.68/26] IPv6=[] ContainerID="3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" HandleID="k8s-pod-network.3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" Workload="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--l2gdf-eth0" Mar 25 01:17:37.550872 containerd[1968]: 2025-03-25 01:17:37.477 [INFO][5110] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-l2gdf" WorkloadEndpoint="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--l2gdf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--24--136-k8s-coredns--7db6d8ff4d--l2gdf-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"3d0614c7-b26c-46a0-939e-ae9b1979b871", ResourceVersion:"729", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 17, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-24-136", ContainerID:"", Pod:"coredns-7db6d8ff4d-l2gdf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.105.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4f79511a805", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:17:37.551027 containerd[1968]: 2025-03-25 01:17:37.478 [INFO][5110] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.105.68/32] ContainerID="3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-l2gdf" WorkloadEndpoint="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--l2gdf-eth0" Mar 25 01:17:37.551027 containerd[1968]: 2025-03-25 01:17:37.478 [INFO][5110] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4f79511a805 ContainerID="3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-l2gdf" WorkloadEndpoint="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--l2gdf-eth0" Mar 25 01:17:37.551027 containerd[1968]: 2025-03-25 01:17:37.487 [INFO][5110] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-l2gdf" 
WorkloadEndpoint="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--l2gdf-eth0" Mar 25 01:17:37.551188 containerd[1968]: 2025-03-25 01:17:37.488 [INFO][5110] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-l2gdf" WorkloadEndpoint="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--l2gdf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--24--136-k8s-coredns--7db6d8ff4d--l2gdf-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"3d0614c7-b26c-46a0-939e-ae9b1979b871", ResourceVersion:"729", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 17, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-24-136", ContainerID:"3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d", Pod:"coredns-7db6d8ff4d-l2gdf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.105.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4f79511a805", MAC:"ea:d1:e5:c4:f2:a7", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:17:37.551188 containerd[1968]: 2025-03-25 01:17:37.539 [INFO][5110] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-l2gdf" WorkloadEndpoint="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--l2gdf-eth0" Mar 25 01:17:37.599077 systemd-networkd[1869]: cali32091b6f5c4: Link UP Mar 25 01:17:37.599489 systemd-networkd[1869]: cali32091b6f5c4: Gained carrier Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.269 [INFO][5107] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--7hrsb-eth0 calico-apiserver-5f598cd48- calico-apiserver d36c084d-ec62-4d75-89f3-585719967708 725 0 2025-03-25 01:17:10 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5f598cd48 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-24-136 calico-apiserver-5f598cd48-7hrsb eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] 
cali32091b6f5c4 [] []}} ContainerID="5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" Namespace="calico-apiserver" Pod="calico-apiserver-5f598cd48-7hrsb" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--7hrsb-" Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.269 [INFO][5107] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" Namespace="calico-apiserver" Pod="calico-apiserver-5f598cd48-7hrsb" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--7hrsb-eth0" Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.415 [INFO][5139] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" HandleID="k8s-pod-network.5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" Workload="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--7hrsb-eth0" Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.459 [INFO][5139] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" HandleID="k8s-pod-network.5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" Workload="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--7hrsb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40005866f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-24-136", "pod":"calico-apiserver-5f598cd48-7hrsb", "timestamp":"2025-03-25 01:17:37.415314738 +0000 UTC"}, Hostname:"ip-172-31-24-136", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.459 [INFO][5139] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.472 [INFO][5139] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.473 [INFO][5139] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-24-136' Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.477 [INFO][5139] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" host="ip-172-31-24-136" Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.500 [INFO][5139] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-24-136" Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.517 [INFO][5139] ipam/ipam.go 489: Trying affinity for 192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.538 [INFO][5139] ipam/ipam.go 155: Attempting to load block cidr=192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.543 [INFO][5139] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.543 [INFO][5139] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.105.64/26 handle="k8s-pod-network.5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" host="ip-172-31-24-136" Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.552 [INFO][5139] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967 Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.569 [INFO][5139] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.105.64/26 handle="k8s-pod-network.5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" host="ip-172-31-24-136" Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.583 [INFO][5139] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.105.69/26] block=192.168.105.64/26 handle="k8s-pod-network.5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" host="ip-172-31-24-136" Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.584 [INFO][5139] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.105.69/26] handle="k8s-pod-network.5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" host="ip-172-31-24-136" Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.584 [INFO][5139] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:17:37.649889 containerd[1968]: 2025-03-25 01:17:37.584 [INFO][5139] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.69/26] IPv6=[] ContainerID="5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" HandleID="k8s-pod-network.5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" Workload="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--7hrsb-eth0" Mar 25 01:17:37.652004 containerd[1968]: 2025-03-25 01:17:37.591 [INFO][5107] cni-plugin/k8s.go 386: Populated endpoint ContainerID="5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" Namespace="calico-apiserver" Pod="calico-apiserver-5f598cd48-7hrsb" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--7hrsb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--7hrsb-eth0", GenerateName:"calico-apiserver-5f598cd48-", Namespace:"calico-apiserver", SelfLink:"", UID:"d36c084d-ec62-4d75-89f3-585719967708", ResourceVersion:"725", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 17, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f598cd48", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-24-136", ContainerID:"", Pod:"calico-apiserver-5f598cd48-7hrsb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.105.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali32091b6f5c4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:17:37.652004 containerd[1968]: 2025-03-25 01:17:37.591 [INFO][5107] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.105.69/32] ContainerID="5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" Namespace="calico-apiserver" Pod="calico-apiserver-5f598cd48-7hrsb" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--7hrsb-eth0" Mar 25 01:17:37.652004 containerd[1968]: 2025-03-25 01:17:37.591 [INFO][5107] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali32091b6f5c4 ContainerID="5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" Namespace="calico-apiserver" Pod="calico-apiserver-5f598cd48-7hrsb" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--7hrsb-eth0" Mar 25 01:17:37.652004 containerd[1968]: 2025-03-25 01:17:37.598 [INFO][5107] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" Namespace="calico-apiserver" Pod="calico-apiserver-5f598cd48-7hrsb" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--7hrsb-eth0" Mar 25 01:17:37.652004 containerd[1968]: 2025-03-25 01:17:37.601 [INFO][5107] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" Namespace="calico-apiserver" Pod="calico-apiserver-5f598cd48-7hrsb" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--7hrsb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--7hrsb-eth0", GenerateName:"calico-apiserver-5f598cd48-", Namespace:"calico-apiserver", SelfLink:"", UID:"d36c084d-ec62-4d75-89f3-585719967708", ResourceVersion:"725", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 17, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5f598cd48", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-24-136", ContainerID:"5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967", Pod:"calico-apiserver-5f598cd48-7hrsb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.105.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali32091b6f5c4", MAC:"36:04:b3:6c:69:e4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:17:37.652004 containerd[1968]: 2025-03-25 01:17:37.643 [INFO][5107] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" Namespace="calico-apiserver" Pod="calico-apiserver-5f598cd48-7hrsb" WorkloadEndpoint="ip--172--31--24--136-k8s-calico--apiserver--5f598cd48--7hrsb-eth0" Mar 25 01:17:37.660000 containerd[1968]: time="2025-03-25T01:17:37.659730197Z" level=info msg="connecting to shim 3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d" address="unix:///run/containerd/s/cadfad4a27ad591ac5904a6c18cf9c858fb979728a8b9b0bc3719062d399466d" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:17:37.666653 sshd[5151]: Accepted publickey for core from 147.75.109.163 port 36874 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:17:37.671319 sshd-session[5151]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:17:37.695074 systemd-logind[1939]: New session 11 of user core. Mar 25 01:17:37.701726 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 25 01:17:37.766720 systemd[1]: Started cri-containerd-3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d.scope - libcontainer container 3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d. 
Mar 25 01:17:37.790685 containerd[1968]: time="2025-03-25T01:17:37.789128126Z" level=info msg="connecting to shim 5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967" address="unix:///run/containerd/s/34ad9d3aefe1d89e2ff2c29146094b9a1f9f7a732ebeaacccd14737c6f7e8aef" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:17:37.860520 systemd-networkd[1869]: cali76c03659801: Gained IPv6LL Mar 25 01:17:37.895429 systemd[1]: Started cri-containerd-5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967.scope - libcontainer container 5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967. Mar 25 01:17:37.988533 containerd[1968]: time="2025-03-25T01:17:37.988280437Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-l2gdf,Uid:3d0614c7-b26c-46a0-939e-ae9b1979b871,Namespace:kube-system,Attempt:0,} returns sandbox id \"3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d\"" Mar 25 01:17:38.010773 containerd[1968]: time="2025-03-25T01:17:38.010713451Z" level=info msg="CreateContainer within sandbox \"3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:17:38.059317 containerd[1968]: time="2025-03-25T01:17:38.059134381Z" level=info msg="Container e8e50a5cd4968e7c3656035f17c59c91d664a285acfdb7ae4e6e814428060621: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:17:38.068239 containerd[1968]: time="2025-03-25T01:17:38.067896285Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnw4t,Uid:3f4fe771-5062-4ca9-b1b0-866a84094468,Namespace:kube-system,Attempt:0,}" Mar 25 01:17:38.096929 containerd[1968]: time="2025-03-25T01:17:38.094524381Z" level=info msg="CreateContainer within sandbox \"3c44b6b3ff8e73571a7ec68fb4b4e997afcff93227db74dd0ba1479e3f3a1d1d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e8e50a5cd4968e7c3656035f17c59c91d664a285acfdb7ae4e6e814428060621\"" Mar 25 01:17:38.111978 containerd[1968]: time="2025-03-25T01:17:38.111815423Z" level=info msg="StartContainer for \"e8e50a5cd4968e7c3656035f17c59c91d664a285acfdb7ae4e6e814428060621\"" Mar 25 01:17:38.119180 containerd[1968]: time="2025-03-25T01:17:38.119109610Z" level=info msg="connecting to shim e8e50a5cd4968e7c3656035f17c59c91d664a285acfdb7ae4e6e814428060621" address="unix:///run/containerd/s/cadfad4a27ad591ac5904a6c18cf9c858fb979728a8b9b0bc3719062d399466d" protocol=ttrpc version=3 Mar 25 01:17:38.222351 containerd[1968]: time="2025-03-25T01:17:38.222278820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5f598cd48-7hrsb,Uid:d36c084d-ec62-4d75-89f3-585719967708,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967\"" Mar 25 01:17:38.271568 systemd[1]: Started cri-containerd-e8e50a5cd4968e7c3656035f17c59c91d664a285acfdb7ae4e6e814428060621.scope - libcontainer container e8e50a5cd4968e7c3656035f17c59c91d664a285acfdb7ae4e6e814428060621. Mar 25 01:17:38.305074 sshd[5212]: Connection closed by 147.75.109.163 port 36874 Mar 25 01:17:38.306772 sshd-session[5151]: pam_unix(sshd:session): session closed for user core Mar 25 01:17:38.319911 systemd[1]: sshd@10-172.31.24.136:22-147.75.109.163:36874.service: Deactivated successfully. Mar 25 01:17:38.330591 systemd[1]: session-11.scope: Deactivated successfully. Mar 25 01:17:38.360401 systemd-logind[1939]: Session 11 logged out. Waiting for processes to exit. 
Mar 25 01:17:38.365439 systemd[1]: Started sshd@11-172.31.24.136:22-147.75.109.163:36880.service - OpenSSH per-connection server daemon (147.75.109.163:36880). Mar 25 01:17:38.371663 systemd-logind[1939]: Removed session 11. Mar 25 01:17:38.482195 containerd[1968]: time="2025-03-25T01:17:38.481490690Z" level=info msg="StartContainer for \"e8e50a5cd4968e7c3656035f17c59c91d664a285acfdb7ae4e6e814428060621\" returns successfully" Mar 25 01:17:38.608408 systemd-networkd[1869]: cali377f5d5bcda: Link UP Mar 25 01:17:38.611008 systemd-networkd[1869]: cali377f5d5bcda: Gained carrier Mar 25 01:17:38.618549 sshd[5318]: Accepted publickey for core from 147.75.109.163 port 36880 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:17:38.625764 sshd-session[5318]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:17:38.650520 systemd-logind[1939]: New session 12 of user core. Mar 25 01:17:38.658549 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.352 [INFO][5282] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--24--136-k8s-coredns--7db6d8ff4d--jnw4t-eth0 coredns-7db6d8ff4d- kube-system 3f4fe771-5062-4ca9-b1b0-866a84094468 727 0 2025-03-25 01:17:00 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-24-136 coredns-7db6d8ff4d-jnw4t eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali377f5d5bcda [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jnw4t" WorkloadEndpoint="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--jnw4t-" Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.352 [INFO][5282] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jnw4t" WorkloadEndpoint="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--jnw4t-eth0" Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.497 [INFO][5321] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" HandleID="k8s-pod-network.3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" Workload="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--jnw4t-eth0" Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.526 [INFO][5321] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" HandleID="k8s-pod-network.3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" Workload="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--jnw4t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400041bd30), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-24-136", "pod":"coredns-7db6d8ff4d-jnw4t", "timestamp":"2025-03-25 01:17:38.497895052 +0000 UTC"}, Hostname:"ip-172-31-24-136", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:17:38.665387 containerd[1968]: 
2025-03-25 01:17:38.526 [INFO][5321] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.526 [INFO][5321] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.526 [INFO][5321] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-24-136' Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.533 [INFO][5321] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" host="ip-172-31-24-136" Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.543 [INFO][5321] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-24-136" Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.554 [INFO][5321] ipam/ipam.go 489: Trying affinity for 192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.559 [INFO][5321] ipam/ipam.go 155: Attempting to load block cidr=192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.565 [INFO][5321] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.105.64/26 host="ip-172-31-24-136" Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.565 [INFO][5321] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.105.64/26 handle="k8s-pod-network.3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" host="ip-172-31-24-136" Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.569 [INFO][5321] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5 Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.578 [INFO][5321] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.105.64/26 handle="k8s-pod-network.3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" host="ip-172-31-24-136" Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.594 [INFO][5321] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.105.70/26] block=192.168.105.64/26 handle="k8s-pod-network.3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" host="ip-172-31-24-136" Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.594 [INFO][5321] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.105.70/26] handle="k8s-pod-network.3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" host="ip-172-31-24-136" Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.594 [INFO][5321] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:17:38.665387 containerd[1968]: 2025-03-25 01:17:38.594 [INFO][5321] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.70/26] IPv6=[] ContainerID="3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" HandleID="k8s-pod-network.3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" Workload="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--jnw4t-eth0" Mar 25 01:17:38.667600 containerd[1968]: 2025-03-25 01:17:38.598 [INFO][5282] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jnw4t" WorkloadEndpoint="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--jnw4t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--24--136-k8s-coredns--7db6d8ff4d--jnw4t-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"3f4fe771-5062-4ca9-b1b0-866a84094468", ResourceVersion:"727", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 17, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-24-136", ContainerID:"", Pod:"coredns-7db6d8ff4d-jnw4t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.105.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali377f5d5bcda", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:17:38.667600 containerd[1968]: 2025-03-25 01:17:38.598 [INFO][5282] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.105.70/32] ContainerID="3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jnw4t" WorkloadEndpoint="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--jnw4t-eth0" Mar 25 01:17:38.667600 containerd[1968]: 2025-03-25 01:17:38.598 [INFO][5282] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali377f5d5bcda ContainerID="3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jnw4t" WorkloadEndpoint="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--jnw4t-eth0" Mar 25 01:17:38.667600 containerd[1968]: 2025-03-25 01:17:38.614 [INFO][5282] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jnw4t" 
WorkloadEndpoint="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--jnw4t-eth0" Mar 25 01:17:38.667600 containerd[1968]: 2025-03-25 01:17:38.617 [INFO][5282] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jnw4t" WorkloadEndpoint="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--jnw4t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--24--136-k8s-coredns--7db6d8ff4d--jnw4t-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"3f4fe771-5062-4ca9-b1b0-866a84094468", ResourceVersion:"727", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 17, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-24-136", ContainerID:"3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5", Pod:"coredns-7db6d8ff4d-jnw4t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.105.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali377f5d5bcda", MAC:"26:67:f9:d2:22:2a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:17:38.667600 containerd[1968]: 2025-03-25 01:17:38.659 [INFO][5282] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jnw4t" WorkloadEndpoint="ip--172--31--24--136-k8s-coredns--7db6d8ff4d--jnw4t-eth0" Mar 25 01:17:38.735255 containerd[1968]: time="2025-03-25T01:17:38.734951773Z" level=info msg="connecting to shim 3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5" address="unix:///run/containerd/s/c599d84533e2fa8bdfb6fa2dbdc044334b0a03e07bea955a7bfe5a459c0ff60d" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:17:38.755518 systemd-networkd[1869]: cali4f79511a805: Gained IPv6LL Mar 25 01:17:38.822551 systemd[1]: Started cri-containerd-3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5.scope - libcontainer container 3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5. 
Mar 25 01:17:38.967017 containerd[1968]: time="2025-03-25T01:17:38.966888807Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnw4t,Uid:3f4fe771-5062-4ca9-b1b0-866a84094468,Namespace:kube-system,Attempt:0,} returns sandbox id \"3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5\"" Mar 25 01:17:38.978631 containerd[1968]: time="2025-03-25T01:17:38.978167845Z" level=info msg="CreateContainer within sandbox \"3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:17:39.001763 containerd[1968]: time="2025-03-25T01:17:39.000676169Z" level=info msg="Container 4616017ca2a3686219a87fe0f0b83007ac4b63d08e5cdfc15e5cb1fde1e86b02: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:17:39.017547 containerd[1968]: time="2025-03-25T01:17:39.017496279Z" level=info msg="CreateContainer within sandbox \"3953857421a11616f5b91ae7a5801e426f08badf8d81ad3e6c75238955dd5ce5\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4616017ca2a3686219a87fe0f0b83007ac4b63d08e5cdfc15e5cb1fde1e86b02\"" Mar 25 01:17:39.019012 containerd[1968]: time="2025-03-25T01:17:39.018942563Z" level=info msg="StartContainer for \"4616017ca2a3686219a87fe0f0b83007ac4b63d08e5cdfc15e5cb1fde1e86b02\"" Mar 25 01:17:39.021374 containerd[1968]: time="2025-03-25T01:17:39.021319318Z" level=info msg="connecting to shim 4616017ca2a3686219a87fe0f0b83007ac4b63d08e5cdfc15e5cb1fde1e86b02" address="unix:///run/containerd/s/c599d84533e2fa8bdfb6fa2dbdc044334b0a03e07bea955a7bfe5a459c0ff60d" protocol=ttrpc version=3 Mar 25 01:17:39.036827 sshd[5345]: Connection closed by 147.75.109.163 port 36880 Mar 25 01:17:39.037786 sshd-session[5318]: pam_unix(sshd:session): session closed for user core Mar 25 01:17:39.046383 systemd[1]: sshd@11-172.31.24.136:22-147.75.109.163:36880.service: Deactivated successfully. Mar 25 01:17:39.051968 systemd[1]: session-12.scope: Deactivated successfully. Mar 25 01:17:39.055981 systemd-logind[1939]: Session 12 logged out. Waiting for processes to exit. Mar 25 01:17:39.070580 systemd[1]: Started cri-containerd-4616017ca2a3686219a87fe0f0b83007ac4b63d08e5cdfc15e5cb1fde1e86b02.scope - libcontainer container 4616017ca2a3686219a87fe0f0b83007ac4b63d08e5cdfc15e5cb1fde1e86b02. Mar 25 01:17:39.071855 systemd-logind[1939]: Removed session 12. 
Mar 25 01:17:39.155806 containerd[1968]: time="2025-03-25T01:17:39.155596611Z" level=info msg="StartContainer for \"4616017ca2a3686219a87fe0f0b83007ac4b63d08e5cdfc15e5cb1fde1e86b02\" returns successfully" Mar 25 01:17:39.464150 kubelet[3335]: I0325 01:17:39.459782 3335 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-jnw4t" podStartSLOduration=39.459758934 podStartE2EDuration="39.459758934s" podCreationTimestamp="2025-03-25 01:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:17:39.459464314 +0000 UTC m=+54.640895037" watchObservedRunningTime="2025-03-25 01:17:39.459758934 +0000 UTC m=+54.641189622" Mar 25 01:17:39.523961 systemd-networkd[1869]: cali32091b6f5c4: Gained IPv6LL Mar 25 01:17:39.554879 kubelet[3335]: I0325 01:17:39.552947 3335 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-l2gdf" podStartSLOduration=39.552922749 podStartE2EDuration="39.552922749s" podCreationTimestamp="2025-03-25 01:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:17:39.552091901 +0000 UTC m=+54.733522565" watchObservedRunningTime="2025-03-25 01:17:39.552922749 +0000 UTC m=+54.734353413" Mar 25 01:17:40.355548 systemd-networkd[1869]: cali377f5d5bcda: Gained IPv6LL Mar 25 01:17:40.403130 containerd[1968]: time="2025-03-25T01:17:40.402598221Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:40.404995 containerd[1968]: time="2025-03-25T01:17:40.404902173Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=40253267" Mar 25 01:17:40.407655 containerd[1968]: time="2025-03-25T01:17:40.407545794Z" level=info msg="ImageCreate event name:\"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:40.412008 containerd[1968]: time="2025-03-25T01:17:40.411907569Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:40.413718 containerd[1968]: time="2025-03-25T01:17:40.413154009Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 4.133627045s" Mar 25 01:17:40.413718 containerd[1968]: time="2025-03-25T01:17:40.413210345Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 25 01:17:40.416581 containerd[1968]: time="2025-03-25T01:17:40.415588011Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 25 01:17:40.418798 containerd[1968]: time="2025-03-25T01:17:40.417466728Z" level=info msg="CreateContainer within sandbox \"7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998\" for container 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 01:17:40.439327 containerd[1968]: time="2025-03-25T01:17:40.436601404Z" level=info msg="Container 4caec0037c4541b6c3bc98389e38d641310f92525298bd7fd0626847717bc671: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:17:40.460309 containerd[1968]: time="2025-03-25T01:17:40.459998711Z" level=info msg="CreateContainer within sandbox \"7a8173d1620ee3bf8f3fc48ae34aca1910561e70b13aab9691d180a4f08c2998\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4caec0037c4541b6c3bc98389e38d641310f92525298bd7fd0626847717bc671\"" Mar 25 01:17:40.463250 containerd[1968]: time="2025-03-25T01:17:40.463185661Z" level=info msg="StartContainer for \"4caec0037c4541b6c3bc98389e38d641310f92525298bd7fd0626847717bc671\"" Mar 25 01:17:40.466212 containerd[1968]: time="2025-03-25T01:17:40.466136892Z" level=info msg="connecting to shim 4caec0037c4541b6c3bc98389e38d641310f92525298bd7fd0626847717bc671" address="unix:///run/containerd/s/737d753615999a91fa029fb578ae0d6d902f315b5323a1a374023ddcf498a934" protocol=ttrpc version=3 Mar 25 01:17:40.506552 systemd[1]: Started cri-containerd-4caec0037c4541b6c3bc98389e38d641310f92525298bd7fd0626847717bc671.scope - libcontainer container 4caec0037c4541b6c3bc98389e38d641310f92525298bd7fd0626847717bc671. Mar 25 01:17:40.604574 containerd[1968]: time="2025-03-25T01:17:40.604511803Z" level=info msg="StartContainer for \"4caec0037c4541b6c3bc98389e38d641310f92525298bd7fd0626847717bc671\" returns successfully" Mar 25 01:17:41.485793 kubelet[3335]: I0325 01:17:41.485463 3335 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5f598cd48-9mcjt" podStartSLOduration=25.863334095 podStartE2EDuration="31.485415885s" podCreationTimestamp="2025-03-25 01:17:10 +0000 UTC" firstStartedPulling="2025-03-25 01:17:34.792691229 +0000 UTC m=+49.974121881" lastFinishedPulling="2025-03-25 01:17:40.414773007 +0000 UTC m=+55.596203671" observedRunningTime="2025-03-25 01:17:41.482823298 +0000 UTC m=+56.664253986" watchObservedRunningTime="2025-03-25 01:17:41.485415885 +0000 UTC m=+56.666846549" Mar 25 01:17:42.155890 containerd[1968]: time="2025-03-25T01:17:42.155808043Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:42.158271 containerd[1968]: time="2025-03-25T01:17:42.158162957Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13121717" Mar 25 01:17:42.163469 containerd[1968]: time="2025-03-25T01:17:42.161218093Z" level=info msg="ImageCreate event name:\"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:42.168457 containerd[1968]: time="2025-03-25T01:17:42.168331722Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:42.171167 containerd[1968]: time="2025-03-25T01:17:42.171064255Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest 
\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"14491426\" in 1.754751182s" Mar 25 01:17:42.171167 containerd[1968]: time="2025-03-25T01:17:42.171153971Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\"" Mar 25 01:17:42.174109 containerd[1968]: time="2025-03-25T01:17:42.174043613Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 25 01:17:42.177312 containerd[1968]: time="2025-03-25T01:17:42.176631738Z" level=info msg="CreateContainer within sandbox \"7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 25 01:17:42.203531 containerd[1968]: time="2025-03-25T01:17:42.200658551Z" level=info msg="Container 81064306742758b2daae2a9544e9e8ffe50ae2b10b2398341345b322387c3a47: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:17:42.232181 containerd[1968]: time="2025-03-25T01:17:42.231694657Z" level=info msg="CreateContainer within sandbox \"7e5b654e8cc7539b1cd25e66d9a5d367a8465e89903ba6206d91f05484570a1c\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"81064306742758b2daae2a9544e9e8ffe50ae2b10b2398341345b322387c3a47\"" Mar 25 01:17:42.233465 containerd[1968]: time="2025-03-25T01:17:42.233396642Z" level=info msg="StartContainer for \"81064306742758b2daae2a9544e9e8ffe50ae2b10b2398341345b322387c3a47\"" Mar 25 01:17:42.236371 containerd[1968]: time="2025-03-25T01:17:42.236304911Z" level=info msg="connecting to shim 81064306742758b2daae2a9544e9e8ffe50ae2b10b2398341345b322387c3a47" address="unix:///run/containerd/s/79fcdcd104765b9ebc545f07f6cd785a05c20f8909d9b0e0bcb54500d979ad7a" protocol=ttrpc version=3 Mar 25 01:17:42.285347 systemd[1]: Started cri-containerd-81064306742758b2daae2a9544e9e8ffe50ae2b10b2398341345b322387c3a47.scope - libcontainer container 81064306742758b2daae2a9544e9e8ffe50ae2b10b2398341345b322387c3a47. 
Mar 25 01:17:42.440786 containerd[1968]: time="2025-03-25T01:17:42.440617784Z" level=info msg="StartContainer for \"81064306742758b2daae2a9544e9e8ffe50ae2b10b2398341345b322387c3a47\" returns successfully" Mar 25 01:17:42.474252 kubelet[3335]: I0325 01:17:42.472140 3335 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:17:42.506705 kubelet[3335]: I0325 01:17:42.506610 3335 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-rcpm6" podStartSLOduration=23.117600283 podStartE2EDuration="30.506565066s" podCreationTimestamp="2025-03-25 01:17:12 +0000 UTC" firstStartedPulling="2025-03-25 01:17:34.784115304 +0000 UTC m=+49.965545956" lastFinishedPulling="2025-03-25 01:17:42.173080075 +0000 UTC m=+57.354510739" observedRunningTime="2025-03-25 01:17:42.504893282 +0000 UTC m=+57.686323970" watchObservedRunningTime="2025-03-25 01:17:42.506565066 +0000 UTC m=+57.687995718" Mar 25 01:17:43.238870 kubelet[3335]: I0325 01:17:43.238724 3335 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 25 01:17:43.238870 kubelet[3335]: I0325 01:17:43.238773 3335 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 25 01:17:44.072323 systemd[1]: Started sshd@12-172.31.24.136:22-147.75.109.163:44198.service - OpenSSH per-connection server daemon (147.75.109.163:44198). Mar 25 01:17:44.286037 sshd[5544]: Accepted publickey for core from 147.75.109.163 port 44198 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:17:44.289168 sshd-session[5544]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:17:44.301544 systemd-logind[1939]: New session 13 of user core. Mar 25 01:17:44.309604 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 25 01:17:44.805909 sshd[5548]: Connection closed by 147.75.109.163 port 44198 Mar 25 01:17:44.807491 sshd-session[5544]: pam_unix(sshd:session): session closed for user core Mar 25 01:17:44.818488 systemd[1]: session-13.scope: Deactivated successfully. Mar 25 01:17:44.820940 systemd[1]: sshd@12-172.31.24.136:22-147.75.109.163:44198.service: Deactivated successfully. Mar 25 01:17:44.837307 systemd-logind[1939]: Session 13 logged out. Waiting for processes to exit. Mar 25 01:17:44.841253 systemd-logind[1939]: Removed session 13. 
Mar 25 01:17:45.160258 ntpd[1931]: Listen normally on 8 vxlan.calico 192.168.105.64:123 Mar 25 01:17:45.162730 ntpd[1931]: 25 Mar 01:17:45 ntpd[1931]: Listen normally on 8 vxlan.calico 192.168.105.64:123 Mar 25 01:17:45.162730 ntpd[1931]: 25 Mar 01:17:45 ntpd[1931]: Listen normally on 9 vxlan.calico [fe80::64bc:7fff:fe49:2884%4]:123 Mar 25 01:17:45.162730 ntpd[1931]: 25 Mar 01:17:45 ntpd[1931]: Listen normally on 10 calic2adb87bd1a [fe80::ecee:eeff:feee:eeee%7]:123 Mar 25 01:17:45.162730 ntpd[1931]: 25 Mar 01:17:45 ntpd[1931]: Listen normally on 11 cali730be3dbcb0 [fe80::ecee:eeff:feee:eeee%8]:123 Mar 25 01:17:45.162730 ntpd[1931]: 25 Mar 01:17:45 ntpd[1931]: Listen normally on 12 cali76c03659801 [fe80::ecee:eeff:feee:eeee%9]:123 Mar 25 01:17:45.162730 ntpd[1931]: 25 Mar 01:17:45 ntpd[1931]: Listen normally on 13 cali4f79511a805 [fe80::ecee:eeff:feee:eeee%10]:123 Mar 25 01:17:45.162730 ntpd[1931]: 25 Mar 01:17:45 ntpd[1931]: Listen normally on 14 cali32091b6f5c4 [fe80::ecee:eeff:feee:eeee%11]:123 Mar 25 01:17:45.160412 ntpd[1931]: Listen normally on 9 vxlan.calico [fe80::64bc:7fff:fe49:2884%4]:123 Mar 25 01:17:45.160504 ntpd[1931]: Listen normally on 10 calic2adb87bd1a [fe80::ecee:eeff:feee:eeee%7]:123 Mar 25 01:17:45.161830 ntpd[1931]: Listen normally on 11 cali730be3dbcb0 [fe80::ecee:eeff:feee:eeee%8]:123 Mar 25 01:17:45.161919 ntpd[1931]: Listen normally on 12 cali76c03659801 [fe80::ecee:eeff:feee:eeee%9]:123 Mar 25 01:17:45.161990 ntpd[1931]: Listen normally on 13 cali4f79511a805 [fe80::ecee:eeff:feee:eeee%10]:123 Mar 25 01:17:45.162056 ntpd[1931]: Listen normally on 14 cali32091b6f5c4 [fe80::ecee:eeff:feee:eeee%11]:123 Mar 25 01:17:45.164831 ntpd[1931]: 25 Mar 01:17:45 ntpd[1931]: Listen normally on 15 cali377f5d5bcda [fe80::ecee:eeff:feee:eeee%12]:123 Mar 25 01:17:45.163662 ntpd[1931]: Listen normally on 15 cali377f5d5bcda [fe80::ecee:eeff:feee:eeee%12]:123 Mar 25 01:17:45.790311 containerd[1968]: time="2025-03-25T01:17:45.789934210Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:45.792369 containerd[1968]: time="2025-03-25T01:17:45.792275511Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=32560257" Mar 25 01:17:45.794943 containerd[1968]: time="2025-03-25T01:17:45.794901249Z" level=info msg="ImageCreate event name:\"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:45.800001 containerd[1968]: time="2025-03-25T01:17:45.799898597Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:45.802424 containerd[1968]: time="2025-03-25T01:17:45.801703035Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"33929982\" in 3.627596334s" Mar 25 01:17:45.802424 containerd[1968]: time="2025-03-25T01:17:45.801804924Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference 
\"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\"" Mar 25 01:17:45.803972 containerd[1968]: time="2025-03-25T01:17:45.803620839Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 01:17:45.827206 containerd[1968]: time="2025-03-25T01:17:45.826861265Z" level=info msg="CreateContainer within sandbox \"3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 25 01:17:45.847118 containerd[1968]: time="2025-03-25T01:17:45.847017470Z" level=info msg="Container 5cd35c498d2ba977c60df50b56006369a787c09266e14ec18b33288fb48f4e2c: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:17:45.864637 containerd[1968]: time="2025-03-25T01:17:45.864571793Z" level=info msg="CreateContainer within sandbox \"3be9326d82a099cde4b24e98d833985f85af10f97183c258d5eb68c6b65879c9\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"5cd35c498d2ba977c60df50b56006369a787c09266e14ec18b33288fb48f4e2c\"" Mar 25 01:17:45.865820 containerd[1968]: time="2025-03-25T01:17:45.865264098Z" level=info msg="StartContainer for \"5cd35c498d2ba977c60df50b56006369a787c09266e14ec18b33288fb48f4e2c\"" Mar 25 01:17:45.867917 containerd[1968]: time="2025-03-25T01:17:45.867835132Z" level=info msg="connecting to shim 5cd35c498d2ba977c60df50b56006369a787c09266e14ec18b33288fb48f4e2c" address="unix:///run/containerd/s/7559c58145f607100b57a1c0aaf1087b3c4b1f03ad9e824e9c6bc88ebf1e09c0" protocol=ttrpc version=3 Mar 25 01:17:45.917946 systemd[1]: Started cri-containerd-5cd35c498d2ba977c60df50b56006369a787c09266e14ec18b33288fb48f4e2c.scope - libcontainer container 5cd35c498d2ba977c60df50b56006369a787c09266e14ec18b33288fb48f4e2c. Mar 25 01:17:46.004538 containerd[1968]: time="2025-03-25T01:17:46.004442151Z" level=info msg="StartContainer for \"5cd35c498d2ba977c60df50b56006369a787c09266e14ec18b33288fb48f4e2c\" returns successfully" Mar 25 01:17:46.130178 containerd[1968]: time="2025-03-25T01:17:46.129576321Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:17:46.133733 containerd[1968]: time="2025-03-25T01:17:46.133647289Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77" Mar 25 01:17:46.143772 containerd[1968]: time="2025-03-25T01:17:46.143586476Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 339.885841ms" Mar 25 01:17:46.144095 containerd[1968]: time="2025-03-25T01:17:46.144026536Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 25 01:17:46.150849 containerd[1968]: time="2025-03-25T01:17:46.150772178Z" level=info msg="CreateContainer within sandbox \"5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 01:17:46.173893 containerd[1968]: time="2025-03-25T01:17:46.171594398Z" level=info msg="Container 0caacac3bc50f24a0f711165f438ab1b413532c50200da487d8936f4dcbf53b6: CDI devices from CRI 
Config.CDIDevices: []" Mar 25 01:17:46.195933 containerd[1968]: time="2025-03-25T01:17:46.195865324Z" level=info msg="CreateContainer within sandbox \"5ec4a8476e4878542e6e23205e133cff1c33775c759066bd55b63a8568d8c967\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"0caacac3bc50f24a0f711165f438ab1b413532c50200da487d8936f4dcbf53b6\"" Mar 25 01:17:46.199153 containerd[1968]: time="2025-03-25T01:17:46.199106991Z" level=info msg="StartContainer for \"0caacac3bc50f24a0f711165f438ab1b413532c50200da487d8936f4dcbf53b6\"" Mar 25 01:17:46.202592 containerd[1968]: time="2025-03-25T01:17:46.202537322Z" level=info msg="connecting to shim 0caacac3bc50f24a0f711165f438ab1b413532c50200da487d8936f4dcbf53b6" address="unix:///run/containerd/s/34ad9d3aefe1d89e2ff2c29146094b9a1f9f7a732ebeaacccd14737c6f7e8aef" protocol=ttrpc version=3 Mar 25 01:17:46.238570 systemd[1]: Started cri-containerd-0caacac3bc50f24a0f711165f438ab1b413532c50200da487d8936f4dcbf53b6.scope - libcontainer container 0caacac3bc50f24a0f711165f438ab1b413532c50200da487d8936f4dcbf53b6. Mar 25 01:17:46.322869 containerd[1968]: time="2025-03-25T01:17:46.322820147Z" level=info msg="StartContainer for \"0caacac3bc50f24a0f711165f438ab1b413532c50200da487d8936f4dcbf53b6\" returns successfully" Mar 25 01:17:46.562371 kubelet[3335]: I0325 01:17:46.562293 3335 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-564f5997bc-jhx5b" podStartSLOduration=25.632675752 podStartE2EDuration="34.562266181s" podCreationTimestamp="2025-03-25 01:17:12 +0000 UTC" firstStartedPulling="2025-03-25 01:17:36.873386356 +0000 UTC m=+52.054817032" lastFinishedPulling="2025-03-25 01:17:45.802976798 +0000 UTC m=+60.984407461" observedRunningTime="2025-03-25 01:17:46.534695237 +0000 UTC m=+61.716125925" watchObservedRunningTime="2025-03-25 01:17:46.562266181 +0000 UTC m=+61.743696869" Mar 25 01:17:46.564894 kubelet[3335]: I0325 01:17:46.564472 3335 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5f598cd48-7hrsb" podStartSLOduration=28.649648034 podStartE2EDuration="36.564449725s" podCreationTimestamp="2025-03-25 01:17:10 +0000 UTC" firstStartedPulling="2025-03-25 01:17:38.230891783 +0000 UTC m=+53.412322435" lastFinishedPulling="2025-03-25 01:17:46.145693462 +0000 UTC m=+61.327124126" observedRunningTime="2025-03-25 01:17:46.560081246 +0000 UTC m=+61.741511898" watchObservedRunningTime="2025-03-25 01:17:46.564449725 +0000 UTC m=+61.745880413" Mar 25 01:17:46.663111 containerd[1968]: time="2025-03-25T01:17:46.663059296Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5cd35c498d2ba977c60df50b56006369a787c09266e14ec18b33288fb48f4e2c\" id:\"38ca2772bec9ab5ce3dd5423454957adb8bbb7868c71b4d9507f4047060a4322\" pid:5647 exited_at:{seconds:1742865466 nanos:661820687}" Mar 25 01:17:47.518987 kubelet[3335]: I0325 01:17:47.517878 3335 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:17:49.844397 systemd[1]: Started sshd@13-172.31.24.136:22-147.75.109.163:44204.service - OpenSSH per-connection server daemon (147.75.109.163:44204). Mar 25 01:17:50.053984 sshd[5663]: Accepted publickey for core from 147.75.109.163 port 44204 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:17:50.057120 sshd-session[5663]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:17:50.066218 systemd-logind[1939]: New session 14 of user core. 
Mar 25 01:17:50.072491 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 25 01:17:50.333457 sshd[5665]: Connection closed by 147.75.109.163 port 44204 Mar 25 01:17:50.333938 sshd-session[5663]: pam_unix(sshd:session): session closed for user core Mar 25 01:17:50.341383 systemd[1]: sshd@13-172.31.24.136:22-147.75.109.163:44204.service: Deactivated successfully. Mar 25 01:17:50.347937 systemd[1]: session-14.scope: Deactivated successfully. Mar 25 01:17:50.349784 systemd-logind[1939]: Session 14 logged out. Waiting for processes to exit. Mar 25 01:17:50.352219 systemd-logind[1939]: Removed session 14. Mar 25 01:17:50.937426 kubelet[3335]: I0325 01:17:50.937193 3335 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:17:52.415555 containerd[1968]: time="2025-03-25T01:17:52.415447293Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5cd35c498d2ba977c60df50b56006369a787c09266e14ec18b33288fb48f4e2c\" id:\"e994fbea36c42143ec11e2e723441e21126f7dd41e69a5d7c38b86a9b4eef5fe\" pid:5693 exited_at:{seconds:1742865472 nanos:414161469}" Mar 25 01:17:54.552289 kubelet[3335]: I0325 01:17:54.551770 3335 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:17:55.375648 systemd[1]: Started sshd@14-172.31.24.136:22-147.75.109.163:58846.service - OpenSSH per-connection server daemon (147.75.109.163:58846). Mar 25 01:17:55.584387 sshd[5711]: Accepted publickey for core from 147.75.109.163 port 58846 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:17:55.589086 sshd-session[5711]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:17:55.608328 systemd-logind[1939]: New session 15 of user core. Mar 25 01:17:55.616887 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 25 01:17:55.878686 sshd[5714]: Connection closed by 147.75.109.163 port 58846 Mar 25 01:17:55.879734 sshd-session[5711]: pam_unix(sshd:session): session closed for user core Mar 25 01:17:55.886152 systemd[1]: sshd@14-172.31.24.136:22-147.75.109.163:58846.service: Deactivated successfully. Mar 25 01:17:55.892553 systemd[1]: session-15.scope: Deactivated successfully. Mar 25 01:17:55.894437 systemd-logind[1939]: Session 15 logged out. Waiting for processes to exit. Mar 25 01:17:55.896428 systemd-logind[1939]: Removed session 15. Mar 25 01:17:57.457260 containerd[1968]: time="2025-03-25T01:17:57.457172750Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5cd35c498d2ba977c60df50b56006369a787c09266e14ec18b33288fb48f4e2c\" id:\"c68e0b6850844b151d0470c7461c87f6bde7a07bb1da661989d27bf4af02cba6\" pid:5737 exited_at:{seconds:1742865477 nanos:456558818}" Mar 25 01:18:00.914572 systemd[1]: Started sshd@15-172.31.24.136:22-147.75.109.163:41326.service - OpenSSH per-connection server daemon (147.75.109.163:41326). Mar 25 01:18:01.116756 sshd[5749]: Accepted publickey for core from 147.75.109.163 port 41326 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:18:01.119886 sshd-session[5749]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:18:01.128962 systemd-logind[1939]: New session 16 of user core. Mar 25 01:18:01.137520 systemd[1]: Started session-16.scope - Session 16 of User core. 
Mar 25 01:18:01.391853 sshd[5751]: Connection closed by 147.75.109.163 port 41326 Mar 25 01:18:01.392806 sshd-session[5749]: pam_unix(sshd:session): session closed for user core Mar 25 01:18:01.398750 systemd[1]: sshd@15-172.31.24.136:22-147.75.109.163:41326.service: Deactivated successfully. Mar 25 01:18:01.403500 systemd[1]: session-16.scope: Deactivated successfully. Mar 25 01:18:01.407245 systemd-logind[1939]: Session 16 logged out. Waiting for processes to exit. Mar 25 01:18:01.410114 systemd-logind[1939]: Removed session 16. Mar 25 01:18:01.426452 systemd[1]: Started sshd@16-172.31.24.136:22-147.75.109.163:41328.service - OpenSSH per-connection server daemon (147.75.109.163:41328). Mar 25 01:18:01.617985 sshd[5764]: Accepted publickey for core from 147.75.109.163 port 41328 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:18:01.620557 sshd-session[5764]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:18:01.629732 systemd-logind[1939]: New session 17 of user core. Mar 25 01:18:01.639504 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 25 01:18:02.105807 sshd[5766]: Connection closed by 147.75.109.163 port 41328 Mar 25 01:18:02.107316 sshd-session[5764]: pam_unix(sshd:session): session closed for user core Mar 25 01:18:02.113535 systemd[1]: sshd@16-172.31.24.136:22-147.75.109.163:41328.service: Deactivated successfully. Mar 25 01:18:02.118089 systemd[1]: session-17.scope: Deactivated successfully. Mar 25 01:18:02.121501 systemd-logind[1939]: Session 17 logged out. Waiting for processes to exit. Mar 25 01:18:02.124262 systemd-logind[1939]: Removed session 17. Mar 25 01:18:02.145895 systemd[1]: Started sshd@17-172.31.24.136:22-147.75.109.163:41334.service - OpenSSH per-connection server daemon (147.75.109.163:41334). Mar 25 01:18:02.352247 sshd[5776]: Accepted publickey for core from 147.75.109.163 port 41334 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:18:02.355094 sshd-session[5776]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:18:02.371422 systemd-logind[1939]: New session 18 of user core. Mar 25 01:18:02.376576 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 25 01:18:05.673489 sshd[5778]: Connection closed by 147.75.109.163 port 41334 Mar 25 01:18:05.673906 sshd-session[5776]: pam_unix(sshd:session): session closed for user core Mar 25 01:18:05.684974 systemd[1]: session-18.scope: Deactivated successfully. Mar 25 01:18:05.686153 systemd[1]: session-18.scope: Consumed 1.048s CPU time, 64.3M memory peak. Mar 25 01:18:05.688502 systemd[1]: sshd@17-172.31.24.136:22-147.75.109.163:41334.service: Deactivated successfully. Mar 25 01:18:05.703918 systemd-logind[1939]: Session 18 logged out. Waiting for processes to exit. Mar 25 01:18:05.752766 systemd[1]: Started sshd@18-172.31.24.136:22-147.75.109.163:41338.service - OpenSSH per-connection server daemon (147.75.109.163:41338). Mar 25 01:18:05.762179 systemd-logind[1939]: Removed session 18. Mar 25 01:18:05.972043 sshd[5794]: Accepted publickey for core from 147.75.109.163 port 41338 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:18:05.974845 sshd-session[5794]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:18:05.985320 systemd-logind[1939]: New session 19 of user core. Mar 25 01:18:05.992623 systemd[1]: Started session-19.scope - Session 19 of User core. 
Mar 25 01:18:06.516708 sshd[5797]: Connection closed by 147.75.109.163 port 41338 Mar 25 01:18:06.517188 sshd-session[5794]: pam_unix(sshd:session): session closed for user core Mar 25 01:18:06.529383 systemd[1]: sshd@18-172.31.24.136:22-147.75.109.163:41338.service: Deactivated successfully. Mar 25 01:18:06.534633 systemd[1]: session-19.scope: Deactivated successfully. Mar 25 01:18:06.539331 systemd-logind[1939]: Session 19 logged out. Waiting for processes to exit. Mar 25 01:18:06.561098 systemd[1]: Started sshd@19-172.31.24.136:22-147.75.109.163:41346.service - OpenSSH per-connection server daemon (147.75.109.163:41346). Mar 25 01:18:06.563805 systemd-logind[1939]: Removed session 19. Mar 25 01:18:06.602066 containerd[1968]: time="2025-03-25T01:18:06.601842827Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33f7ea9b63729d227bc50bfa69cc02395af443fa4b77c30c48e4fb609d88959a\" id:\"fe889a71864f65bd3fdae885e2b005474a1ce752e2fc845a5038b42b2649fe53\" pid:5816 exited_at:{seconds:1742865486 nanos:600902807}" Mar 25 01:18:06.774723 sshd[5830]: Accepted publickey for core from 147.75.109.163 port 41346 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:18:06.779375 sshd-session[5830]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:18:06.795860 systemd-logind[1939]: New session 20 of user core. Mar 25 01:18:06.806508 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 25 01:18:07.056399 sshd[5834]: Connection closed by 147.75.109.163 port 41346 Mar 25 01:18:07.057217 sshd-session[5830]: pam_unix(sshd:session): session closed for user core Mar 25 01:18:07.062788 systemd[1]: sshd@19-172.31.24.136:22-147.75.109.163:41346.service: Deactivated successfully. Mar 25 01:18:07.065937 systemd[1]: session-20.scope: Deactivated successfully. Mar 25 01:18:07.072103 systemd-logind[1939]: Session 20 logged out. Waiting for processes to exit. Mar 25 01:18:07.075400 systemd-logind[1939]: Removed session 20. Mar 25 01:18:12.095718 systemd[1]: Started sshd@20-172.31.24.136:22-147.75.109.163:48030.service - OpenSSH per-connection server daemon (147.75.109.163:48030). Mar 25 01:18:12.301655 sshd[5849]: Accepted publickey for core from 147.75.109.163 port 48030 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:18:12.304915 sshd-session[5849]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:18:12.318344 systemd-logind[1939]: New session 21 of user core. Mar 25 01:18:12.324828 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 25 01:18:12.596198 sshd[5854]: Connection closed by 147.75.109.163 port 48030 Mar 25 01:18:12.597477 sshd-session[5849]: pam_unix(sshd:session): session closed for user core Mar 25 01:18:12.604794 systemd[1]: sshd@20-172.31.24.136:22-147.75.109.163:48030.service: Deactivated successfully. Mar 25 01:18:12.610949 systemd[1]: session-21.scope: Deactivated successfully. Mar 25 01:18:12.614679 systemd-logind[1939]: Session 21 logged out. Waiting for processes to exit. Mar 25 01:18:12.617601 systemd-logind[1939]: Removed session 21. Mar 25 01:18:17.637771 systemd[1]: Started sshd@21-172.31.24.136:22-147.75.109.163:48038.service - OpenSSH per-connection server daemon (147.75.109.163:48038). 
Mar 25 01:18:17.842615 sshd[5873]: Accepted publickey for core from 147.75.109.163 port 48038 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:18:17.846113 sshd-session[5873]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:18:17.855462 systemd-logind[1939]: New session 22 of user core. Mar 25 01:18:17.862579 systemd[1]: Started session-22.scope - Session 22 of User core. Mar 25 01:18:18.128116 sshd[5875]: Connection closed by 147.75.109.163 port 48038 Mar 25 01:18:18.129699 sshd-session[5873]: pam_unix(sshd:session): session closed for user core Mar 25 01:18:18.135905 systemd-logind[1939]: Session 22 logged out. Waiting for processes to exit. Mar 25 01:18:18.137543 systemd[1]: sshd@21-172.31.24.136:22-147.75.109.163:48038.service: Deactivated successfully. Mar 25 01:18:18.143188 systemd[1]: session-22.scope: Deactivated successfully. Mar 25 01:18:18.151017 systemd-logind[1939]: Removed session 22. Mar 25 01:18:22.423448 containerd[1968]: time="2025-03-25T01:18:22.423006350Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5cd35c498d2ba977c60df50b56006369a787c09266e14ec18b33288fb48f4e2c\" id:\"608f8479b5d01f671db7c80d4188141dfdece13e80ab22587cc99766baea27e2\" pid:5900 exited_at:{seconds:1742865502 nanos:422286962}" Mar 25 01:18:23.167553 systemd[1]: Started sshd@22-172.31.24.136:22-147.75.109.163:34832.service - OpenSSH per-connection server daemon (147.75.109.163:34832). Mar 25 01:18:23.369143 sshd[5910]: Accepted publickey for core from 147.75.109.163 port 34832 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:18:23.371755 sshd-session[5910]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:18:23.381715 systemd-logind[1939]: New session 23 of user core. Mar 25 01:18:23.386503 systemd[1]: Started session-23.scope - Session 23 of User core. Mar 25 01:18:23.629862 sshd[5912]: Connection closed by 147.75.109.163 port 34832 Mar 25 01:18:23.630546 sshd-session[5910]: pam_unix(sshd:session): session closed for user core Mar 25 01:18:23.639443 systemd[1]: sshd@22-172.31.24.136:22-147.75.109.163:34832.service: Deactivated successfully. Mar 25 01:18:23.645587 systemd[1]: session-23.scope: Deactivated successfully. Mar 25 01:18:23.648115 systemd-logind[1939]: Session 23 logged out. Waiting for processes to exit. Mar 25 01:18:23.650332 systemd-logind[1939]: Removed session 23. Mar 25 01:18:28.666124 systemd[1]: Started sshd@23-172.31.24.136:22-147.75.109.163:34834.service - OpenSSH per-connection server daemon (147.75.109.163:34834). Mar 25 01:18:28.862599 sshd[5923]: Accepted publickey for core from 147.75.109.163 port 34834 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:18:28.865263 sshd-session[5923]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:18:28.875605 systemd-logind[1939]: New session 24 of user core. Mar 25 01:18:28.885483 systemd[1]: Started session-24.scope - Session 24 of User core. Mar 25 01:18:29.129254 sshd[5925]: Connection closed by 147.75.109.163 port 34834 Mar 25 01:18:29.130356 sshd-session[5923]: pam_unix(sshd:session): session closed for user core Mar 25 01:18:29.138279 systemd[1]: sshd@23-172.31.24.136:22-147.75.109.163:34834.service: Deactivated successfully. Mar 25 01:18:29.142658 systemd[1]: session-24.scope: Deactivated successfully. Mar 25 01:18:29.144427 systemd-logind[1939]: Session 24 logged out. Waiting for processes to exit. 
Mar 25 01:18:29.147446 systemd-logind[1939]: Removed session 24. Mar 25 01:18:34.166435 systemd[1]: Started sshd@24-172.31.24.136:22-147.75.109.163:44940.service - OpenSSH per-connection server daemon (147.75.109.163:44940). Mar 25 01:18:34.364443 sshd[5939]: Accepted publickey for core from 147.75.109.163 port 44940 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:18:34.367105 sshd-session[5939]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:18:34.375347 systemd-logind[1939]: New session 25 of user core. Mar 25 01:18:34.381540 systemd[1]: Started session-25.scope - Session 25 of User core. Mar 25 01:18:34.628877 sshd[5941]: Connection closed by 147.75.109.163 port 44940 Mar 25 01:18:34.628586 sshd-session[5939]: pam_unix(sshd:session): session closed for user core Mar 25 01:18:34.634964 systemd[1]: sshd@24-172.31.24.136:22-147.75.109.163:44940.service: Deactivated successfully. Mar 25 01:18:34.638720 systemd[1]: session-25.scope: Deactivated successfully. Mar 25 01:18:34.643042 systemd-logind[1939]: Session 25 logged out. Waiting for processes to exit. Mar 25 01:18:34.645215 systemd-logind[1939]: Removed session 25. Mar 25 01:18:36.537658 containerd[1968]: time="2025-03-25T01:18:36.537554404Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33f7ea9b63729d227bc50bfa69cc02395af443fa4b77c30c48e4fb609d88959a\" id:\"8802f2cd065a74a09bf7311e055525759af6a8f3295dfea54a4535591f894455\" pid:5965 exited_at:{seconds:1742865516 nanos:537113848}" Mar 25 01:18:39.666923 systemd[1]: Started sshd@25-172.31.24.136:22-147.75.109.163:44946.service - OpenSSH per-connection server daemon (147.75.109.163:44946). Mar 25 01:18:39.866912 sshd[5982]: Accepted publickey for core from 147.75.109.163 port 44946 ssh2: RSA SHA256:PeYl6nTqnDkQzDdjcMK19FTRt4hKUhyBe4JyMaX2oCc Mar 25 01:18:39.869474 sshd-session[5982]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:18:39.878102 systemd-logind[1939]: New session 26 of user core. Mar 25 01:18:39.890490 systemd[1]: Started session-26.scope - Session 26 of User core. Mar 25 01:18:40.129338 sshd[5984]: Connection closed by 147.75.109.163 port 44946 Mar 25 01:18:40.130546 sshd-session[5982]: pam_unix(sshd:session): session closed for user core Mar 25 01:18:40.135925 systemd[1]: session-26.scope: Deactivated successfully. Mar 25 01:18:40.137604 systemd[1]: sshd@25-172.31.24.136:22-147.75.109.163:44946.service: Deactivated successfully. Mar 25 01:18:40.143973 systemd-logind[1939]: Session 26 logged out. Waiting for processes to exit. Mar 25 01:18:40.145893 systemd-logind[1939]: Removed session 26. Mar 25 01:18:52.406166 containerd[1968]: time="2025-03-25T01:18:52.405668923Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5cd35c498d2ba977c60df50b56006369a787c09266e14ec18b33288fb48f4e2c\" id:\"21b57e3985739e152e9e8f6cb0e591d047d869617dceacf7d32db9f1a55b9404\" pid:6009 exit_status:1 exited_at:{seconds:1742865532 nanos:402030727}" Mar 25 01:18:54.786924 systemd[1]: cri-containerd-47e40a9f6df5fdd1afa520c6b24ea8739b068d164b7d1abdaaa4f3258f592341.scope: Deactivated successfully. Mar 25 01:18:54.788639 systemd[1]: cri-containerd-47e40a9f6df5fdd1afa520c6b24ea8739b068d164b7d1abdaaa4f3258f592341.scope: Consumed 5.355s CPU time, 61.4M memory peak, 104K read from disk. 
Mar 25 01:18:54.793045 containerd[1968]: time="2025-03-25T01:18:54.792707003Z" level=info msg="received exit event container_id:\"47e40a9f6df5fdd1afa520c6b24ea8739b068d164b7d1abdaaa4f3258f592341\" id:\"47e40a9f6df5fdd1afa520c6b24ea8739b068d164b7d1abdaaa4f3258f592341\" pid:3185 exit_status:1 exited_at:{seconds:1742865534 nanos:791977234}" Mar 25 01:18:54.794884 containerd[1968]: time="2025-03-25T01:18:54.794171771Z" level=info msg="TaskExit event in podsandbox handler container_id:\"47e40a9f6df5fdd1afa520c6b24ea8739b068d164b7d1abdaaa4f3258f592341\" id:\"47e40a9f6df5fdd1afa520c6b24ea8739b068d164b7d1abdaaa4f3258f592341\" pid:3185 exit_status:1 exited_at:{seconds:1742865534 nanos:791977234}" Mar 25 01:18:54.837661 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-47e40a9f6df5fdd1afa520c6b24ea8739b068d164b7d1abdaaa4f3258f592341-rootfs.mount: Deactivated successfully. Mar 25 01:18:55.188508 systemd[1]: cri-containerd-420b7e32f343919403df4e8e16c81463f5e4f779f936ed051ca36168a6b9ca6b.scope: Deactivated successfully. Mar 25 01:18:55.190474 systemd[1]: cri-containerd-420b7e32f343919403df4e8e16c81463f5e4f779f936ed051ca36168a6b9ca6b.scope: Consumed 7.365s CPU time, 46.6M memory peak, 356K read from disk. Mar 25 01:18:55.198277 containerd[1968]: time="2025-03-25T01:18:55.197613813Z" level=info msg="received exit event container_id:\"420b7e32f343919403df4e8e16c81463f5e4f779f936ed051ca36168a6b9ca6b\" id:\"420b7e32f343919403df4e8e16c81463f5e4f779f936ed051ca36168a6b9ca6b\" pid:3865 exit_status:1 exited_at:{seconds:1742865535 nanos:196931157}" Mar 25 01:18:55.198708 containerd[1968]: time="2025-03-25T01:18:55.198602829Z" level=info msg="TaskExit event in podsandbox handler container_id:\"420b7e32f343919403df4e8e16c81463f5e4f779f936ed051ca36168a6b9ca6b\" id:\"420b7e32f343919403df4e8e16c81463f5e4f779f936ed051ca36168a6b9ca6b\" pid:3865 exit_status:1 exited_at:{seconds:1742865535 nanos:196931157}" Mar 25 01:18:55.246059 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-420b7e32f343919403df4e8e16c81463f5e4f779f936ed051ca36168a6b9ca6b-rootfs.mount: Deactivated successfully. 
Mar 25 01:18:55.736101 kubelet[3335]: I0325 01:18:55.735994 3335 scope.go:117] "RemoveContainer" containerID="420b7e32f343919403df4e8e16c81463f5e4f779f936ed051ca36168a6b9ca6b" Mar 25 01:18:55.741931 kubelet[3335]: I0325 01:18:55.740806 3335 scope.go:117] "RemoveContainer" containerID="47e40a9f6df5fdd1afa520c6b24ea8739b068d164b7d1abdaaa4f3258f592341" Mar 25 01:18:55.742741 containerd[1968]: time="2025-03-25T01:18:55.741779087Z" level=info msg="CreateContainer within sandbox \"693d29c1cd15ca491e3b2af2ebd42fae1d50e0107f186496f73a1d0f616baf6d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Mar 25 01:18:55.747375 containerd[1968]: time="2025-03-25T01:18:55.747202475Z" level=info msg="CreateContainer within sandbox \"5aee85ef2b0ed88483c492f18957ef3a7e320cf794acec9e366619b26988df41\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Mar 25 01:18:55.762695 containerd[1968]: time="2025-03-25T01:18:55.762546359Z" level=info msg="Container 64ee1dda74ef330a683ba8b47443301356d0cd46bef6c6bbb5946fc13681cae1: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:18:55.780653 containerd[1968]: time="2025-03-25T01:18:55.778580447Z" level=info msg="Container cbf8b1cdb5529080f8c4afab0e1e487d5a6334516317041332288bc1236783de: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:18:55.787715 containerd[1968]: time="2025-03-25T01:18:55.787583723Z" level=info msg="CreateContainer within sandbox \"693d29c1cd15ca491e3b2af2ebd42fae1d50e0107f186496f73a1d0f616baf6d\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"64ee1dda74ef330a683ba8b47443301356d0cd46bef6c6bbb5946fc13681cae1\"" Mar 25 01:18:55.788546 containerd[1968]: time="2025-03-25T01:18:55.788324723Z" level=info msg="StartContainer for \"64ee1dda74ef330a683ba8b47443301356d0cd46bef6c6bbb5946fc13681cae1\"" Mar 25 01:18:55.791816 containerd[1968]: time="2025-03-25T01:18:55.791734607Z" level=info msg="connecting to shim 64ee1dda74ef330a683ba8b47443301356d0cd46bef6c6bbb5946fc13681cae1" address="unix:///run/containerd/s/76ecd458d15578f8f4c19d37b27293806a43ac28d75ae7d843fb201245eabb95" protocol=ttrpc version=3 Mar 25 01:18:55.794352 containerd[1968]: time="2025-03-25T01:18:55.794132075Z" level=info msg="CreateContainer within sandbox \"5aee85ef2b0ed88483c492f18957ef3a7e320cf794acec9e366619b26988df41\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"cbf8b1cdb5529080f8c4afab0e1e487d5a6334516317041332288bc1236783de\"" Mar 25 01:18:55.795373 containerd[1968]: time="2025-03-25T01:18:55.794884895Z" level=info msg="StartContainer for \"cbf8b1cdb5529080f8c4afab0e1e487d5a6334516317041332288bc1236783de\"" Mar 25 01:18:55.798049 containerd[1968]: time="2025-03-25T01:18:55.797358263Z" level=info msg="connecting to shim cbf8b1cdb5529080f8c4afab0e1e487d5a6334516317041332288bc1236783de" address="unix:///run/containerd/s/c167ececfab706fe70c5d5028368abc32b242f7f8c7c9c284c7b681977adcae8" protocol=ttrpc version=3 Mar 25 01:18:55.834573 systemd[1]: Started cri-containerd-64ee1dda74ef330a683ba8b47443301356d0cd46bef6c6bbb5946fc13681cae1.scope - libcontainer container 64ee1dda74ef330a683ba8b47443301356d0cd46bef6c6bbb5946fc13681cae1. Mar 25 01:18:55.862587 systemd[1]: Started cri-containerd-cbf8b1cdb5529080f8c4afab0e1e487d5a6334516317041332288bc1236783de.scope - libcontainer container cbf8b1cdb5529080f8c4afab0e1e487d5a6334516317041332288bc1236783de. 
Mar 25 01:18:55.925219 containerd[1968]: time="2025-03-25T01:18:55.925152336Z" level=info msg="StartContainer for \"64ee1dda74ef330a683ba8b47443301356d0cd46bef6c6bbb5946fc13681cae1\" returns successfully"
Mar 25 01:18:55.981445 containerd[1968]: time="2025-03-25T01:18:55.981378120Z" level=info msg="StartContainer for \"cbf8b1cdb5529080f8c4afab0e1e487d5a6334516317041332288bc1236783de\" returns successfully"
Mar 25 01:18:57.468792 containerd[1968]: time="2025-03-25T01:18:57.468718992Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5cd35c498d2ba977c60df50b56006369a787c09266e14ec18b33288fb48f4e2c\" id:\"49c51c7c2bffccc8b3b51cbbb216281ecd4c8fab788c68dddea2961cb82f5548\" pid:6124 exit_status:1 exited_at:{seconds:1742865537 nanos:467903544}"
Mar 25 01:18:57.517253 kubelet[3335]: E0325 01:18:57.516621 3335 controller.go:195] "Failed to update lease" err="Put \"https://172.31.24.136:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-24-136?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 25 01:18:59.399323 systemd[1]: cri-containerd-75f6b5452996627a529931b19b5d2d2927ed1ae73ed104bbb1e9cc4a10352b3f.scope: Deactivated successfully.
Mar 25 01:18:59.402385 systemd[1]: cri-containerd-75f6b5452996627a529931b19b5d2d2927ed1ae73ed104bbb1e9cc4a10352b3f.scope: Consumed 2.267s CPU time, 19.9M memory peak, 76K read from disk.
Mar 25 01:18:59.403565 containerd[1968]: time="2025-03-25T01:18:59.403327309Z" level=info msg="received exit event container_id:\"75f6b5452996627a529931b19b5d2d2927ed1ae73ed104bbb1e9cc4a10352b3f\" id:\"75f6b5452996627a529931b19b5d2d2927ed1ae73ed104bbb1e9cc4a10352b3f\" pid:3160 exit_status:1 exited_at:{seconds:1742865539 nanos:402567733}"
Mar 25 01:18:59.404060 containerd[1968]: time="2025-03-25T01:18:59.403718437Z" level=info msg="TaskExit event in podsandbox handler container_id:\"75f6b5452996627a529931b19b5d2d2927ed1ae73ed104bbb1e9cc4a10352b3f\" id:\"75f6b5452996627a529931b19b5d2d2927ed1ae73ed104bbb1e9cc4a10352b3f\" pid:3160 exit_status:1 exited_at:{seconds:1742865539 nanos:402567733}"
Mar 25 01:18:59.449385 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-75f6b5452996627a529931b19b5d2d2927ed1ae73ed104bbb1e9cc4a10352b3f-rootfs.mount: Deactivated successfully.
Mar 25 01:18:59.767706 kubelet[3335]: I0325 01:18:59.767569 3335 scope.go:117] "RemoveContainer" containerID="75f6b5452996627a529931b19b5d2d2927ed1ae73ed104bbb1e9cc4a10352b3f"
Mar 25 01:18:59.772831 containerd[1968]: time="2025-03-25T01:18:59.772774275Z" level=info msg="CreateContainer within sandbox \"6a688400b8fc500a91ae7aef44757da1c97f68dae741c3de3ce4122a8dca504f\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Mar 25 01:18:59.790370 containerd[1968]: time="2025-03-25T01:18:59.788739915Z" level=info msg="Container efa564f25ede83f26e0022ca30a2eb5b9adcff64d4736aeca218f5d270d6b3bd: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:18:59.806429 containerd[1968]: time="2025-03-25T01:18:59.806375283Z" level=info msg="CreateContainer within sandbox \"6a688400b8fc500a91ae7aef44757da1c97f68dae741c3de3ce4122a8dca504f\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"efa564f25ede83f26e0022ca30a2eb5b9adcff64d4736aeca218f5d270d6b3bd\""
Mar 25 01:18:59.807426 containerd[1968]: time="2025-03-25T01:18:59.807378543Z" level=info msg="StartContainer for \"efa564f25ede83f26e0022ca30a2eb5b9adcff64d4736aeca218f5d270d6b3bd\""
Mar 25 01:18:59.809771 containerd[1968]: time="2025-03-25T01:18:59.809706075Z" level=info msg="connecting to shim efa564f25ede83f26e0022ca30a2eb5b9adcff64d4736aeca218f5d270d6b3bd" address="unix:///run/containerd/s/bc7db725d8ccd97e812a4a93e6e78a6ec76c06dbd524487685651ddbda811501" protocol=ttrpc version=3
Mar 25 01:18:59.848564 systemd[1]: Started cri-containerd-efa564f25ede83f26e0022ca30a2eb5b9adcff64d4736aeca218f5d270d6b3bd.scope - libcontainer container efa564f25ede83f26e0022ca30a2eb5b9adcff64d4736aeca218f5d270d6b3bd.
Mar 25 01:18:59.925846 containerd[1968]: time="2025-03-25T01:18:59.925717696Z" level=info msg="StartContainer for \"efa564f25ede83f26e0022ca30a2eb5b9adcff64d4736aeca218f5d270d6b3bd\" returns successfully"