Mar 25 01:30:22.190483 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd083] Mar 25 01:30:22.190527 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Mon Mar 24 23:39:14 -00 2025 Mar 25 01:30:22.190552 kernel: KASLR disabled due to lack of seed Mar 25 01:30:22.190568 kernel: efi: EFI v2.7 by EDK II Mar 25 01:30:22.190583 kernel: efi: SMBIOS=0x7bed0000 SMBIOS 3.0=0x7beb0000 ACPI=0x786e0000 ACPI 2.0=0x786e0014 MEMATTR=0x7a736a98 MEMRESERVE=0x78551598 Mar 25 01:30:22.190598 kernel: secureboot: Secure boot disabled Mar 25 01:30:22.190615 kernel: ACPI: Early table checksum verification disabled Mar 25 01:30:22.190631 kernel: ACPI: RSDP 0x00000000786E0014 000024 (v02 AMAZON) Mar 25 01:30:22.191237 kernel: ACPI: XSDT 0x00000000786D00E8 000064 (v01 AMAZON AMZNFACP 00000001 01000013) Mar 25 01:30:22.191267 kernel: ACPI: FACP 0x00000000786B0000 000114 (v06 AMAZON AMZNFACP 00000001 AMZN 00000001) Mar 25 01:30:22.191291 kernel: ACPI: DSDT 0x0000000078640000 00159D (v02 AMAZON AMZNDSDT 00000001 INTL 20160527) Mar 25 01:30:22.191307 kernel: ACPI: APIC 0x00000000786C0000 000108 (v04 AMAZON AMZNAPIC 00000001 AMZN 00000001) Mar 25 01:30:22.191322 kernel: ACPI: SPCR 0x00000000786A0000 000050 (v02 AMAZON AMZNSPCR 00000001 AMZN 00000001) Mar 25 01:30:22.191337 kernel: ACPI: GTDT 0x0000000078690000 000060 (v02 AMAZON AMZNGTDT 00000001 AMZN 00000001) Mar 25 01:30:22.191355 kernel: ACPI: MCFG 0x0000000078680000 00003C (v02 AMAZON AMZNMCFG 00000001 AMZN 00000001) Mar 25 01:30:22.191376 kernel: ACPI: SLIT 0x0000000078670000 00002D (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001) Mar 25 01:30:22.191392 kernel: ACPI: IORT 0x0000000078660000 000078 (v01 AMAZON AMZNIORT 00000001 AMZN 00000001) Mar 25 01:30:22.191408 kernel: ACPI: PPTT 0x0000000078650000 0000EC (v01 AMAZON AMZNPPTT 00000001 AMZN 00000001) Mar 25 01:30:22.191423 kernel: ACPI: SPCR: console: uart,mmio,0x90a0000,115200 Mar 25 01:30:22.191439 kernel: earlycon: uart0 at MMIO 0x00000000090a0000 (options '115200') Mar 25 01:30:22.191455 kernel: printk: bootconsole [uart0] enabled Mar 25 01:30:22.191470 kernel: NUMA: Failed to initialise from firmware Mar 25 01:30:22.191486 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000004b5ffffff] Mar 25 01:30:22.191503 kernel: NUMA: NODE_DATA [mem 0x4b583f800-0x4b5844fff] Mar 25 01:30:22.191518 kernel: Zone ranges: Mar 25 01:30:22.191534 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Mar 25 01:30:22.191554 kernel: DMA32 empty Mar 25 01:30:22.191570 kernel: Normal [mem 0x0000000100000000-0x00000004b5ffffff] Mar 25 01:30:22.191586 kernel: Movable zone start for each node Mar 25 01:30:22.191601 kernel: Early memory node ranges Mar 25 01:30:22.191617 kernel: node 0: [mem 0x0000000040000000-0x000000007862ffff] Mar 25 01:30:22.191632 kernel: node 0: [mem 0x0000000078630000-0x000000007863ffff] Mar 25 01:30:22.191648 kernel: node 0: [mem 0x0000000078640000-0x00000000786effff] Mar 25 01:30:22.191663 kernel: node 0: [mem 0x00000000786f0000-0x000000007872ffff] Mar 25 01:30:22.191678 kernel: node 0: [mem 0x0000000078730000-0x000000007bbfffff] Mar 25 01:30:22.191694 kernel: node 0: [mem 0x000000007bc00000-0x000000007bfdffff] Mar 25 01:30:22.191709 kernel: node 0: [mem 0x000000007bfe0000-0x000000007fffffff] Mar 25 01:30:22.191725 kernel: node 0: [mem 0x0000000400000000-0x00000004b5ffffff] Mar 25 01:30:22.191744 kernel: Initmem setup node 0 [mem 
0x0000000040000000-0x00000004b5ffffff] Mar 25 01:30:22.191761 kernel: On node 0, zone Normal: 8192 pages in unavailable ranges Mar 25 01:30:22.191784 kernel: psci: probing for conduit method from ACPI. Mar 25 01:30:22.191800 kernel: psci: PSCIv1.0 detected in firmware. Mar 25 01:30:22.191817 kernel: psci: Using standard PSCI v0.2 function IDs Mar 25 01:30:22.191838 kernel: psci: Trusted OS migration not required Mar 25 01:30:22.191854 kernel: psci: SMC Calling Convention v1.1 Mar 25 01:30:22.191871 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976 Mar 25 01:30:22.191887 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096 Mar 25 01:30:22.191905 kernel: pcpu-alloc: [0] 0 [0] 1 Mar 25 01:30:22.191921 kernel: Detected PIPT I-cache on CPU0 Mar 25 01:30:22.191938 kernel: CPU features: detected: GIC system register CPU interface Mar 25 01:30:22.191954 kernel: CPU features: detected: Spectre-v2 Mar 25 01:30:22.191970 kernel: CPU features: detected: Spectre-v3a Mar 25 01:30:22.191987 kernel: CPU features: detected: Spectre-BHB Mar 25 01:30:22.192003 kernel: CPU features: detected: ARM erratum 1742098 Mar 25 01:30:22.192019 kernel: CPU features: detected: ARM errata 1165522, 1319367, or 1530923 Mar 25 01:30:22.192040 kernel: alternatives: applying boot alternatives Mar 25 01:30:22.192059 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=b84e5f613acd6cd0a8a878f32f5653a14f2e6fb2820997fecd5b2bd33a4ba3ab Mar 25 01:30:22.192077 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Mar 25 01:30:22.192093 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Mar 25 01:30:22.192110 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 25 01:30:22.192127 kernel: Fallback order for Node 0: 0 Mar 25 01:30:22.192143 kernel: Built 1 zonelists, mobility grouping on. Total pages: 991872 Mar 25 01:30:22.192159 kernel: Policy zone: Normal Mar 25 01:30:22.193216 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 25 01:30:22.193247 kernel: software IO TLB: area num 2. Mar 25 01:30:22.193272 kernel: software IO TLB: mapped [mem 0x000000007c000000-0x0000000080000000] (64MB) Mar 25 01:30:22.193290 kernel: Memory: 3821112K/4030464K available (10304K kernel code, 2186K rwdata, 8096K rodata, 38464K init, 897K bss, 209352K reserved, 0K cma-reserved) Mar 25 01:30:22.193307 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Mar 25 01:30:22.193324 kernel: rcu: Preemptible hierarchical RCU implementation. Mar 25 01:30:22.193342 kernel: rcu: RCU event tracing is enabled. Mar 25 01:30:22.193359 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Mar 25 01:30:22.193376 kernel: Trampoline variant of Tasks RCU enabled. Mar 25 01:30:22.193393 kernel: Tracing variant of Tasks RCU enabled. Mar 25 01:30:22.193410 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Mar 25 01:30:22.193426 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Mar 25 01:30:22.193443 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Mar 25 01:30:22.193464 kernel: GICv3: 96 SPIs implemented Mar 25 01:30:22.193480 kernel: GICv3: 0 Extended SPIs implemented Mar 25 01:30:22.193497 kernel: Root IRQ handler: gic_handle_irq Mar 25 01:30:22.193513 kernel: GICv3: GICv3 features: 16 PPIs Mar 25 01:30:22.193530 kernel: GICv3: CPU0: found redistributor 0 region 0:0x0000000010200000 Mar 25 01:30:22.193546 kernel: ITS [mem 0x10080000-0x1009ffff] Mar 25 01:30:22.193563 kernel: ITS@0x0000000010080000: allocated 8192 Devices @4000b0000 (indirect, esz 8, psz 64K, shr 1) Mar 25 01:30:22.193580 kernel: ITS@0x0000000010080000: allocated 8192 Interrupt Collections @4000c0000 (flat, esz 8, psz 64K, shr 1) Mar 25 01:30:22.193596 kernel: GICv3: using LPI property table @0x00000004000d0000 Mar 25 01:30:22.193613 kernel: ITS: Using hypervisor restricted LPI range [128] Mar 25 01:30:22.193629 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000004000e0000 Mar 25 01:30:22.193646 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Mar 25 01:30:22.193667 kernel: arch_timer: cp15 timer(s) running at 83.33MHz (virt). Mar 25 01:30:22.193684 kernel: clocksource: arch_sys_counter: mask: 0x1ffffffffffffff max_cycles: 0x13381ebeec, max_idle_ns: 440795203145 ns Mar 25 01:30:22.193701 kernel: sched_clock: 57 bits at 83MHz, resolution 12ns, wraps every 4398046511100ns Mar 25 01:30:22.193717 kernel: Console: colour dummy device 80x25 Mar 25 01:30:22.193735 kernel: printk: console [tty1] enabled Mar 25 01:30:22.193752 kernel: ACPI: Core revision 20230628 Mar 25 01:30:22.193769 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 166.66 BogoMIPS (lpj=83333) Mar 25 01:30:22.193786 kernel: pid_max: default: 32768 minimum: 301 Mar 25 01:30:22.193803 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Mar 25 01:30:22.193820 kernel: landlock: Up and running. Mar 25 01:30:22.193841 kernel: SELinux: Initializing. Mar 25 01:30:22.193858 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 25 01:30:22.193875 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 25 01:30:22.193892 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 25 01:30:22.193909 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 25 01:30:22.193926 kernel: rcu: Hierarchical SRCU implementation. Mar 25 01:30:22.193943 kernel: rcu: Max phase no-delay instances is 400. Mar 25 01:30:22.193959 kernel: Platform MSI: ITS@0x10080000 domain created Mar 25 01:30:22.193981 kernel: PCI/MSI: ITS@0x10080000 domain created Mar 25 01:30:22.193998 kernel: Remapping and enabling EFI services. Mar 25 01:30:22.194014 kernel: smp: Bringing up secondary CPUs ... Mar 25 01:30:22.194031 kernel: Detected PIPT I-cache on CPU1 Mar 25 01:30:22.194048 kernel: GICv3: CPU1: found redistributor 1 region 0:0x0000000010220000 Mar 25 01:30:22.194065 kernel: GICv3: CPU1: using allocated LPI pending table @0x00000004000f0000 Mar 25 01:30:22.194081 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd083] Mar 25 01:30:22.194098 kernel: smp: Brought up 1 node, 2 CPUs Mar 25 01:30:22.194115 kernel: SMP: Total of 2 processors activated. 
Mar 25 01:30:22.194131 kernel: CPU features: detected: 32-bit EL0 Support Mar 25 01:30:22.194153 kernel: CPU features: detected: 32-bit EL1 Support Mar 25 01:30:22.194170 kernel: CPU features: detected: CRC32 instructions Mar 25 01:30:22.194245 kernel: CPU: All CPU(s) started at EL1 Mar 25 01:30:22.194268 kernel: alternatives: applying system-wide alternatives Mar 25 01:30:22.194286 kernel: devtmpfs: initialized Mar 25 01:30:22.194304 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 25 01:30:22.194322 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Mar 25 01:30:22.194340 kernel: pinctrl core: initialized pinctrl subsystem Mar 25 01:30:22.194358 kernel: SMBIOS 3.0.0 present. Mar 25 01:30:22.194380 kernel: DMI: Amazon EC2 a1.large/, BIOS 1.0 11/1/2018 Mar 25 01:30:22.194397 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 25 01:30:22.194415 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Mar 25 01:30:22.194433 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Mar 25 01:30:22.194451 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Mar 25 01:30:22.194469 kernel: audit: initializing netlink subsys (disabled) Mar 25 01:30:22.194486 kernel: audit: type=2000 audit(0.225:1): state=initialized audit_enabled=0 res=1 Mar 25 01:30:22.194508 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 25 01:30:22.194526 kernel: cpuidle: using governor menu Mar 25 01:30:22.194544 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Mar 25 01:30:22.194561 kernel: ASID allocator initialised with 65536 entries Mar 25 01:30:22.194579 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 25 01:30:22.194596 kernel: Serial: AMBA PL011 UART driver Mar 25 01:30:22.194614 kernel: Modules: 17728 pages in range for non-PLT usage Mar 25 01:30:22.194631 kernel: Modules: 509248 pages in range for PLT usage Mar 25 01:30:22.194648 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Mar 25 01:30:22.194670 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Mar 25 01:30:22.194688 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Mar 25 01:30:22.194705 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Mar 25 01:30:22.194724 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Mar 25 01:30:22.194741 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Mar 25 01:30:22.194759 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Mar 25 01:30:22.194776 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Mar 25 01:30:22.194794 kernel: ACPI: Added _OSI(Module Device) Mar 25 01:30:22.194812 kernel: ACPI: Added _OSI(Processor Device) Mar 25 01:30:22.194833 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 25 01:30:22.194851 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 25 01:30:22.194869 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 25 01:30:22.194887 kernel: ACPI: Interpreter enabled Mar 25 01:30:22.194904 kernel: ACPI: Using GIC for interrupt routing Mar 25 01:30:22.194921 kernel: ACPI: MCFG table detected, 1 entries Mar 25 01:30:22.194939 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-0f]) Mar 25 01:30:22.196351 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Mar 25 01:30:22.196610 kernel: acpi 
PNP0A08:00: _OSC: platform does not support [LTR] Mar 25 01:30:22.196812 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Mar 25 01:30:22.197013 kernel: acpi PNP0A08:00: ECAM area [mem 0x20000000-0x20ffffff] reserved by PNP0C02:00 Mar 25 01:30:22.198344 kernel: acpi PNP0A08:00: ECAM at [mem 0x20000000-0x20ffffff] for [bus 00-0f] Mar 25 01:30:22.198826 kernel: ACPI: Remapped I/O 0x000000001fff0000 to [io 0x0000-0xffff window] Mar 25 01:30:22.199139 kernel: acpiphp: Slot [1] registered Mar 25 01:30:22.199164 kernel: acpiphp: Slot [2] registered Mar 25 01:30:22.199358 kernel: acpiphp: Slot [3] registered Mar 25 01:30:22.199389 kernel: acpiphp: Slot [4] registered Mar 25 01:30:22.199408 kernel: acpiphp: Slot [5] registered Mar 25 01:30:22.199425 kernel: acpiphp: Slot [6] registered Mar 25 01:30:22.199443 kernel: acpiphp: Slot [7] registered Mar 25 01:30:22.199460 kernel: acpiphp: Slot [8] registered Mar 25 01:30:22.199478 kernel: acpiphp: Slot [9] registered Mar 25 01:30:22.199495 kernel: acpiphp: Slot [10] registered Mar 25 01:30:22.199513 kernel: acpiphp: Slot [11] registered Mar 25 01:30:22.199530 kernel: acpiphp: Slot [12] registered Mar 25 01:30:22.199548 kernel: acpiphp: Slot [13] registered Mar 25 01:30:22.199570 kernel: acpiphp: Slot [14] registered Mar 25 01:30:22.199587 kernel: acpiphp: Slot [15] registered Mar 25 01:30:22.199605 kernel: acpiphp: Slot [16] registered Mar 25 01:30:22.199622 kernel: acpiphp: Slot [17] registered Mar 25 01:30:22.199640 kernel: acpiphp: Slot [18] registered Mar 25 01:30:22.199657 kernel: acpiphp: Slot [19] registered Mar 25 01:30:22.199674 kernel: acpiphp: Slot [20] registered Mar 25 01:30:22.199692 kernel: acpiphp: Slot [21] registered Mar 25 01:30:22.199709 kernel: acpiphp: Slot [22] registered Mar 25 01:30:22.199731 kernel: acpiphp: Slot [23] registered Mar 25 01:30:22.199748 kernel: acpiphp: Slot [24] registered Mar 25 01:30:22.199766 kernel: acpiphp: Slot [25] registered Mar 25 01:30:22.199783 kernel: acpiphp: Slot [26] registered Mar 25 01:30:22.199801 kernel: acpiphp: Slot [27] registered Mar 25 01:30:22.199818 kernel: acpiphp: Slot [28] registered Mar 25 01:30:22.199835 kernel: acpiphp: Slot [29] registered Mar 25 01:30:22.199853 kernel: acpiphp: Slot [30] registered Mar 25 01:30:22.199870 kernel: acpiphp: Slot [31] registered Mar 25 01:30:22.199887 kernel: PCI host bridge to bus 0000:00 Mar 25 01:30:22.200134 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xffffffff window] Mar 25 01:30:22.200415 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Mar 25 01:30:22.200599 kernel: pci_bus 0000:00: root bus resource [mem 0x400000000000-0x407fffffffff window] Mar 25 01:30:22.200783 kernel: pci_bus 0000:00: root bus resource [bus 00-0f] Mar 25 01:30:22.201031 kernel: pci 0000:00:00.0: [1d0f:0200] type 00 class 0x060000 Mar 25 01:30:22.203392 kernel: pci 0000:00:01.0: [1d0f:8250] type 00 class 0x070003 Mar 25 01:30:22.203672 kernel: pci 0000:00:01.0: reg 0x10: [mem 0x80118000-0x80118fff] Mar 25 01:30:22.203904 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802 Mar 25 01:30:22.204126 kernel: pci 0000:00:04.0: reg 0x10: [mem 0x80114000-0x80117fff] Mar 25 01:30:22.204411 kernel: pci 0000:00:04.0: PME# supported from D0 D1 D2 D3hot D3cold Mar 25 01:30:22.204641 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000 Mar 25 01:30:22.204848 kernel: pci 0000:00:05.0: reg 0x10: [mem 0x80110000-0x80113fff] Mar 25 01:30:22.205051 kernel: pci 0000:00:05.0: reg 0x18: [mem 
0x80000000-0x800fffff pref] Mar 25 01:30:22.207001 kernel: pci 0000:00:05.0: reg 0x20: [mem 0x80100000-0x8010ffff] Mar 25 01:30:22.207274 kernel: pci 0000:00:05.0: PME# supported from D0 D1 D2 D3hot D3cold Mar 25 01:30:22.207497 kernel: pci 0000:00:05.0: BAR 2: assigned [mem 0x80000000-0x800fffff pref] Mar 25 01:30:22.207711 kernel: pci 0000:00:05.0: BAR 4: assigned [mem 0x80100000-0x8010ffff] Mar 25 01:30:22.207925 kernel: pci 0000:00:04.0: BAR 0: assigned [mem 0x80110000-0x80113fff] Mar 25 01:30:22.208146 kernel: pci 0000:00:05.0: BAR 0: assigned [mem 0x80114000-0x80117fff] Mar 25 01:30:22.208396 kernel: pci 0000:00:01.0: BAR 0: assigned [mem 0x80118000-0x80118fff] Mar 25 01:30:22.208610 kernel: pci_bus 0000:00: resource 4 [mem 0x80000000-0xffffffff window] Mar 25 01:30:22.208799 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Mar 25 01:30:22.209044 kernel: pci_bus 0000:00: resource 6 [mem 0x400000000000-0x407fffffffff window] Mar 25 01:30:22.209073 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Mar 25 01:30:22.209093 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Mar 25 01:30:22.209111 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Mar 25 01:30:22.209130 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Mar 25 01:30:22.209148 kernel: iommu: Default domain type: Translated Mar 25 01:30:22.209173 kernel: iommu: DMA domain TLB invalidation policy: strict mode Mar 25 01:30:22.209269 kernel: efivars: Registered efivars operations Mar 25 01:30:22.209288 kernel: vgaarb: loaded Mar 25 01:30:22.209306 kernel: clocksource: Switched to clocksource arch_sys_counter Mar 25 01:30:22.209325 kernel: VFS: Disk quotas dquot_6.6.0 Mar 25 01:30:22.209343 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 25 01:30:22.209362 kernel: pnp: PnP ACPI init Mar 25 01:30:22.209594 kernel: system 00:00: [mem 0x20000000-0x2fffffff] could not be reserved Mar 25 01:30:22.209627 kernel: pnp: PnP ACPI: found 1 devices Mar 25 01:30:22.209646 kernel: NET: Registered PF_INET protocol family Mar 25 01:30:22.209665 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 25 01:30:22.209683 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 25 01:30:22.209701 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 25 01:30:22.209718 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 25 01:30:22.209737 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 25 01:30:22.209754 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 25 01:30:22.209772 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 25 01:30:22.209795 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 25 01:30:22.209813 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 25 01:30:22.209830 kernel: PCI: CLS 0 bytes, default 64 Mar 25 01:30:22.209848 kernel: kvm [1]: HYP mode not available Mar 25 01:30:22.209866 kernel: Initialise system trusted keyrings Mar 25 01:30:22.209885 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 25 01:30:22.209903 kernel: Key type asymmetric registered Mar 25 01:30:22.209920 kernel: Asymmetric key parser 'x509' registered Mar 25 01:30:22.209937 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 25 01:30:22.209959 kernel: io scheduler mq-deadline registered Mar 25 
01:30:22.209978 kernel: io scheduler kyber registered Mar 25 01:30:22.209995 kernel: io scheduler bfq registered Mar 25 01:30:22.210251 kernel: pl061_gpio ARMH0061:00: PL061 GPIO chip registered Mar 25 01:30:22.210279 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Mar 25 01:30:22.210298 kernel: ACPI: button: Power Button [PWRB] Mar 25 01:30:22.210317 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input1 Mar 25 01:30:22.210334 kernel: ACPI: button: Sleep Button [SLPB] Mar 25 01:30:22.210360 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 25 01:30:22.210379 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Mar 25 01:30:22.210593 kernel: serial 0000:00:01.0: enabling device (0010 -> 0012) Mar 25 01:30:22.210618 kernel: printk: console [ttyS0] disabled Mar 25 01:30:22.210637 kernel: 0000:00:01.0: ttyS0 at MMIO 0x80118000 (irq = 14, base_baud = 115200) is a 16550A Mar 25 01:30:22.210655 kernel: printk: console [ttyS0] enabled Mar 25 01:30:22.210673 kernel: printk: bootconsole [uart0] disabled Mar 25 01:30:22.210691 kernel: thunder_xcv, ver 1.0 Mar 25 01:30:22.210708 kernel: thunder_bgx, ver 1.0 Mar 25 01:30:22.210727 kernel: nicpf, ver 1.0 Mar 25 01:30:22.210750 kernel: nicvf, ver 1.0 Mar 25 01:30:22.210968 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 25 01:30:22.211232 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-03-25T01:30:21 UTC (1742866221) Mar 25 01:30:22.211269 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 25 01:30:22.211294 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 3 counters available Mar 25 01:30:22.211312 kernel: watchdog: Delayed init of the lockup detector failed: -19 Mar 25 01:30:22.211331 kernel: watchdog: Hard watchdog permanently disabled Mar 25 01:30:22.211357 kernel: NET: Registered PF_INET6 protocol family Mar 25 01:30:22.211375 kernel: Segment Routing with IPv6 Mar 25 01:30:22.211393 kernel: In-situ OAM (IOAM) with IPv6 Mar 25 01:30:22.211411 kernel: NET: Registered PF_PACKET protocol family Mar 25 01:30:22.211429 kernel: Key type dns_resolver registered Mar 25 01:30:22.211446 kernel: registered taskstats version 1 Mar 25 01:30:22.211464 kernel: Loading compiled-in X.509 certificates Mar 25 01:30:22.211481 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: ed4ababe871f0afac8b4236504477de11a6baf07' Mar 25 01:30:22.211499 kernel: Key type .fscrypt registered Mar 25 01:30:22.211516 kernel: Key type fscrypt-provisioning registered Mar 25 01:30:22.211537 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 25 01:30:22.211555 kernel: ima: Allocated hash algorithm: sha1 Mar 25 01:30:22.211572 kernel: ima: No architecture policies found Mar 25 01:30:22.211590 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 25 01:30:22.211608 kernel: clk: Disabling unused clocks Mar 25 01:30:22.211626 kernel: Freeing unused kernel memory: 38464K Mar 25 01:30:22.211643 kernel: Run /init as init process Mar 25 01:30:22.211660 kernel: with arguments: Mar 25 01:30:22.211678 kernel: /init Mar 25 01:30:22.211699 kernel: with environment: Mar 25 01:30:22.211716 kernel: HOME=/ Mar 25 01:30:22.211734 kernel: TERM=linux Mar 25 01:30:22.211751 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 25 01:30:22.211770 systemd[1]: Successfully made /usr/ read-only. 
Mar 25 01:30:22.211794 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 01:30:22.211814 systemd[1]: Detected virtualization amazon. Mar 25 01:30:22.211837 systemd[1]: Detected architecture arm64. Mar 25 01:30:22.211856 systemd[1]: Running in initrd. Mar 25 01:30:22.211874 systemd[1]: No hostname configured, using default hostname. Mar 25 01:30:22.211894 systemd[1]: Hostname set to . Mar 25 01:30:22.211913 systemd[1]: Initializing machine ID from VM UUID. Mar 25 01:30:22.211932 systemd[1]: Queued start job for default target initrd.target. Mar 25 01:30:22.211951 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:30:22.211970 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:30:22.211990 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 25 01:30:22.212014 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 25 01:30:22.212033 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 25 01:30:22.212054 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 25 01:30:22.212075 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 25 01:30:22.212094 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 25 01:30:22.212114 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:30:22.212138 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:30:22.212157 systemd[1]: Reached target paths.target - Path Units. Mar 25 01:30:22.212408 systemd[1]: Reached target slices.target - Slice Units. Mar 25 01:30:22.212435 systemd[1]: Reached target swap.target - Swaps. Mar 25 01:30:22.212455 systemd[1]: Reached target timers.target - Timer Units. Mar 25 01:30:22.212475 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 25 01:30:22.212495 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 01:30:22.212514 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 25 01:30:22.212534 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 25 01:30:22.212561 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:30:22.212581 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 01:30:22.212601 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:30:22.212620 systemd[1]: Reached target sockets.target - Socket Units. Mar 25 01:30:22.212640 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 25 01:30:22.212659 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 01:30:22.212679 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 25 01:30:22.212698 systemd[1]: Starting systemd-fsck-usr.service... 
Mar 25 01:30:22.212722 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 01:30:22.212742 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 01:30:22.212761 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:30:22.212780 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 25 01:30:22.212799 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:30:22.212819 systemd[1]: Finished systemd-fsck-usr.service. Mar 25 01:30:22.212843 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 25 01:30:22.212862 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:30:22.212882 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 25 01:30:22.212951 systemd-journald[250]: Collecting audit messages is disabled. Mar 25 01:30:22.212999 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:30:22.213019 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 25 01:30:22.213038 systemd-journald[250]: Journal started Mar 25 01:30:22.213075 systemd-journald[250]: Runtime Journal (/run/log/journal/ec2ae8f8800bd6b76ad03686abb1f6b4) is 8M, max 75.3M, 67.3M free. Mar 25 01:30:22.172312 systemd-modules-load[252]: Inserted module 'overlay' Mar 25 01:30:22.218754 kernel: Bridge firewalling registered Mar 25 01:30:22.218799 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 25 01:30:22.214758 systemd-modules-load[252]: Inserted module 'br_netfilter' Mar 25 01:30:22.234438 systemd[1]: Started systemd-journald.service - Journal Service. Mar 25 01:30:22.236681 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 01:30:22.247319 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:30:22.258469 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 01:30:22.264462 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 01:30:22.302772 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:30:22.308830 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:30:22.312983 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:30:22.319400 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 25 01:30:22.339432 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Mar 25 01:30:22.368942 dracut-cmdline[287]: dracut-dracut-053 Mar 25 01:30:22.377247 dracut-cmdline[287]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=b84e5f613acd6cd0a8a878f32f5653a14f2e6fb2820997fecd5b2bd33a4ba3ab Mar 25 01:30:22.436124 systemd-resolved[288]: Positive Trust Anchors: Mar 25 01:30:22.436869 systemd-resolved[288]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 25 01:30:22.436933 systemd-resolved[288]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 25 01:30:22.561356 kernel: SCSI subsystem initialized Mar 25 01:30:22.569310 kernel: Loading iSCSI transport class v2.0-870. Mar 25 01:30:22.581304 kernel: iscsi: registered transport (tcp) Mar 25 01:30:22.603900 kernel: iscsi: registered transport (qla4xxx) Mar 25 01:30:22.603972 kernel: QLogic iSCSI HBA Driver Mar 25 01:30:22.676463 kernel: random: crng init done Mar 25 01:30:22.676862 systemd-resolved[288]: Defaulting to hostname 'linux'. Mar 25 01:30:22.680356 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 25 01:30:22.684778 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:30:22.712257 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 25 01:30:22.717458 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 25 01:30:22.759333 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 25 01:30:22.759408 kernel: device-mapper: uevent: version 1.0.3 Mar 25 01:30:22.761132 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 25 01:30:22.826220 kernel: raid6: neonx8 gen() 6502 MB/s Mar 25 01:30:22.843209 kernel: raid6: neonx4 gen() 6481 MB/s Mar 25 01:30:22.860209 kernel: raid6: neonx2 gen() 5389 MB/s Mar 25 01:30:22.877208 kernel: raid6: neonx1 gen() 3924 MB/s Mar 25 01:30:22.894210 kernel: raid6: int64x8 gen() 3625 MB/s Mar 25 01:30:22.911212 kernel: raid6: int64x4 gen() 3706 MB/s Mar 25 01:30:22.928216 kernel: raid6: int64x2 gen() 3603 MB/s Mar 25 01:30:22.945975 kernel: raid6: int64x1 gen() 2767 MB/s Mar 25 01:30:22.946008 kernel: raid6: using algorithm neonx8 gen() 6502 MB/s Mar 25 01:30:22.963976 kernel: raid6: .... 
xor() 4763 MB/s, rmw enabled Mar 25 01:30:22.964034 kernel: raid6: using neon recovery algorithm Mar 25 01:30:22.971216 kernel: xor: measuring software checksum speed Mar 25 01:30:22.973289 kernel: 8regs : 11880 MB/sec Mar 25 01:30:22.973322 kernel: 32regs : 13011 MB/sec Mar 25 01:30:22.974449 kernel: arm64_neon : 9325 MB/sec Mar 25 01:30:22.974488 kernel: xor: using function: 32regs (13011 MB/sec) Mar 25 01:30:23.061247 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 25 01:30:23.087152 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 25 01:30:23.093727 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:30:23.144684 systemd-udevd[472]: Using default interface naming scheme 'v255'. Mar 25 01:30:23.155826 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:30:23.162959 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 25 01:30:23.208045 dracut-pre-trigger[475]: rd.md=0: removing MD RAID activation Mar 25 01:30:23.264101 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 25 01:30:23.270515 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 01:30:23.399642 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:30:23.408465 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 25 01:30:23.456963 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 25 01:30:23.462443 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 01:30:23.464858 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:30:23.467121 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 01:30:23.483899 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 25 01:30:23.536676 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 25 01:30:23.600903 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Mar 25 01:30:23.600968 kernel: ena 0000:00:05.0: enabling device (0010 -> 0012) Mar 25 01:30:23.636729 kernel: ena 0000:00:05.0: ENA device version: 0.10 Mar 25 01:30:23.636986 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1 Mar 25 01:30:23.637287 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem 80114000, mac addr 06:d7:30:28:e5:13 Mar 25 01:30:23.642173 (udev-worker)[545]: Network interface NamePolicy= disabled on kernel command line. Mar 25 01:30:23.650754 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 01:30:23.650995 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:30:23.656336 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:30:23.660881 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 25 01:30:23.692035 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Mar 25 01:30:23.692088 kernel: nvme nvme0: pci function 0000:00:04.0 Mar 25 01:30:23.661212 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:30:23.666172 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... 
Mar 25 01:30:23.708423 kernel: nvme nvme0: 2/0/0 default/read/poll queues Mar 25 01:30:23.698995 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:30:23.707582 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:30:23.726282 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 25 01:30:23.726356 kernel: GPT:9289727 != 16777215 Mar 25 01:30:23.726381 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 25 01:30:23.728168 kernel: GPT:9289727 != 16777215 Mar 25 01:30:23.728217 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 25 01:30:23.729295 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Mar 25 01:30:23.750229 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:30:23.756611 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:30:23.807365 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:30:23.936680 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Amazon Elastic Block Store EFI-SYSTEM. Mar 25 01:30:23.970219 kernel: BTRFS: device fsid bf348154-9cb1-474d-801c-0e035a5758cf devid 1 transid 39 /dev/nvme0n1p3 scanned by (udev-worker) (545) Mar 25 01:30:24.017543 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Amazon Elastic Block Store USR-A. Mar 25 01:30:24.019982 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Amazon Elastic Block Store USR-A. Mar 25 01:30:24.026742 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 25 01:30:24.055535 disk-uuid[628]: Primary Header is updated. Mar 25 01:30:24.055535 disk-uuid[628]: Secondary Entries is updated. Mar 25 01:30:24.055535 disk-uuid[628]: Secondary Header is updated. Mar 25 01:30:24.071097 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 scanned by (udev-worker) (516) Mar 25 01:30:24.138496 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Mar 25 01:30:24.165490 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Amazon Elastic Block Store ROOT. Mar 25 01:30:25.075708 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Mar 25 01:30:25.082741 disk-uuid[629]: The operation has completed successfully. Mar 25 01:30:25.283669 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 25 01:30:25.283927 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 25 01:30:25.391341 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 25 01:30:25.415004 sh[923]: Success Mar 25 01:30:25.433228 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" Mar 25 01:30:25.543132 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 25 01:30:25.549333 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 25 01:30:25.565361 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Mar 25 01:30:25.594227 kernel: BTRFS info (device dm-0): first mount of filesystem bf348154-9cb1-474d-801c-0e035a5758cf Mar 25 01:30:25.594302 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:30:25.596085 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 25 01:30:25.596154 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 25 01:30:25.598468 kernel: BTRFS info (device dm-0): using free space tree Mar 25 01:30:25.902232 kernel: BTRFS info (device dm-0): enabling ssd optimizations Mar 25 01:30:25.915299 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 25 01:30:25.919137 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 25 01:30:25.923885 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 25 01:30:25.931452 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 25 01:30:25.984240 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:30:25.984313 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:30:25.984349 kernel: BTRFS info (device nvme0n1p6): using free space tree Mar 25 01:30:25.992878 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Mar 25 01:30:26.000231 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:30:26.006232 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 25 01:30:26.013333 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 25 01:30:26.101249 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 25 01:30:26.107336 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 25 01:30:26.170933 systemd-networkd[1112]: lo: Link UP Mar 25 01:30:26.170953 systemd-networkd[1112]: lo: Gained carrier Mar 25 01:30:26.176116 systemd-networkd[1112]: Enumeration completed Mar 25 01:30:26.177489 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 25 01:30:26.178136 systemd-networkd[1112]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:30:26.178152 systemd-networkd[1112]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 25 01:30:26.182321 systemd[1]: Reached target network.target - Network. Mar 25 01:30:26.192857 systemd-networkd[1112]: eth0: Link UP Mar 25 01:30:26.192876 systemd-networkd[1112]: eth0: Gained carrier Mar 25 01:30:26.192894 systemd-networkd[1112]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Mar 25 01:30:26.215263 systemd-networkd[1112]: eth0: DHCPv4 address 172.31.28.242/20, gateway 172.31.16.1 acquired from 172.31.16.1 Mar 25 01:30:26.523578 ignition[1041]: Ignition 2.20.0 Mar 25 01:30:26.523600 ignition[1041]: Stage: fetch-offline Mar 25 01:30:26.524144 ignition[1041]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:30:26.524169 ignition[1041]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 25 01:30:26.531462 ignition[1041]: Ignition finished successfully Mar 25 01:30:26.535797 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 01:30:26.542478 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Mar 25 01:30:26.579661 ignition[1124]: Ignition 2.20.0 Mar 25 01:30:26.579693 ignition[1124]: Stage: fetch Mar 25 01:30:26.581274 ignition[1124]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:30:26.581300 ignition[1124]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 25 01:30:26.581771 ignition[1124]: PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 25 01:30:26.592479 ignition[1124]: PUT result: OK Mar 25 01:30:26.595395 ignition[1124]: parsed url from cmdline: "" Mar 25 01:30:26.595528 ignition[1124]: no config URL provided Mar 25 01:30:26.595698 ignition[1124]: reading system config file "/usr/lib/ignition/user.ign" Mar 25 01:30:26.595725 ignition[1124]: no config at "/usr/lib/ignition/user.ign" Mar 25 01:30:26.595761 ignition[1124]: PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 25 01:30:26.599297 ignition[1124]: PUT result: OK Mar 25 01:30:26.599379 ignition[1124]: GET http://169.254.169.254/2019-10-01/user-data: attempt #1 Mar 25 01:30:26.610836 unknown[1124]: fetched base config from "system" Mar 25 01:30:26.601550 ignition[1124]: GET result: OK Mar 25 01:30:26.610853 unknown[1124]: fetched base config from "system" Mar 25 01:30:26.601697 ignition[1124]: parsing config with SHA512: 58ac1a7550d6f490a11f0cd1b2f39943f0d1b4f5d4b982e05580463b1cf5ea8c2e92b79d75d6d447f089e8935602ccb5cae2dad0bfadcbbdc6befde08da340a3 Mar 25 01:30:26.610866 unknown[1124]: fetched user config from "aws" Mar 25 01:30:26.611790 ignition[1124]: fetch: fetch complete Mar 25 01:30:26.620222 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Mar 25 01:30:26.611803 ignition[1124]: fetch: fetch passed Mar 25 01:30:26.611891 ignition[1124]: Ignition finished successfully Mar 25 01:30:26.631410 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 25 01:30:26.675648 ignition[1130]: Ignition 2.20.0 Mar 25 01:30:26.675681 ignition[1130]: Stage: kargs Mar 25 01:30:26.676883 ignition[1130]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:30:26.676920 ignition[1130]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 25 01:30:26.677102 ignition[1130]: PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 25 01:30:26.679328 ignition[1130]: PUT result: OK Mar 25 01:30:26.689558 ignition[1130]: kargs: kargs passed Mar 25 01:30:26.689708 ignition[1130]: Ignition finished successfully Mar 25 01:30:26.696278 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 25 01:30:26.701285 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Mar 25 01:30:26.735450 ignition[1137]: Ignition 2.20.0 Mar 25 01:30:26.735471 ignition[1137]: Stage: disks Mar 25 01:30:26.735997 ignition[1137]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:30:26.736021 ignition[1137]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 25 01:30:26.736161 ignition[1137]: PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 25 01:30:26.740950 ignition[1137]: PUT result: OK Mar 25 01:30:26.748683 ignition[1137]: disks: disks passed Mar 25 01:30:26.748780 ignition[1137]: Ignition finished successfully Mar 25 01:30:26.753150 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 25 01:30:26.757385 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 25 01:30:26.760801 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 25 01:30:26.763893 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 25 01:30:26.765719 systemd[1]: Reached target sysinit.target - System Initialization. Mar 25 01:30:26.767548 systemd[1]: Reached target basic.target - Basic System. Mar 25 01:30:26.773328 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 25 01:30:26.970094 systemd-fsck[1146]: ROOT: clean, 14/553520 files, 52654/553472 blocks Mar 25 01:30:26.974396 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 25 01:30:26.981386 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 25 01:30:27.071263 kernel: EXT4-fs (nvme0n1p9): mounted filesystem a7a89271-ee7d-4bda-a834-705261d6cda9 r/w with ordered data mode. Quota mode: none. Mar 25 01:30:27.073822 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 25 01:30:27.077976 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 25 01:30:27.130998 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 25 01:30:27.154439 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 25 01:30:27.158632 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 25 01:30:27.162803 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 25 01:30:27.168148 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 01:30:27.182550 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 25 01:30:27.189483 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 25 01:30:27.200195 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/nvme0n1p6 scanned by mount (1165) Mar 25 01:30:27.204561 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:30:27.204624 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:30:27.205798 kernel: BTRFS info (device nvme0n1p6): using free space tree Mar 25 01:30:27.313402 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Mar 25 01:30:27.315074 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 25 01:30:27.811386 initrd-setup-root[1189]: cut: /sysroot/etc/passwd: No such file or directory Mar 25 01:30:27.821901 initrd-setup-root[1196]: cut: /sysroot/etc/group: No such file or directory Mar 25 01:30:27.825245 systemd-networkd[1112]: eth0: Gained IPv6LL Mar 25 01:30:27.854002 initrd-setup-root[1203]: cut: /sysroot/etc/shadow: No such file or directory Mar 25 01:30:27.863501 initrd-setup-root[1210]: cut: /sysroot/etc/gshadow: No such file or directory Mar 25 01:30:28.422911 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 25 01:30:28.430369 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 25 01:30:28.447429 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 25 01:30:28.461577 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 25 01:30:28.463965 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:30:28.510851 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 25 01:30:28.516237 ignition[1278]: INFO : Ignition 2.20.0 Mar 25 01:30:28.516237 ignition[1278]: INFO : Stage: mount Mar 25 01:30:28.519651 ignition[1278]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:30:28.519651 ignition[1278]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 25 01:30:28.519651 ignition[1278]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 25 01:30:28.526194 ignition[1278]: INFO : PUT result: OK Mar 25 01:30:28.532573 ignition[1278]: INFO : mount: mount passed Mar 25 01:30:28.534581 ignition[1278]: INFO : Ignition finished successfully Mar 25 01:30:28.538665 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 25 01:30:28.545694 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 25 01:30:28.580368 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 25 01:30:28.619216 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/nvme0n1p6 scanned by mount (1289) Mar 25 01:30:28.623869 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:30:28.623916 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:30:28.623942 kernel: BTRFS info (device nvme0n1p6): using free space tree Mar 25 01:30:28.629208 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Mar 25 01:30:28.632779 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 25 01:30:28.674997 ignition[1306]: INFO : Ignition 2.20.0 Mar 25 01:30:28.674997 ignition[1306]: INFO : Stage: files Mar 25 01:30:28.678556 ignition[1306]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:30:28.678556 ignition[1306]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 25 01:30:28.678556 ignition[1306]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 25 01:30:28.685218 ignition[1306]: INFO : PUT result: OK Mar 25 01:30:28.689309 ignition[1306]: DEBUG : files: compiled without relabeling support, skipping Mar 25 01:30:28.707367 ignition[1306]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 25 01:30:28.707367 ignition[1306]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 25 01:30:28.782056 ignition[1306]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 25 01:30:28.784593 ignition[1306]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 25 01:30:28.787081 unknown[1306]: wrote ssh authorized keys file for user: core Mar 25 01:30:28.789306 ignition[1306]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 25 01:30:28.792044 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Mar 25 01:30:28.795507 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1 Mar 25 01:30:28.879982 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 25 01:30:29.066155 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Mar 25 01:30:29.066155 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 25 01:30:29.075744 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 25 01:30:29.075744 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 25 01:30:29.075744 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 25 01:30:29.075744 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 25 01:30:29.075744 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 25 01:30:29.075744 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 25 01:30:29.075744 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 25 01:30:29.075744 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 25 01:30:29.075744 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 25 01:30:29.075744 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" Mar 25 01:30:29.075744 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" Mar 25 01:30:29.075744 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" Mar 25 01:30:29.075744 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.32.0-arm64.raw: attempt #1 Mar 25 01:30:29.555761 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 25 01:30:30.007676 ignition[1306]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" Mar 25 01:30:30.007676 ignition[1306]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 25 01:30:30.016384 ignition[1306]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 25 01:30:30.016384 ignition[1306]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 25 01:30:30.016384 ignition[1306]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 25 01:30:30.016384 ignition[1306]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 25 01:30:30.016384 ignition[1306]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Mar 25 01:30:30.016384 ignition[1306]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 25 01:30:30.016384 ignition[1306]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 25 01:30:30.016384 ignition[1306]: INFO : files: files passed Mar 25 01:30:30.016384 ignition[1306]: INFO : Ignition finished successfully Mar 25 01:30:30.028515 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 25 01:30:30.045883 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 25 01:30:30.051445 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 25 01:30:30.078155 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 25 01:30:30.079043 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 25 01:30:30.096284 initrd-setup-root-after-ignition[1335]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:30:30.096284 initrd-setup-root-after-ignition[1335]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:30:30.103882 initrd-setup-root-after-ignition[1339]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:30:30.109490 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 25 01:30:30.116112 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 25 01:30:30.120555 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... 
Mar 25 01:30:30.196918 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 25 01:30:30.197816 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 25 01:30:30.201808 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 25 01:30:30.205073 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 25 01:30:30.208753 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 25 01:30:30.210378 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 25 01:30:30.250472 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 01:30:30.259351 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 25 01:30:30.300639 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:30:30.305124 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:30:30.309764 systemd[1]: Stopped target timers.target - Timer Units. Mar 25 01:30:30.315003 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 25 01:30:30.315319 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 01:30:30.322014 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 25 01:30:30.325740 systemd[1]: Stopped target basic.target - Basic System. Mar 25 01:30:30.329121 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 25 01:30:30.335395 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 01:30:30.339557 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 25 01:30:30.343552 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 25 01:30:30.347566 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 01:30:30.349908 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 25 01:30:30.351950 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 25 01:30:30.359489 systemd[1]: Stopped target swap.target - Swaps. Mar 25 01:30:30.361206 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 25 01:30:30.361447 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 25 01:30:30.368633 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:30:30.370739 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:30:30.373682 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 25 01:30:30.374632 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:30:30.378003 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 25 01:30:30.378266 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 25 01:30:30.384707 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 25 01:30:30.384944 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 25 01:30:30.385320 systemd[1]: ignition-files.service: Deactivated successfully. Mar 25 01:30:30.385515 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 25 01:30:30.391585 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... 
Mar 25 01:30:30.404326 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 25 01:30:30.406699 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:30:30.417806 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 25 01:30:30.427499 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 25 01:30:30.428323 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:30:30.445900 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 25 01:30:30.446325 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 25 01:30:30.474819 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 25 01:30:30.480798 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 25 01:30:30.481324 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 25 01:30:30.491258 ignition[1359]: INFO : Ignition 2.20.0 Mar 25 01:30:30.491258 ignition[1359]: INFO : Stage: umount Mar 25 01:30:30.495675 ignition[1359]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:30:30.495675 ignition[1359]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 25 01:30:30.495675 ignition[1359]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 25 01:30:30.503066 ignition[1359]: INFO : PUT result: OK Mar 25 01:30:30.507111 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 25 01:30:30.510443 ignition[1359]: INFO : umount: umount passed Mar 25 01:30:30.510443 ignition[1359]: INFO : Ignition finished successfully Mar 25 01:30:30.510324 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 25 01:30:30.519431 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 25 01:30:30.519859 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 25 01:30:30.527307 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 25 01:30:30.527407 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 25 01:30:30.529790 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 25 01:30:30.529873 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 25 01:30:30.534855 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 25 01:30:30.534948 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Mar 25 01:30:30.537603 systemd[1]: Stopped target network.target - Network. Mar 25 01:30:30.549065 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 25 01:30:30.549214 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 01:30:30.555061 systemd[1]: Stopped target paths.target - Path Units. Mar 25 01:30:30.556674 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 25 01:30:30.558245 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:30:30.561290 systemd[1]: Stopped target slices.target - Slice Units. Mar 25 01:30:30.562864 systemd[1]: Stopped target sockets.target - Socket Units. Mar 25 01:30:30.564564 systemd[1]: iscsid.socket: Deactivated successfully. Mar 25 01:30:30.564645 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 25 01:30:30.566399 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 25 01:30:30.566469 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 01:30:30.568271 systemd[1]: ignition-setup.service: Deactivated successfully. 
Mar 25 01:30:30.568359 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 25 01:30:30.570139 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 25 01:30:30.570244 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 25 01:30:30.572791 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 25 01:30:30.572869 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 25 01:30:30.577804 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 25 01:30:30.581736 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 25 01:30:30.597824 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 25 01:30:30.598070 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 25 01:30:30.623805 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 25 01:30:30.624600 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 25 01:30:30.625145 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 25 01:30:30.633895 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 25 01:30:30.635794 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 25 01:30:30.636301 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:30:30.645432 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 25 01:30:30.647691 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 25 01:30:30.648092 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 25 01:30:30.655977 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 25 01:30:30.656086 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:30:30.662644 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 25 01:30:30.662753 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 25 01:30:30.672894 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 25 01:30:30.673009 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:30:30.677311 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:30:30.687534 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 25 01:30:30.687687 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:30:30.704629 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 25 01:30:30.708284 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:30:30.714260 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 25 01:30:30.714384 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 25 01:30:30.720322 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 25 01:30:30.720418 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:30:30.722571 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 25 01:30:30.724417 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 25 01:30:30.731852 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 25 01:30:30.731973 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. 
Mar 25 01:30:30.734200 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 01:30:30.734305 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:30:30.747544 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 25 01:30:30.752308 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 25 01:30:30.752454 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:30:30.756028 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 25 01:30:30.756144 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:30:30.773035 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Mar 25 01:30:30.773253 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:30:30.778280 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 25 01:30:30.778684 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 25 01:30:30.797465 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 25 01:30:30.799426 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 25 01:30:30.806094 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 25 01:30:30.809993 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 25 01:30:30.838727 systemd[1]: Switching root. Mar 25 01:30:30.921060 systemd-journald[250]: Journal stopped Mar 25 01:30:33.850113 systemd-journald[250]: Received SIGTERM from PID 1 (systemd). Mar 25 01:30:33.854299 kernel: SELinux: policy capability network_peer_controls=1 Mar 25 01:30:33.854353 kernel: SELinux: policy capability open_perms=1 Mar 25 01:30:33.854385 kernel: SELinux: policy capability extended_socket_class=1 Mar 25 01:30:33.854415 kernel: SELinux: policy capability always_check_network=0 Mar 25 01:30:33.854446 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 25 01:30:33.854485 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 25 01:30:33.854515 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 25 01:30:33.854545 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 25 01:30:33.854584 kernel: audit: type=1403 audit(1742866231.493:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 25 01:30:33.854617 systemd[1]: Successfully loaded SELinux policy in 90.187ms. Mar 25 01:30:33.854669 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 24.701ms. Mar 25 01:30:33.854702 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 01:30:33.854734 systemd[1]: Detected virtualization amazon. Mar 25 01:30:33.854765 systemd[1]: Detected architecture arm64. Mar 25 01:30:33.854800 systemd[1]: Detected first boot. Mar 25 01:30:33.854830 systemd[1]: Initializing machine ID from VM UUID. Mar 25 01:30:33.854860 zram_generator::config[1404]: No configuration found. Mar 25 01:30:33.854892 kernel: NET: Registered PF_VSOCK protocol family Mar 25 01:30:33.854920 systemd[1]: Populated /etc with preset unit settings. 
Mar 25 01:30:33.854951 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 25 01:30:33.854983 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 25 01:30:33.855015 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 25 01:30:33.855051 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 25 01:30:33.855083 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 25 01:30:33.855112 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 25 01:30:33.855147 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 25 01:30:33.855200 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 25 01:30:33.855263 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 25 01:30:33.855301 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 25 01:30:33.855335 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 25 01:30:33.855367 systemd[1]: Created slice user.slice - User and Session Slice. Mar 25 01:30:33.855409 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:30:33.855440 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:30:33.855470 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 25 01:30:33.855501 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 25 01:30:33.855531 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 25 01:30:33.855575 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 25 01:30:33.855607 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Mar 25 01:30:33.855644 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:30:33.855674 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 25 01:30:33.855709 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 25 01:30:33.855742 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 25 01:30:33.855771 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 25 01:30:33.855800 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:30:33.855836 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 01:30:33.855866 systemd[1]: Reached target slices.target - Slice Units. Mar 25 01:30:33.855900 systemd[1]: Reached target swap.target - Swaps. Mar 25 01:30:33.855934 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 25 01:30:33.855974 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 25 01:30:33.856007 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 25 01:30:33.856038 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:30:33.856069 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 01:30:33.856098 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Mar 25 01:30:33.856130 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 25 01:30:33.856160 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 25 01:30:33.866505 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 25 01:30:33.866596 systemd[1]: Mounting media.mount - External Media Directory... Mar 25 01:30:33.866628 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 25 01:30:33.866662 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 25 01:30:33.866691 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 25 01:30:33.866724 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 25 01:30:33.866753 systemd[1]: Reached target machines.target - Containers. Mar 25 01:30:33.866804 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 25 01:30:33.866839 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:30:33.866910 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 01:30:33.866984 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 25 01:30:33.867019 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:30:33.867048 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 25 01:30:33.867080 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:30:33.867109 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 25 01:30:33.867137 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 25 01:30:33.867166 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 25 01:30:33.867215 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 25 01:30:33.867253 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 25 01:30:33.867282 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 25 01:30:33.867310 systemd[1]: Stopped systemd-fsck-usr.service. Mar 25 01:30:33.867342 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:30:33.867373 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 01:30:33.867401 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 01:30:33.867429 kernel: loop: module loaded Mar 25 01:30:33.867458 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 25 01:30:33.867486 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 25 01:30:33.867519 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Mar 25 01:30:33.867547 kernel: fuse: init (API version 7.39) Mar 25 01:30:33.867574 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 01:30:33.867604 systemd[1]: verity-setup.service: Deactivated successfully. 
Mar 25 01:30:33.867633 systemd[1]: Stopped verity-setup.service. Mar 25 01:30:33.867669 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 25 01:30:33.867748 systemd-journald[1483]: Collecting audit messages is disabled. Mar 25 01:30:33.867798 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 25 01:30:33.867827 systemd[1]: Mounted media.mount - External Media Directory. Mar 25 01:30:33.867856 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 25 01:30:33.867885 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 25 01:30:33.867913 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 25 01:30:33.867946 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:30:33.867981 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 25 01:30:33.868012 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 25 01:30:33.868040 systemd-journald[1483]: Journal started Mar 25 01:30:33.868090 systemd-journald[1483]: Runtime Journal (/run/log/journal/ec2ae8f8800bd6b76ad03686abb1f6b4) is 8M, max 75.3M, 67.3M free. Mar 25 01:30:33.325725 systemd[1]: Queued start job for default target multi-user.target. Mar 25 01:30:33.881173 systemd[1]: Started systemd-journald.service - Journal Service. Mar 25 01:30:33.336487 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. Mar 25 01:30:33.337384 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 25 01:30:33.878557 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:30:33.879055 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:30:33.894418 kernel: ACPI: bus type drm_connector registered Mar 25 01:30:33.890797 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 25 01:30:33.892145 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 25 01:30:33.897908 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:30:33.899537 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:30:33.905819 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 25 01:30:33.907289 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 25 01:30:33.913108 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:30:33.913747 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:30:33.916634 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 25 01:30:33.925244 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 25 01:30:33.928523 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 01:30:33.960055 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 25 01:30:33.967674 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 25 01:30:33.973409 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 25 01:30:33.975650 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 25 01:30:33.975740 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 25 01:30:33.982384 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. 
Mar 25 01:30:33.991600 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 25 01:30:33.999660 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 25 01:30:34.002107 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:30:34.019692 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 25 01:30:34.024831 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 25 01:30:34.027378 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 25 01:30:34.032491 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 25 01:30:34.034632 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:30:34.044580 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 01:30:34.053604 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 25 01:30:34.062327 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 25 01:30:34.065253 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 25 01:30:34.067873 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 25 01:30:34.070148 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 25 01:30:34.086288 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 25 01:30:34.096608 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 25 01:30:34.121018 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 25 01:30:34.123726 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 25 01:30:34.131615 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Mar 25 01:30:34.141276 systemd-journald[1483]: Time spent on flushing to /var/log/journal/ec2ae8f8800bd6b76ad03686abb1f6b4 is 68.503ms for 919 entries. Mar 25 01:30:34.141276 systemd-journald[1483]: System Journal (/var/log/journal/ec2ae8f8800bd6b76ad03686abb1f6b4) is 8M, max 195.6M, 187.6M free. Mar 25 01:30:34.221110 systemd-journald[1483]: Received client request to flush runtime journal. Mar 25 01:30:34.223265 kernel: loop0: detected capacity change from 0 to 201592 Mar 25 01:30:34.223844 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 25 01:30:34.232850 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 25 01:30:34.246224 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 25 01:30:34.252818 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:30:34.290287 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 25 01:30:34.298953 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 25 01:30:34.301164 kernel: loop1: detected capacity change from 0 to 126448 Mar 25 01:30:34.341550 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 25 01:30:34.360996 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. 
Mar 25 01:30:34.369014 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 25 01:30:34.390477 systemd-tmpfiles[1557]: ACLs are not supported, ignoring. Mar 25 01:30:34.390518 systemd-tmpfiles[1557]: ACLs are not supported, ignoring. Mar 25 01:30:34.410314 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:30:34.422653 udevadm[1560]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Mar 25 01:30:34.578244 kernel: loop2: detected capacity change from 0 to 54976 Mar 25 01:30:34.662775 kernel: loop3: detected capacity change from 0 to 103832 Mar 25 01:30:34.832231 kernel: loop4: detected capacity change from 0 to 201592 Mar 25 01:30:34.860235 kernel: loop5: detected capacity change from 0 to 126448 Mar 25 01:30:34.875222 kernel: loop6: detected capacity change from 0 to 54976 Mar 25 01:30:34.895243 kernel: loop7: detected capacity change from 0 to 103832 Mar 25 01:30:34.903638 (sd-merge)[1565]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'. Mar 25 01:30:34.904679 (sd-merge)[1565]: Merged extensions into '/usr'. Mar 25 01:30:34.912427 systemd[1]: Reload requested from client PID 1537 ('systemd-sysext') (unit systemd-sysext.service)... Mar 25 01:30:34.912460 systemd[1]: Reloading... Mar 25 01:30:35.156210 zram_generator::config[1591]: No configuration found. Mar 25 01:30:35.461475 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:30:35.610722 systemd[1]: Reloading finished in 697 ms. Mar 25 01:30:35.635844 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 25 01:30:35.638898 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 25 01:30:35.658325 systemd[1]: Starting ensure-sysext.service... Mar 25 01:30:35.664482 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 01:30:35.676929 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:30:35.713568 systemd[1]: Reload requested from client PID 1645 ('systemctl') (unit ensure-sysext.service)... Mar 25 01:30:35.713606 systemd[1]: Reloading... Mar 25 01:30:35.733398 systemd-tmpfiles[1646]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 25 01:30:35.734636 systemd-tmpfiles[1646]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 25 01:30:35.737116 systemd-tmpfiles[1646]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 25 01:30:35.738092 systemd-tmpfiles[1646]: ACLs are not supported, ignoring. Mar 25 01:30:35.738530 systemd-tmpfiles[1646]: ACLs are not supported, ignoring. Mar 25 01:30:35.749886 systemd-tmpfiles[1646]: Detected autofs mount point /boot during canonicalization of boot. Mar 25 01:30:35.750394 systemd-tmpfiles[1646]: Skipping /boot Mar 25 01:30:35.781024 systemd-tmpfiles[1646]: Detected autofs mount point /boot during canonicalization of boot. Mar 25 01:30:35.781052 systemd-tmpfiles[1646]: Skipping /boot Mar 25 01:30:35.861090 systemd-udevd[1647]: Using default interface naming scheme 'v255'. 
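The (sd-merge) entries above show systemd-sysext overlaying the containerd-flatcar, docker-flatcar, kubernetes, and oem-ami extension images onto /usr, followed by the service reload. A minimal sketch for inspecting which extension images are staged on disk, assuming the usual sysext search directories (only /etc/extensions is actually evidenced in this log, via the kubernetes.raw symlink written during the Ignition files stage; the helper below is hypothetical):

    from pathlib import Path

    # Directories systemd-sysext commonly scans for *.raw images (assumption;
    # only /etc/extensions appears in the log above).
    SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

    def list_sysext_images():
        for d in SEARCH_DIRS:
            for image in sorted(Path(d).glob("*.raw")):
                # Symlinks such as kubernetes.raw -> /opt/extensions/... resolve
                # to the backing image on the root filesystem.
                print(f"{image} -> {image.resolve()}")

    if __name__ == "__main__":
        list_sysext_images()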
Mar 25 01:30:35.889218 zram_generator::config[1677]: No configuration found. Mar 25 01:30:36.216138 (udev-worker)[1728]: Network interface NamePolicy= disabled on kernel command line. Mar 25 01:30:36.329441 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:30:36.538518 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Mar 25 01:30:36.539336 systemd[1]: Reloading finished in 824 ms. Mar 25 01:30:36.556049 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:30:36.581498 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:30:36.618230 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 39 scanned by (udev-worker) (1728) Mar 25 01:30:36.639292 systemd[1]: Finished ensure-sysext.service. Mar 25 01:30:36.655627 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 25 01:30:36.662011 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 25 01:30:36.664598 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:30:36.670853 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:30:36.680591 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 25 01:30:36.692127 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:30:36.701653 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 25 01:30:36.706609 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:30:36.706696 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:30:36.712664 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 25 01:30:36.729157 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 25 01:30:36.743644 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 25 01:30:36.748532 systemd[1]: Reached target time-set.target - System Time Set. Mar 25 01:30:36.759584 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 25 01:30:36.769577 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:30:36.798596 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 25 01:30:36.803961 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:30:36.806279 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:30:36.889309 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:30:36.891628 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:30:36.901245 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 25 01:30:36.907964 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 25 01:30:36.909883 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. 
Mar 25 01:30:36.911982 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:30:36.977858 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 25 01:30:36.988027 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:30:36.991667 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:30:36.992989 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 25 01:30:37.068941 augenrules[1883]: No rules Mar 25 01:30:37.073772 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:30:37.076375 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 25 01:30:37.170263 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:30:37.176608 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 25 01:30:37.186707 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 25 01:30:37.211379 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 25 01:30:37.234635 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Mar 25 01:30:37.250785 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 25 01:30:37.262494 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Mar 25 01:30:37.268665 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 25 01:30:37.346277 lvm[1899]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 25 01:30:37.368215 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 25 01:30:37.389293 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 25 01:30:37.392766 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:30:37.401520 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Mar 25 01:30:37.444563 systemd-resolved[1798]: Positive Trust Anchors: Mar 25 01:30:37.444601 systemd-resolved[1798]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 25 01:30:37.444664 systemd-resolved[1798]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 25 01:30:37.449076 lvm[1906]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 25 01:30:37.454853 systemd-resolved[1798]: Defaulting to hostname 'linux'. Mar 25 01:30:37.458364 systemd[1]: Started systemd-resolved.service - Network Name Resolution. 
Mar 25 01:30:37.461413 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:30:37.486156 systemd-networkd[1793]: lo: Link UP Mar 25 01:30:37.486802 systemd-networkd[1793]: lo: Gained carrier Mar 25 01:30:37.490765 systemd-networkd[1793]: Enumeration completed Mar 25 01:30:37.490938 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 25 01:30:37.493311 systemd[1]: Reached target network.target - Network. Mar 25 01:30:37.497815 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Mar 25 01:30:37.509732 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 25 01:30:37.512062 systemd-networkd[1793]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:30:37.512083 systemd-networkd[1793]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 25 01:30:37.515307 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Mar 25 01:30:37.522290 systemd-networkd[1793]: eth0: Link UP Mar 25 01:30:37.522616 systemd-networkd[1793]: eth0: Gained carrier Mar 25 01:30:37.522652 systemd-networkd[1793]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:30:37.532330 systemd-networkd[1793]: eth0: DHCPv4 address 172.31.28.242/20, gateway 172.31.16.1 acquired from 172.31.16.1 Mar 25 01:30:37.555965 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 25 01:30:38.174595 ldconfig[1532]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 25 01:30:38.180077 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 25 01:30:38.186486 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 25 01:30:38.219563 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 25 01:30:38.222924 systemd[1]: Reached target sysinit.target - System Initialization. Mar 25 01:30:38.225450 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 25 01:30:38.227954 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 25 01:30:38.230557 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 25 01:30:38.232973 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 25 01:30:38.235354 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 25 01:30:38.237634 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 25 01:30:38.237794 systemd[1]: Reached target paths.target - Path Units. Mar 25 01:30:38.239693 systemd[1]: Reached target timers.target - Timer Units. Mar 25 01:30:38.242354 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 25 01:30:38.247291 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 25 01:30:38.254561 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). 
Mar 25 01:30:38.258153 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 25 01:30:38.260815 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 25 01:30:38.272904 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 25 01:30:38.276593 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 25 01:30:38.280431 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 25 01:30:38.282875 systemd[1]: Reached target sockets.target - Socket Units. Mar 25 01:30:38.284842 systemd[1]: Reached target basic.target - Basic System. Mar 25 01:30:38.286781 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 25 01:30:38.286853 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 25 01:30:38.288929 systemd[1]: Starting containerd.service - containerd container runtime... Mar 25 01:30:38.293843 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 25 01:30:38.300741 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 25 01:30:38.310940 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 25 01:30:38.317226 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 25 01:30:38.319313 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 25 01:30:38.330110 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 25 01:30:38.337655 systemd[1]: Started ntpd.service - Network Time Service. Mar 25 01:30:38.345452 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 25 01:30:38.350973 systemd[1]: Starting setup-oem.service - Setup OEM... Mar 25 01:30:38.364831 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 25 01:30:38.377954 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 25 01:30:38.386943 jq[1920]: false Mar 25 01:30:38.395645 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 25 01:30:38.401026 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 25 01:30:38.405380 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 25 01:30:38.412561 systemd[1]: Starting update-engine.service - Update Engine... Mar 25 01:30:38.421955 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 25 01:30:38.426684 dbus-daemon[1919]: [system] SELinux support is enabled Mar 25 01:30:38.432238 dbus-daemon[1919]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.0' (uid=244 pid=1793 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Mar 25 01:30:38.440372 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 25 01:30:38.447966 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 25 01:30:38.451420 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. 
Mar 25 01:30:38.512863 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 25 01:30:38.515290 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 25 01:30:38.527884 dbus-daemon[1919]: [system] Successfully activated service 'org.freedesktop.systemd1' Mar 25 01:30:38.533112 tar[1935]: linux-arm64/LICENSE Mar 25 01:30:38.533112 tar[1935]: linux-arm64/helm Mar 25 01:30:38.530793 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 25 01:30:38.533479 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 25 01:30:38.533521 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 25 01:30:38.536405 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 25 01:30:38.536458 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 25 01:30:38.550899 jq[1931]: true Mar 25 01:30:38.559578 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Mar 25 01:30:38.580002 (ntainerd)[1946]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 25 01:30:38.592407 extend-filesystems[1921]: Found loop4 Mar 25 01:30:38.592407 extend-filesystems[1921]: Found loop5 Mar 25 01:30:38.592407 extend-filesystems[1921]: Found loop6 Mar 25 01:30:38.604782 extend-filesystems[1921]: Found loop7 Mar 25 01:30:38.604782 extend-filesystems[1921]: Found nvme0n1 Mar 25 01:30:38.604782 extend-filesystems[1921]: Found nvme0n1p1 Mar 25 01:30:38.604782 extend-filesystems[1921]: Found nvme0n1p2 Mar 25 01:30:38.604782 extend-filesystems[1921]: Found nvme0n1p3 Mar 25 01:30:38.612902 extend-filesystems[1921]: Found usr Mar 25 01:30:38.612902 extend-filesystems[1921]: Found nvme0n1p4 Mar 25 01:30:38.612902 extend-filesystems[1921]: Found nvme0n1p6 Mar 25 01:30:38.612902 extend-filesystems[1921]: Found nvme0n1p7 Mar 25 01:30:38.612902 extend-filesystems[1921]: Found nvme0n1p9 Mar 25 01:30:38.612902 extend-filesystems[1921]: Checking size of /dev/nvme0n1p9 Mar 25 01:30:38.618430 systemd[1]: motdgen.service: Deactivated successfully. Mar 25 01:30:38.618960 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 25 01:30:38.673614 jq[1954]: true Mar 25 01:30:38.692133 systemd[1]: Finished setup-oem.service - Setup OEM. Mar 25 01:30:38.797315 ntpd[1923]: ntpd 4.2.8p17@1.4004-o Mon Mar 24 23:09:33 UTC 2025 (1): Starting Mar 25 01:30:38.799541 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: ntpd 4.2.8p17@1.4004-o Mon Mar 24 23:09:33 UTC 2025 (1): Starting Mar 25 01:30:38.799541 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Mar 25 01:30:38.799541 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: ---------------------------------------------------- Mar 25 01:30:38.799541 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: ntp-4 is maintained by Network Time Foundation, Mar 25 01:30:38.799541 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Mar 25 01:30:38.799541 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: corporation. 
Support and training for ntp-4 are Mar 25 01:30:38.799541 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: available at https://www.nwtime.org/support Mar 25 01:30:38.799541 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: ---------------------------------------------------- Mar 25 01:30:38.797377 ntpd[1923]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Mar 25 01:30:38.797397 ntpd[1923]: ---------------------------------------------------- Mar 25 01:30:38.797417 ntpd[1923]: ntp-4 is maintained by Network Time Foundation, Mar 25 01:30:38.797435 ntpd[1923]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Mar 25 01:30:38.797454 ntpd[1923]: corporation. Support and training for ntp-4 are Mar 25 01:30:38.797471 ntpd[1923]: available at https://www.nwtime.org/support Mar 25 01:30:38.797489 ntpd[1923]: ---------------------------------------------------- Mar 25 01:30:38.800888 ntpd[1923]: proto: precision = 0.096 usec (-23) Mar 25 01:30:38.814352 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: proto: precision = 0.096 usec (-23) Mar 25 01:30:38.814352 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: basedate set to 2025-03-12 Mar 25 01:30:38.814352 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: gps base set to 2025-03-16 (week 2358) Mar 25 01:30:38.814352 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: Listen and drop on 0 v6wildcard [::]:123 Mar 25 01:30:38.814352 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Mar 25 01:30:38.814352 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: Listen normally on 2 lo 127.0.0.1:123 Mar 25 01:30:38.814352 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: Listen normally on 3 eth0 172.31.28.242:123 Mar 25 01:30:38.814352 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: Listen normally on 4 lo [::1]:123 Mar 25 01:30:38.814352 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: bind(21) AF_INET6 fe80::4d7:30ff:fe28:e513%2#123 flags 0x11 failed: Cannot assign requested address Mar 25 01:30:38.814352 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: unable to create socket on eth0 (5) for fe80::4d7:30ff:fe28:e513%2#123 Mar 25 01:30:38.814352 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: failed to init interface for address fe80::4d7:30ff:fe28:e513%2 Mar 25 01:30:38.814352 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: Listening on routing socket on fd #21 for interface updates Mar 25 01:30:38.803978 ntpd[1923]: basedate set to 2025-03-12 Mar 25 01:30:38.804015 ntpd[1923]: gps base set to 2025-03-16 (week 2358) Mar 25 01:30:38.806692 ntpd[1923]: Listen and drop on 0 v6wildcard [::]:123 Mar 25 01:30:38.806786 ntpd[1923]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Mar 25 01:30:38.807093 ntpd[1923]: Listen normally on 2 lo 127.0.0.1:123 Mar 25 01:30:38.807168 ntpd[1923]: Listen normally on 3 eth0 172.31.28.242:123 Mar 25 01:30:38.809363 ntpd[1923]: Listen normally on 4 lo [::1]:123 Mar 25 01:30:38.809467 ntpd[1923]: bind(21) AF_INET6 fe80::4d7:30ff:fe28:e513%2#123 flags 0x11 failed: Cannot assign requested address Mar 25 01:30:38.809512 ntpd[1923]: unable to create socket on eth0 (5) for fe80::4d7:30ff:fe28:e513%2#123 Mar 25 01:30:38.809540 ntpd[1923]: failed to init interface for address fe80::4d7:30ff:fe28:e513%2 Mar 25 01:30:38.809609 ntpd[1923]: Listening on routing socket on fd #21 for interface updates Mar 25 01:30:38.818248 ntpd[1923]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 25 01:30:38.823103 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 25 01:30:38.823103 ntpd[1923]: 25 Mar 01:30:38 ntpd[1923]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 25 
01:30:38.818324 ntpd[1923]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 25 01:30:38.863544 update_engine[1930]: I20250325 01:30:38.862062 1930 main.cc:92] Flatcar Update Engine starting Mar 25 01:30:38.867704 systemd[1]: Started update-engine.service - Update Engine. Mar 25 01:30:38.876023 update_engine[1930]: I20250325 01:30:38.873364 1930 update_check_scheduler.cc:74] Next update check in 5m49s Mar 25 01:30:38.890811 systemd-logind[1929]: Watching system buttons on /dev/input/event0 (Power Button) Mar 25 01:30:38.890865 systemd-logind[1929]: Watching system buttons on /dev/input/event1 (Sleep Button) Mar 25 01:30:38.891257 systemd-logind[1929]: New seat seat0. Mar 25 01:30:38.921550 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 25 01:30:38.923857 systemd[1]: Started systemd-logind.service - User Login Management. Mar 25 01:30:38.966630 extend-filesystems[1921]: Resized partition /dev/nvme0n1p9 Mar 25 01:30:38.987456 extend-filesystems[1987]: resize2fs 1.47.2 (1-Jan-2025) Mar 25 01:30:39.008218 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 1489915 blocks Mar 25 01:30:39.028889 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Mar 25 01:30:39.031534 dbus-daemon[1919]: [system] Successfully activated service 'org.freedesktop.hostname1' Mar 25 01:30:39.033450 dbus-daemon[1919]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.5' (uid=0 pid=1953 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Mar 25 01:30:39.043383 systemd[1]: Starting polkit.service - Authorization Manager... Mar 25 01:30:39.103215 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 1489915 Mar 25 01:30:39.105956 coreos-metadata[1918]: Mar 25 01:30:39.105 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Mar 25 01:30:39.115961 coreos-metadata[1918]: Mar 25 01:30:39.115 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-id: Attempt #1 Mar 25 01:30:39.127267 coreos-metadata[1918]: Mar 25 01:30:39.116 INFO Fetch successful Mar 25 01:30:39.127267 coreos-metadata[1918]: Mar 25 01:30:39.116 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-type: Attempt #1 Mar 25 01:30:39.127267 coreos-metadata[1918]: Mar 25 01:30:39.117 INFO Fetch successful Mar 25 01:30:39.127267 coreos-metadata[1918]: Mar 25 01:30:39.117 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/local-ipv4: Attempt #1 Mar 25 01:30:39.127267 coreos-metadata[1918]: Mar 25 01:30:39.118 INFO Fetch successful Mar 25 01:30:39.127267 coreos-metadata[1918]: Mar 25 01:30:39.118 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-ipv4: Attempt #1 Mar 25 01:30:39.127267 coreos-metadata[1918]: Mar 25 01:30:39.119 INFO Fetch successful Mar 25 01:30:39.127267 coreos-metadata[1918]: Mar 25 01:30:39.119 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/ipv6: Attempt #1 Mar 25 01:30:39.127267 coreos-metadata[1918]: Mar 25 01:30:39.119 INFO Fetch failed with 404: resource not found Mar 25 01:30:39.127267 coreos-metadata[1918]: Mar 25 01:30:39.119 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone: Attempt #1 Mar 25 01:30:39.127267 coreos-metadata[1918]: Mar 25 01:30:39.120 INFO Fetch successful Mar 25 01:30:39.127267 coreos-metadata[1918]: Mar 25 01:30:39.120 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone-id: Attempt #1 Mar 25 01:30:39.127267 coreos-metadata[1918]: Mar 25 
01:30:39.124 INFO Fetch successful Mar 25 01:30:39.127267 coreos-metadata[1918]: Mar 25 01:30:39.124 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/hostname: Attempt #1 Mar 25 01:30:39.127267 coreos-metadata[1918]: Mar 25 01:30:39.125 INFO Fetch successful Mar 25 01:30:39.127267 coreos-metadata[1918]: Mar 25 01:30:39.125 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-hostname: Attempt #1 Mar 25 01:30:39.128230 extend-filesystems[1987]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required Mar 25 01:30:39.128230 extend-filesystems[1987]: old_desc_blocks = 1, new_desc_blocks = 1 Mar 25 01:30:39.128230 extend-filesystems[1987]: The filesystem on /dev/nvme0n1p9 is now 1489915 (4k) blocks long. Mar 25 01:30:39.161072 coreos-metadata[1918]: Mar 25 01:30:39.130 INFO Fetch successful Mar 25 01:30:39.161072 coreos-metadata[1918]: Mar 25 01:30:39.130 INFO Fetching http://169.254.169.254/2021-01-03/dynamic/instance-identity/document: Attempt #1 Mar 25 01:30:39.161072 coreos-metadata[1918]: Mar 25 01:30:39.135 INFO Fetch successful Mar 25 01:30:39.161362 bash[1980]: Updated "/home/core/.ssh/authorized_keys" Mar 25 01:30:39.132773 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 25 01:30:39.161635 extend-filesystems[1921]: Resized filesystem in /dev/nvme0n1p9 Mar 25 01:30:39.140960 polkitd[1990]: Started polkitd version 121 Mar 25 01:30:39.135406 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 25 01:30:39.145309 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 25 01:30:39.167382 systemd[1]: Starting sshkeys.service... Mar 25 01:30:39.205937 polkitd[1990]: Loading rules from directory /etc/polkit-1/rules.d Mar 25 01:30:39.206065 polkitd[1990]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 25 01:30:39.215430 systemd-networkd[1793]: eth0: Gained IPv6LL Mar 25 01:30:39.232387 polkitd[1990]: Finished loading, compiling and executing 2 rules Mar 25 01:30:39.242155 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 25 01:30:39.249091 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 25 01:30:39.252973 dbus-daemon[1919]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Mar 25 01:30:39.253664 systemd[1]: Reached target network-online.target - Network is Online. Mar 25 01:30:39.258408 polkitd[1990]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Mar 25 01:30:39.265113 systemd[1]: Started amazon-ssm-agent.service - amazon-ssm-agent. Mar 25 01:30:39.273259 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 25 01:30:39.280872 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:30:39.292870 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 25 01:30:39.302061 systemd[1]: Started polkit.service - Authorization Manager. Mar 25 01:30:39.346217 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 39 scanned by (udev-worker) (1728) Mar 25 01:30:39.392864 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 25 01:30:39.397794 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
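The coreos-metadata entries above follow the same IMDSv2 pattern Ignition used earlier in the boot: a PUT to /latest/api/token to obtain a session token, then authenticated GETs against the 2021-01-03 metadata tree (instance-id, instance-type, addresses, placement, hostnames, and the instance identity document). A minimal sketch of that exchange, assuming the standard IMDSv2 request headers:

    import urllib.request

    IMDS = "http://169.254.169.254"

    def imds_get(path: str) -> str:
        # Obtain an IMDSv2 session token (the PUT recorded in the log above)...
        token_req = urllib.request.Request(
            f"{IMDS}/latest/api/token",
            method="PUT",
            headers={"X-aws-ec2-metadata-token-ttl-seconds": "21600"},
        )
        token = urllib.request.urlopen(token_req, timeout=2).read().decode()
        # ...then fetch a metadata path with the token attached.
        req = urllib.request.Request(
            f"{IMDS}/{path}",
            headers={"X-aws-ec2-metadata-token": token},
        )
        return urllib.request.urlopen(req, timeout=2).read().decode()

    if __name__ == "__main__":
        print(imds_get("2021-01-03/meta-data/instance-id"))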
Mar 25 01:30:39.499561 systemd-hostnamed[1953]: Hostname set to (transient) Mar 25 01:30:39.502264 systemd-resolved[1798]: System hostname changed to 'ip-172-31-28-242'. Mar 25 01:30:39.528263 coreos-metadata[2013]: Mar 25 01:30:39.527 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Mar 25 01:30:39.536071 coreos-metadata[2013]: Mar 25 01:30:39.529 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys: Attempt #1 Mar 25 01:30:39.536071 coreos-metadata[2013]: Mar 25 01:30:39.533 INFO Fetch successful Mar 25 01:30:39.536071 coreos-metadata[2013]: Mar 25 01:30:39.533 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 25 01:30:39.536071 coreos-metadata[2013]: Mar 25 01:30:39.535 INFO Fetch successful Mar 25 01:30:39.568804 unknown[2013]: wrote ssh authorized keys file for user: core Mar 25 01:30:39.682268 update-ssh-keys[2087]: Updated "/home/core/.ssh/authorized_keys" Mar 25 01:30:39.726780 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 25 01:30:39.733481 amazon-ssm-agent[2012]: Initializing new seelog logger Mar 25 01:30:39.742621 amazon-ssm-agent[2012]: New Seelog Logger Creation Complete Mar 25 01:30:39.745960 amazon-ssm-agent[2012]: 2025/03/25 01:30:39 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Mar 25 01:30:39.745960 amazon-ssm-agent[2012]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Mar 25 01:30:39.746332 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 25 01:30:39.758310 amazon-ssm-agent[2012]: 2025/03/25 01:30:39 processing appconfig overrides Mar 25 01:30:39.761478 amazon-ssm-agent[2012]: 2025-03-25 01:30:39 INFO Proxy environment variables: Mar 25 01:30:39.761478 amazon-ssm-agent[2012]: 2025/03/25 01:30:39 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Mar 25 01:30:39.761478 amazon-ssm-agent[2012]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Mar 25 01:30:39.761478 amazon-ssm-agent[2012]: 2025/03/25 01:30:39 processing appconfig overrides Mar 25 01:30:39.765332 amazon-ssm-agent[2012]: 2025/03/25 01:30:39 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Mar 25 01:30:39.765332 amazon-ssm-agent[2012]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Mar 25 01:30:39.765096 systemd[1]: Finished sshkeys.service. Mar 25 01:30:39.768792 amazon-ssm-agent[2012]: 2025/03/25 01:30:39 processing appconfig overrides Mar 25 01:30:39.781617 amazon-ssm-agent[2012]: 2025/03/25 01:30:39 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Mar 25 01:30:39.781617 amazon-ssm-agent[2012]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. 
Mar 25 01:30:39.781617 amazon-ssm-agent[2012]: 2025/03/25 01:30:39 processing appconfig overrides Mar 25 01:30:39.860019 amazon-ssm-agent[2012]: 2025-03-25 01:30:39 INFO https_proxy: Mar 25 01:30:39.960309 amazon-ssm-agent[2012]: 2025-03-25 01:30:39 INFO http_proxy: Mar 25 01:30:40.061248 amazon-ssm-agent[2012]: 2025-03-25 01:30:39 INFO no_proxy: Mar 25 01:30:40.125109 locksmithd[1983]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 25 01:30:40.160197 amazon-ssm-agent[2012]: 2025-03-25 01:30:39 INFO Checking if agent identity type OnPrem can be assumed Mar 25 01:30:40.252721 sshd_keygen[1940]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 25 01:30:40.259024 amazon-ssm-agent[2012]: 2025-03-25 01:30:39 INFO Checking if agent identity type EC2 can be assumed Mar 25 01:30:40.359098 amazon-ssm-agent[2012]: 2025-03-25 01:30:40 INFO Agent will take identity from EC2 Mar 25 01:30:40.381595 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 25 01:30:40.391507 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 25 01:30:40.402352 systemd[1]: Started sshd@0-172.31.28.242:22-147.75.109.163:43468.service - OpenSSH per-connection server daemon (147.75.109.163:43468). Mar 25 01:30:40.448265 containerd[1946]: time="2025-03-25T01:30:40Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 25 01:30:40.453238 containerd[1946]: time="2025-03-25T01:30:40.452347934Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 Mar 25 01:30:40.460629 amazon-ssm-agent[2012]: 2025-03-25 01:30:40 INFO [amazon-ssm-agent] using named pipe channel for IPC Mar 25 01:30:40.474823 systemd[1]: issuegen.service: Deactivated successfully. Mar 25 01:30:40.477633 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 25 01:30:40.490392 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
Mar 25 01:30:40.512230 containerd[1946]: time="2025-03-25T01:30:40.512113190Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="13.404µs" Mar 25 01:30:40.512454 containerd[1946]: time="2025-03-25T01:30:40.512414870Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 25 01:30:40.513531 containerd[1946]: time="2025-03-25T01:30:40.512543978Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 25 01:30:40.513531 containerd[1946]: time="2025-03-25T01:30:40.512930078Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 25 01:30:40.513531 containerd[1946]: time="2025-03-25T01:30:40.512978090Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 25 01:30:40.513531 containerd[1946]: time="2025-03-25T01:30:40.513073562Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 25 01:30:40.519030 containerd[1946]: time="2025-03-25T01:30:40.517276874Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 25 01:30:40.519030 containerd[1946]: time="2025-03-25T01:30:40.517335398Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 25 01:30:40.519030 containerd[1946]: time="2025-03-25T01:30:40.517836518Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 25 01:30:40.519030 containerd[1946]: time="2025-03-25T01:30:40.517886510Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 25 01:30:40.519030 containerd[1946]: time="2025-03-25T01:30:40.517918406Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 25 01:30:40.519030 containerd[1946]: time="2025-03-25T01:30:40.517944362Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 25 01:30:40.519030 containerd[1946]: time="2025-03-25T01:30:40.518150402Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 25 01:30:40.519030 containerd[1946]: time="2025-03-25T01:30:40.518603330Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 25 01:30:40.519030 containerd[1946]: time="2025-03-25T01:30:40.518673530Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 25 01:30:40.519030 containerd[1946]: time="2025-03-25T01:30:40.518698622Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 25 01:30:40.521296 containerd[1946]: time="2025-03-25T01:30:40.520979750Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 25 01:30:40.522684 containerd[1946]: 
time="2025-03-25T01:30:40.522631958Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 25 01:30:40.522995 containerd[1946]: time="2025-03-25T01:30:40.522958298Z" level=info msg="metadata content store policy set" policy=shared Mar 25 01:30:40.532361 containerd[1946]: time="2025-03-25T01:30:40.530013158Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 25 01:30:40.532361 containerd[1946]: time="2025-03-25T01:30:40.530373002Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 25 01:30:40.532361 containerd[1946]: time="2025-03-25T01:30:40.530418830Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 25 01:30:40.532361 containerd[1946]: time="2025-03-25T01:30:40.530449994Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 25 01:30:40.532361 containerd[1946]: time="2025-03-25T01:30:40.530479034Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 25 01:30:40.532361 containerd[1946]: time="2025-03-25T01:30:40.530507090Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 25 01:30:40.532361 containerd[1946]: time="2025-03-25T01:30:40.530537222Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 25 01:30:40.532361 containerd[1946]: time="2025-03-25T01:30:40.530567906Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 25 01:30:40.532361 containerd[1946]: time="2025-03-25T01:30:40.530596778Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 25 01:30:40.532361 containerd[1946]: time="2025-03-25T01:30:40.530624270Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 25 01:30:40.532361 containerd[1946]: time="2025-03-25T01:30:40.530651234Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 25 01:30:40.532361 containerd[1946]: time="2025-03-25T01:30:40.530680202Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 25 01:30:40.532361 containerd[1946]: time="2025-03-25T01:30:40.530913878Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 25 01:30:40.532361 containerd[1946]: time="2025-03-25T01:30:40.530956118Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 25 01:30:40.533030 containerd[1946]: time="2025-03-25T01:30:40.531003122Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 25 01:30:40.533030 containerd[1946]: time="2025-03-25T01:30:40.531039410Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 25 01:30:40.533030 containerd[1946]: time="2025-03-25T01:30:40.531070802Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 25 01:30:40.533030 containerd[1946]: time="2025-03-25T01:30:40.531098954Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 25 01:30:40.533030 containerd[1946]: 
time="2025-03-25T01:30:40.531126782Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 25 01:30:40.533030 containerd[1946]: time="2025-03-25T01:30:40.531154466Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 25 01:30:40.533030 containerd[1946]: time="2025-03-25T01:30:40.531213182Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 25 01:30:40.533030 containerd[1946]: time="2025-03-25T01:30:40.531245930Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 25 01:30:40.533030 containerd[1946]: time="2025-03-25T01:30:40.531275114Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 25 01:30:40.533030 containerd[1946]: time="2025-03-25T01:30:40.531476354Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 25 01:30:40.533030 containerd[1946]: time="2025-03-25T01:30:40.531509390Z" level=info msg="Start snapshots syncer" Mar 25 01:30:40.541436 containerd[1946]: time="2025-03-25T01:30:40.535595258Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 25 01:30:40.541436 containerd[1946]: time="2025-03-25T01:30:40.539478026Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 25 01:30:40.541779 containerd[1946]: time="2025-03-25T01:30:40.539585006Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 25 01:30:40.541779 containerd[1946]: time="2025-03-25T01:30:40.539730698Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 
Mar 25 01:30:40.541779 containerd[1946]: time="2025-03-25T01:30:40.539982818Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 25 01:30:40.541779 containerd[1946]: time="2025-03-25T01:30:40.540028454Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 25 01:30:40.541779 containerd[1946]: time="2025-03-25T01:30:40.540058478Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 25 01:30:40.541779 containerd[1946]: time="2025-03-25T01:30:40.540085478Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 25 01:30:40.541779 containerd[1946]: time="2025-03-25T01:30:40.540118766Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 25 01:30:40.541779 containerd[1946]: time="2025-03-25T01:30:40.540146366Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 25 01:30:40.541779 containerd[1946]: time="2025-03-25T01:30:40.540222242Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 25 01:30:40.541779 containerd[1946]: time="2025-03-25T01:30:40.540287954Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 25 01:30:40.541779 containerd[1946]: time="2025-03-25T01:30:40.540323570Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 25 01:30:40.541779 containerd[1946]: time="2025-03-25T01:30:40.540351158Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 25 01:30:40.541779 containerd[1946]: time="2025-03-25T01:30:40.540418202Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:30:40.541779 containerd[1946]: time="2025-03-25T01:30:40.540452834Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:30:40.542418 containerd[1946]: time="2025-03-25T01:30:40.540477002Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:30:40.542418 containerd[1946]: time="2025-03-25T01:30:40.540502490Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:30:40.542418 containerd[1946]: time="2025-03-25T01:30:40.540524978Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 25 01:30:40.542418 containerd[1946]: time="2025-03-25T01:30:40.540567494Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 25 01:30:40.542418 containerd[1946]: time="2025-03-25T01:30:40.540614774Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 25 01:30:40.542418 containerd[1946]: time="2025-03-25T01:30:40.540671102Z" level=info msg="runtime interface created" Mar 25 01:30:40.542418 containerd[1946]: time="2025-03-25T01:30:40.540686942Z" level=info msg="created NRI interface" Mar 25 01:30:40.542418 containerd[1946]: time="2025-03-25T01:30:40.540707642Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 
Mar 25 01:30:40.542418 containerd[1946]: time="2025-03-25T01:30:40.540737474Z" level=info msg="Connect containerd service" Mar 25 01:30:40.542418 containerd[1946]: time="2025-03-25T01:30:40.540794810Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 25 01:30:40.547111 containerd[1946]: time="2025-03-25T01:30:40.547042586Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 25 01:30:40.566351 amazon-ssm-agent[2012]: 2025-03-25 01:30:40 INFO [amazon-ssm-agent] using named pipe channel for IPC Mar 25 01:30:40.573371 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 25 01:30:40.582521 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 25 01:30:40.588457 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 25 01:30:40.590899 systemd[1]: Reached target getty.target - Login Prompts. Mar 25 01:30:40.663021 amazon-ssm-agent[2012]: 2025-03-25 01:30:40 INFO [amazon-ssm-agent] using named pipe channel for IPC Mar 25 01:30:40.762983 amazon-ssm-agent[2012]: 2025-03-25 01:30:40 INFO [amazon-ssm-agent] amazon-ssm-agent - v3.2.0.0 Mar 25 01:30:40.862368 amazon-ssm-agent[2012]: 2025-03-25 01:30:40 INFO [amazon-ssm-agent] OS: linux, Arch: arm64 Mar 25 01:30:40.954453 tar[1935]: linux-arm64/README.md Mar 25 01:30:40.964299 amazon-ssm-agent[2012]: 2025-03-25 01:30:40 INFO [amazon-ssm-agent] Starting Core Agent Mar 25 01:30:40.992488 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 25 01:30:41.064546 amazon-ssm-agent[2012]: 2025-03-25 01:30:40 INFO [amazon-ssm-agent] registrar detected. Attempting registration Mar 25 01:30:41.166145 amazon-ssm-agent[2012]: 2025-03-25 01:30:40 INFO [Registrar] Starting registrar module Mar 25 01:30:41.267256 amazon-ssm-agent[2012]: 2025-03-25 01:30:40 INFO [EC2Identity] no registration info found for ec2 instance, attempting registration Mar 25 01:30:41.603564 amazon-ssm-agent[2012]: 2025-03-25 01:30:41 INFO [EC2Identity] EC2 registration was successful. Mar 25 01:30:41.634936 amazon-ssm-agent[2012]: 2025-03-25 01:30:41 INFO [CredentialRefresher] credentialRefresher has started Mar 25 01:30:41.634936 amazon-ssm-agent[2012]: 2025-03-25 01:30:41 INFO [CredentialRefresher] Starting credentials refresher loop Mar 25 01:30:41.635149 amazon-ssm-agent[2012]: 2025-03-25 01:30:41 INFO EC2RoleProvider Successfully connected with instance profile role credentials Mar 25 01:30:41.703571 amazon-ssm-agent[2012]: 2025-03-25 01:30:41 INFO [CredentialRefresher] Next credential rotation will be in 31.916658228666666 minutes Mar 25 01:30:41.798346 ntpd[1923]: Listen normally on 6 eth0 [fe80::4d7:30ff:fe28:e513%2]:123 Mar 25 01:30:41.798989 ntpd[1923]: 25 Mar 01:30:41 ntpd[1923]: Listen normally on 6 eth0 [fe80::4d7:30ff:fe28:e513%2]:123 Mar 25 01:30:42.129533 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 25 01:30:42.144819 (kubelet)[2177]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:30:42.348924 sshd[2150]: Accepted publickey for core from 147.75.109.163 port 43468 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:30:42.355766 sshd-session[2150]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:30:42.387651 systemd-logind[1929]: New session 1 of user core. Mar 25 01:30:42.387681 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 25 01:30:42.394745 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 25 01:30:42.440482 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 25 01:30:42.449655 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 25 01:30:42.483280 (systemd)[2187]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 25 01:30:42.495604 systemd-logind[1929]: New session c1 of user core. Mar 25 01:30:42.519265 containerd[1946]: time="2025-03-25T01:30:42.516249880Z" level=info msg="Start subscribing containerd event" Mar 25 01:30:42.519265 containerd[1946]: time="2025-03-25T01:30:42.516318772Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 25 01:30:42.519265 containerd[1946]: time="2025-03-25T01:30:42.516395956Z" level=info msg="Start recovering state" Mar 25 01:30:42.519265 containerd[1946]: time="2025-03-25T01:30:42.516415792Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 25 01:30:42.522032 containerd[1946]: time="2025-03-25T01:30:42.520387888Z" level=info msg="Start event monitor" Mar 25 01:30:42.522032 containerd[1946]: time="2025-03-25T01:30:42.521159440Z" level=info msg="Start cni network conf syncer for default" Mar 25 01:30:42.522032 containerd[1946]: time="2025-03-25T01:30:42.521388496Z" level=info msg="Start streaming server" Mar 25 01:30:42.522032 containerd[1946]: time="2025-03-25T01:30:42.521454364Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 25 01:30:42.522032 containerd[1946]: time="2025-03-25T01:30:42.521479276Z" level=info msg="runtime interface starting up..." Mar 25 01:30:42.522032 containerd[1946]: time="2025-03-25T01:30:42.521498596Z" level=info msg="starting plugins..." Mar 25 01:30:42.522032 containerd[1946]: time="2025-03-25T01:30:42.521590216Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 25 01:30:42.522567 systemd[1]: Started containerd.service - containerd container runtime. Mar 25 01:30:42.530969 containerd[1946]: time="2025-03-25T01:30:42.522461308Z" level=info msg="containerd successfully booted in 2.077608s" Mar 25 01:30:42.525284 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 25 01:30:42.666099 amazon-ssm-agent[2012]: 2025-03-25 01:30:42 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker is not running, starting worker process Mar 25 01:30:42.769203 amazon-ssm-agent[2012]: 2025-03-25 01:30:42 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker (pid:2196) started Mar 25 01:30:42.838743 systemd[2187]: Queued start job for default target default.target. Mar 25 01:30:42.847382 systemd[2187]: Created slice app.slice - User Application Slice. Mar 25 01:30:42.847459 systemd[2187]: Reached target paths.target - Paths. 
Mar 25 01:30:42.847559 systemd[2187]: Reached target timers.target - Timers. Mar 25 01:30:42.851786 systemd[2187]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 25 01:30:42.964099 amazon-ssm-agent[2012]: 2025-03-25 01:30:42 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] Monitor long running worker health every 60 seconds Mar 25 01:30:42.982811 systemd[2187]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 25 01:30:42.983052 systemd[2187]: Reached target sockets.target - Sockets. Mar 25 01:30:42.983137 systemd[2187]: Reached target basic.target - Basic System. Mar 25 01:30:42.983633 systemd[2187]: Reached target default.target - Main User Target. Mar 25 01:30:42.983710 systemd[2187]: Startup finished in 448ms. Mar 25 01:30:42.984216 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 25 01:30:42.992524 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 25 01:30:42.997670 systemd[1]: Startup finished in 1.110s (kernel) + 9.670s (initrd) + 11.592s (userspace) = 22.373s. Mar 25 01:30:43.163487 systemd[1]: Started sshd@1-172.31.28.242:22-147.75.109.163:43258.service - OpenSSH per-connection server daemon (147.75.109.163:43258). Mar 25 01:30:43.371412 sshd[2216]: Accepted publickey for core from 147.75.109.163 port 43258 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:30:43.376369 sshd-session[2216]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:30:43.381454 kubelet[2177]: E0325 01:30:43.381220 2177 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:30:43.386353 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:30:43.386668 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:30:43.387118 systemd-logind[1929]: New session 2 of user core. Mar 25 01:30:43.387209 systemd[1]: kubelet.service: Consumed 1.364s CPU time, 249.1M memory peak. Mar 25 01:30:43.400571 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 25 01:30:43.524891 sshd[2219]: Connection closed by 147.75.109.163 port 43258 Mar 25 01:30:43.525735 sshd-session[2216]: pam_unix(sshd:session): session closed for user core Mar 25 01:30:43.532338 systemd[1]: sshd@1-172.31.28.242:22-147.75.109.163:43258.service: Deactivated successfully. Mar 25 01:30:43.535585 systemd[1]: session-2.scope: Deactivated successfully. Mar 25 01:30:43.537276 systemd-logind[1929]: Session 2 logged out. Waiting for processes to exit. Mar 25 01:30:43.539372 systemd-logind[1929]: Removed session 2. Mar 25 01:30:43.563921 systemd[1]: Started sshd@2-172.31.28.242:22-147.75.109.163:43272.service - OpenSSH per-connection server daemon (147.75.109.163:43272). Mar 25 01:30:43.763396 sshd[2225]: Accepted publickey for core from 147.75.109.163 port 43272 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:30:43.766840 sshd-session[2225]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:30:43.778354 systemd-logind[1929]: New session 3 of user core. Mar 25 01:30:43.787516 systemd[1]: Started session-3.scope - Session 3 of User core. 
Mar 25 01:30:43.907269 sshd[2227]: Connection closed by 147.75.109.163 port 43272 Mar 25 01:30:43.908137 sshd-session[2225]: pam_unix(sshd:session): session closed for user core Mar 25 01:30:43.915527 systemd-logind[1929]: Session 3 logged out. Waiting for processes to exit. Mar 25 01:30:43.917397 systemd[1]: sshd@2-172.31.28.242:22-147.75.109.163:43272.service: Deactivated successfully. Mar 25 01:30:43.921644 systemd[1]: session-3.scope: Deactivated successfully. Mar 25 01:30:43.924489 systemd-logind[1929]: Removed session 3. Mar 25 01:30:43.945958 systemd[1]: Started sshd@3-172.31.28.242:22-147.75.109.163:43274.service - OpenSSH per-connection server daemon (147.75.109.163:43274). Mar 25 01:30:44.141723 sshd[2233]: Accepted publickey for core from 147.75.109.163 port 43274 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:30:44.144301 sshd-session[2233]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:30:44.153759 systemd-logind[1929]: New session 4 of user core. Mar 25 01:30:44.163483 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 25 01:30:44.288337 sshd[2235]: Connection closed by 147.75.109.163 port 43274 Mar 25 01:30:44.288832 sshd-session[2233]: pam_unix(sshd:session): session closed for user core Mar 25 01:30:44.295366 systemd[1]: sshd@3-172.31.28.242:22-147.75.109.163:43274.service: Deactivated successfully. Mar 25 01:30:44.298903 systemd[1]: session-4.scope: Deactivated successfully. Mar 25 01:30:44.302435 systemd-logind[1929]: Session 4 logged out. Waiting for processes to exit. Mar 25 01:30:44.304402 systemd-logind[1929]: Removed session 4. Mar 25 01:30:44.323743 systemd[1]: Started sshd@4-172.31.28.242:22-147.75.109.163:43284.service - OpenSSH per-connection server daemon (147.75.109.163:43284). Mar 25 01:30:44.522799 sshd[2241]: Accepted publickey for core from 147.75.109.163 port 43284 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:30:44.524683 sshd-session[2241]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:30:44.533693 systemd-logind[1929]: New session 5 of user core. Mar 25 01:30:44.543463 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 25 01:30:44.955388 sudo[2244]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 25 01:30:44.956063 sudo[2244]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:30:44.971943 sudo[2244]: pam_unix(sudo:session): session closed for user root Mar 25 01:30:44.996219 sshd[2243]: Connection closed by 147.75.109.163 port 43284 Mar 25 01:30:44.995520 sshd-session[2241]: pam_unix(sshd:session): session closed for user core Mar 25 01:30:45.001475 systemd[1]: sshd@4-172.31.28.242:22-147.75.109.163:43284.service: Deactivated successfully. Mar 25 01:30:45.005130 systemd[1]: session-5.scope: Deactivated successfully. Mar 25 01:30:45.008581 systemd-logind[1929]: Session 5 logged out. Waiting for processes to exit. Mar 25 01:30:45.010765 systemd-logind[1929]: Removed session 5. Mar 25 01:30:45.032486 systemd[1]: Started sshd@5-172.31.28.242:22-147.75.109.163:43300.service - OpenSSH per-connection server daemon (147.75.109.163:43300). 
Mar 25 01:30:45.233614 sshd[2250]: Accepted publickey for core from 147.75.109.163 port 43300 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:30:45.236740 sshd-session[2250]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:30:45.246704 systemd-logind[1929]: New session 6 of user core. Mar 25 01:30:45.253484 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 25 01:30:45.358877 sudo[2254]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 25 01:30:45.359896 sudo[2254]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:30:45.367125 sudo[2254]: pam_unix(sudo:session): session closed for user root Mar 25 01:30:45.377613 sudo[2253]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 25 01:30:45.378752 sudo[2253]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:30:45.395668 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 25 01:30:45.457663 augenrules[2276]: No rules Mar 25 01:30:45.458985 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:30:45.459538 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 25 01:30:45.461496 sudo[2253]: pam_unix(sudo:session): session closed for user root Mar 25 01:30:45.484841 sshd[2252]: Connection closed by 147.75.109.163 port 43300 Mar 25 01:30:45.485654 sshd-session[2250]: pam_unix(sshd:session): session closed for user core Mar 25 01:30:45.492824 systemd[1]: sshd@5-172.31.28.242:22-147.75.109.163:43300.service: Deactivated successfully. Mar 25 01:30:45.496749 systemd[1]: session-6.scope: Deactivated successfully. Mar 25 01:30:45.498405 systemd-logind[1929]: Session 6 logged out. Waiting for processes to exit. Mar 25 01:30:45.500050 systemd-logind[1929]: Removed session 6. Mar 25 01:30:45.523517 systemd[1]: Started sshd@6-172.31.28.242:22-147.75.109.163:43302.service - OpenSSH per-connection server daemon (147.75.109.163:43302). Mar 25 01:30:45.719573 sshd[2285]: Accepted publickey for core from 147.75.109.163 port 43302 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:30:45.722436 sshd-session[2285]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:30:45.732351 systemd-logind[1929]: New session 7 of user core. Mar 25 01:30:45.735477 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 25 01:30:45.839710 sudo[2288]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 25 01:30:45.840392 sudo[2288]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:30:48.378556 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 25 01:30:48.395809 (dockerd)[2306]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 25 01:30:50.099348 dockerd[2306]: time="2025-03-25T01:30:50.099249013Z" level=info msg="Starting up" Mar 25 01:30:50.100677 dockerd[2306]: time="2025-03-25T01:30:50.100563963Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 25 01:30:50.176632 dockerd[2306]: time="2025-03-25T01:30:50.176569293Z" level=info msg="Loading containers: start." 
Mar 25 01:30:50.575538 kernel: Initializing XFRM netlink socket Mar 25 01:30:50.578206 (udev-worker)[2330]: Network interface NamePolicy= disabled on kernel command line. Mar 25 01:30:50.950097 systemd-networkd[1793]: docker0: Link UP Mar 25 01:30:51.032521 dockerd[2306]: time="2025-03-25T01:30:51.032429472Z" level=info msg="Loading containers: done." Mar 25 01:30:51.062479 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3069775016-merged.mount: Deactivated successfully. Mar 25 01:30:51.066200 dockerd[2306]: time="2025-03-25T01:30:51.066101331Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 25 01:30:51.066355 dockerd[2306]: time="2025-03-25T01:30:51.066266333Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 Mar 25 01:30:51.066550 dockerd[2306]: time="2025-03-25T01:30:51.066503011Z" level=info msg="Daemon has completed initialization" Mar 25 01:30:51.113437 dockerd[2306]: time="2025-03-25T01:30:51.113280270Z" level=info msg="API listen on /run/docker.sock" Mar 25 01:30:51.114098 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 25 01:30:52.084596 containerd[1946]: time="2025-03-25T01:30:52.084520115Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.3\"" Mar 25 01:30:52.817003 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1238610870.mount: Deactivated successfully. Mar 25 01:30:53.637266 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 25 01:30:53.640020 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:30:53.997707 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:30:54.011947 (kubelet)[2564]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:30:54.094029 kubelet[2564]: E0325 01:30:54.093938 2564 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:30:54.100906 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:30:54.101308 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:30:54.103312 systemd[1]: kubelet.service: Consumed 337ms CPU time, 101.6M memory peak. 
Mar 25 01:30:55.128498 containerd[1946]: time="2025-03-25T01:30:55.128394469Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:30:55.131046 containerd[1946]: time="2025-03-25T01:30:55.130972363Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.3: active requests=0, bytes read=26231950" Mar 25 01:30:55.132920 containerd[1946]: time="2025-03-25T01:30:55.132843499Z" level=info msg="ImageCreate event name:\"sha256:25dd33975ea35cef2fa9b105778dbe3369de267e9ddf81427b7b82e98ff374e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:30:55.137700 containerd[1946]: time="2025-03-25T01:30:55.137595042Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:279e45cf07e4f56925c3c5237179eb63616788426a96e94df5fedf728b18926e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:30:55.140293 containerd[1946]: time="2025-03-25T01:30:55.139587522Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.3\" with image id \"sha256:25dd33975ea35cef2fa9b105778dbe3369de267e9ddf81427b7b82e98ff374e5\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:279e45cf07e4f56925c3c5237179eb63616788426a96e94df5fedf728b18926e\", size \"26228750\" in 3.05500637s" Mar 25 01:30:55.140293 containerd[1946]: time="2025-03-25T01:30:55.139641818Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.3\" returns image reference \"sha256:25dd33975ea35cef2fa9b105778dbe3369de267e9ddf81427b7b82e98ff374e5\"" Mar 25 01:30:55.140777 containerd[1946]: time="2025-03-25T01:30:55.140711479Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.3\"" Mar 25 01:30:57.391803 containerd[1946]: time="2025-03-25T01:30:57.391483865Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:30:57.392965 containerd[1946]: time="2025-03-25T01:30:57.392782827Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.3: active requests=0, bytes read=22530032" Mar 25 01:30:57.396240 containerd[1946]: time="2025-03-25T01:30:57.394493687Z" level=info msg="ImageCreate event name:\"sha256:9e29b4db8c5cdf9970961ed3a47137ea71ad067643b8e5cccb58085f22a9b315\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:30:57.404118 containerd[1946]: time="2025-03-25T01:30:57.404038440Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:54456a96a1bbdc35dcc2e70fcc1355bf655af67694e40b650ac12e83521f6411\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:30:57.406418 containerd[1946]: time="2025-03-25T01:30:57.406351267Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.3\" with image id \"sha256:9e29b4db8c5cdf9970961ed3a47137ea71ad067643b8e5cccb58085f22a9b315\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:54456a96a1bbdc35dcc2e70fcc1355bf655af67694e40b650ac12e83521f6411\", size \"23970828\" in 2.265431993s" Mar 25 01:30:57.406552 containerd[1946]: time="2025-03-25T01:30:57.406415184Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.3\" returns image reference \"sha256:9e29b4db8c5cdf9970961ed3a47137ea71ad067643b8e5cccb58085f22a9b315\"" Mar 25 01:30:57.407002 
containerd[1946]: time="2025-03-25T01:30:57.406936119Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.3\"" Mar 25 01:30:59.155951 containerd[1946]: time="2025-03-25T01:30:59.155541156Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:30:59.157590 containerd[1946]: time="2025-03-25T01:30:59.157426793Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.3: active requests=0, bytes read=17482561" Mar 25 01:30:59.158352 containerd[1946]: time="2025-03-25T01:30:59.158253587Z" level=info msg="ImageCreate event name:\"sha256:6b8dfebcc65dc9d4765a91d2923c304e13beca7111c57dfc99f1c3267a6e9f30\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:30:59.163960 containerd[1946]: time="2025-03-25T01:30:59.163872992Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:aafae2e3a8d65bc6dc3a0c6095c24bc72b1ff608e1417f0f5e860ce4a61c27df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:30:59.165915 containerd[1946]: time="2025-03-25T01:30:59.165732998Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.3\" with image id \"sha256:6b8dfebcc65dc9d4765a91d2923c304e13beca7111c57dfc99f1c3267a6e9f30\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:aafae2e3a8d65bc6dc3a0c6095c24bc72b1ff608e1417f0f5e860ce4a61c27df\", size \"18923375\" in 1.758738804s" Mar 25 01:30:59.165915 containerd[1946]: time="2025-03-25T01:30:59.165788722Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.3\" returns image reference \"sha256:6b8dfebcc65dc9d4765a91d2923c304e13beca7111c57dfc99f1c3267a6e9f30\"" Mar 25 01:30:59.166897 containerd[1946]: time="2025-03-25T01:30:59.166787174Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.3\"" Mar 25 01:31:00.786619 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4259359132.mount: Deactivated successfully. 
Mar 25 01:31:01.341749 containerd[1946]: time="2025-03-25T01:31:01.341689786Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:01.343549 containerd[1946]: time="2025-03-25T01:31:01.343434373Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.3: active requests=0, bytes read=27370095" Mar 25 01:31:01.345951 containerd[1946]: time="2025-03-25T01:31:01.345872417Z" level=info msg="ImageCreate event name:\"sha256:2a637602f3e88e76046aa1a75bccdb37b25b2fcba99a380412e2c27ccd55c547\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:01.350213 containerd[1946]: time="2025-03-25T01:31:01.350107439Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:5015269547a0b7dd2c062758e9a64467b58978ff2502cad4c3f5cdf4aa554ad3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:01.351577 containerd[1946]: time="2025-03-25T01:31:01.351337291Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.3\" with image id \"sha256:2a637602f3e88e76046aa1a75bccdb37b25b2fcba99a380412e2c27ccd55c547\", repo tag \"registry.k8s.io/kube-proxy:v1.32.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:5015269547a0b7dd2c062758e9a64467b58978ff2502cad4c3f5cdf4aa554ad3\", size \"27369114\" in 2.184461457s" Mar 25 01:31:01.351577 containerd[1946]: time="2025-03-25T01:31:01.351403103Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.3\" returns image reference \"sha256:2a637602f3e88e76046aa1a75bccdb37b25b2fcba99a380412e2c27ccd55c547\"" Mar 25 01:31:01.352615 containerd[1946]: time="2025-03-25T01:31:01.352576164Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Mar 25 01:31:02.109798 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount434445937.mount: Deactivated successfully. 
Mar 25 01:31:03.556078 containerd[1946]: time="2025-03-25T01:31:03.555809574Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:03.557264 containerd[1946]: time="2025-03-25T01:31:03.557136243Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951622" Mar 25 01:31:03.558518 containerd[1946]: time="2025-03-25T01:31:03.558428081Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:03.563385 containerd[1946]: time="2025-03-25T01:31:03.563332523Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:03.566438 containerd[1946]: time="2025-03-25T01:31:03.565896085Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 2.212718098s" Mar 25 01:31:03.566438 containerd[1946]: time="2025-03-25T01:31:03.565967689Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Mar 25 01:31:03.567238 containerd[1946]: time="2025-03-25T01:31:03.567165134Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Mar 25 01:31:04.051357 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2845495155.mount: Deactivated successfully. 
Mar 25 01:31:04.060719 containerd[1946]: time="2025-03-25T01:31:04.060630019Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:31:04.062167 containerd[1946]: time="2025-03-25T01:31:04.062073569Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703" Mar 25 01:31:04.063606 containerd[1946]: time="2025-03-25T01:31:04.063509839Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:31:04.068262 containerd[1946]: time="2025-03-25T01:31:04.068118689Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:31:04.070058 containerd[1946]: time="2025-03-25T01:31:04.069806701Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 502.425459ms" Mar 25 01:31:04.070058 containerd[1946]: time="2025-03-25T01:31:04.069877897Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Mar 25 01:31:04.071844 containerd[1946]: time="2025-03-25T01:31:04.071464428Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Mar 25 01:31:04.183890 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 25 01:31:04.186892 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:31:04.518726 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:31:04.531930 (kubelet)[2656]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:31:04.625906 kubelet[2656]: E0325 01:31:04.625844 2656 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:31:04.632036 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:31:04.632650 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:31:04.634486 systemd[1]: kubelet.service: Consumed 322ms CPU time, 104.3M memory peak. Mar 25 01:31:04.795039 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1361353241.mount: Deactivated successfully. 
Mar 25 01:31:08.890088 containerd[1946]: time="2025-03-25T01:31:08.889994272Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:08.892279 containerd[1946]: time="2025-03-25T01:31:08.892156274Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67812429" Mar 25 01:31:08.894626 containerd[1946]: time="2025-03-25T01:31:08.894542864Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:08.903035 containerd[1946]: time="2025-03-25T01:31:08.902931636Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:08.905296 containerd[1946]: time="2025-03-25T01:31:08.905080181Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 4.83354686s" Mar 25 01:31:08.905296 containerd[1946]: time="2025-03-25T01:31:08.905138400Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" Mar 25 01:31:09.535137 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Mar 25 01:31:14.684408 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Mar 25 01:31:14.688501 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:31:15.288461 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:31:15.303679 (kubelet)[2748]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:31:15.383220 kubelet[2748]: E0325 01:31:15.382623 2748 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:31:15.387334 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:31:15.387661 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:31:15.388321 systemd[1]: kubelet.service: Consumed 309ms CPU time, 102M memory peak. Mar 25 01:31:16.666565 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:31:16.666931 systemd[1]: kubelet.service: Consumed 309ms CPU time, 102M memory peak. Mar 25 01:31:16.671450 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:31:16.722920 systemd[1]: Reload requested from client PID 2762 ('systemctl') (unit session-7.scope)... Mar 25 01:31:16.722945 systemd[1]: Reloading... Mar 25 01:31:17.020252 zram_generator::config[2810]: No configuration found. 
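[Editor's note, not part of the log: as a quick aid to reading the image-pull entries above, the snippet below computes approximate pull throughput from the "bytes read" counts and durations the log reports. The numbers are copied verbatim from the entries above; the calculation itself is illustrative only.]

```python
# Back-of-the-envelope throughput for the registry.k8s.io pulls recorded
# above: (bytes read, duration in seconds) taken directly from the log.
pulls = {
    "kube-apiserver:v1.32.3":          (26_231_950, 3.05500637),
    "kube-controller-manager:v1.32.3": (22_530_032, 2.265431993),
    "kube-scheduler:v1.32.3":          (17_482_561, 1.758738804),
    "kube-proxy:v1.32.3":              (27_370_095, 2.184461457),
    "coredns:v1.11.3":                 (16_951_622, 2.212718098),
    "pause:3.10":                      (268_703,    0.502425459),
    "etcd:3.5.16-0":                   (67_812_429, 4.83354686),
}

for image, (nbytes, seconds) in pulls.items():
    mib_per_s = nbytes / seconds / (1024 * 1024)
    print(f"{image}: ~{mib_per_s:.1f} MiB/s")
```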
Mar 25 01:31:17.262756 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:31:17.490160 systemd[1]: Reloading finished in 766 ms. Mar 25 01:31:17.600041 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:31:17.606832 systemd[1]: kubelet.service: Deactivated successfully. Mar 25 01:31:17.607384 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:31:17.607479 systemd[1]: kubelet.service: Consumed 252ms CPU time, 90.4M memory peak. Mar 25 01:31:17.611040 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:31:17.933456 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:31:17.952028 (kubelet)[2872]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:31:18.024132 kubelet[2872]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:31:18.024652 kubelet[2872]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 25 01:31:18.024824 kubelet[2872]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:31:18.025092 kubelet[2872]: I0325 01:31:18.025029 2872 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:31:18.795936 kubelet[2872]: I0325 01:31:18.795878 2872 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" Mar 25 01:31:18.798218 kubelet[2872]: I0325 01:31:18.796368 2872 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:31:18.798218 kubelet[2872]: I0325 01:31:18.797462 2872 server.go:954] "Client rotation is on, will bootstrap in background" Mar 25 01:31:18.846670 kubelet[2872]: E0325 01:31:18.846587 2872 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://172.31.28.242:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.31.28.242:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:31:18.849371 kubelet[2872]: I0325 01:31:18.849291 2872 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:31:18.864495 kubelet[2872]: I0325 01:31:18.864413 2872 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 25 01:31:18.870056 kubelet[2872]: I0325 01:31:18.870011 2872 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 25 01:31:18.870557 kubelet[2872]: I0325 01:31:18.870499 2872 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:31:18.870852 kubelet[2872]: I0325 01:31:18.870558 2872 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-28-242","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 25 01:31:18.871052 kubelet[2872]: I0325 01:31:18.870893 2872 topology_manager.go:138] "Creating topology manager with none policy" Mar 25 01:31:18.871052 kubelet[2872]: I0325 01:31:18.870916 2872 container_manager_linux.go:304] "Creating device plugin manager" Mar 25 01:31:18.871220 kubelet[2872]: I0325 01:31:18.871168 2872 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:31:18.877237 kubelet[2872]: I0325 01:31:18.877148 2872 kubelet.go:446] "Attempting to sync node with API server" Mar 25 01:31:18.877237 kubelet[2872]: I0325 01:31:18.877239 2872 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:31:18.877629 kubelet[2872]: I0325 01:31:18.877293 2872 kubelet.go:352] "Adding apiserver pod source" Mar 25 01:31:18.877629 kubelet[2872]: I0325 01:31:18.877327 2872 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:31:18.880211 kubelet[2872]: W0325 01:31:18.879386 2872 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.28.242:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-28-242&limit=500&resourceVersion=0": dial tcp 172.31.28.242:6443: connect: connection refused Mar 25 01:31:18.880211 kubelet[2872]: E0325 01:31:18.879484 2872 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.31.28.242:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-28-242&limit=500&resourceVersion=0\": dial tcp 172.31.28.242:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:31:18.881765 kubelet[2872]: W0325 
01:31:18.881681 2872 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.28.242:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.31.28.242:6443: connect: connection refused Mar 25 01:31:18.881889 kubelet[2872]: E0325 01:31:18.881787 2872 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.31.28.242:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.31.28.242:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:31:18.882581 kubelet[2872]: I0325 01:31:18.882515 2872 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:31:18.883556 kubelet[2872]: I0325 01:31:18.883503 2872 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 25 01:31:18.883677 kubelet[2872]: W0325 01:31:18.883646 2872 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 25 01:31:18.886070 kubelet[2872]: I0325 01:31:18.885617 2872 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 25 01:31:18.886070 kubelet[2872]: I0325 01:31:18.885680 2872 server.go:1287] "Started kubelet" Mar 25 01:31:18.898555 kubelet[2872]: E0325 01:31:18.896919 2872 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.31.28.242:6443/api/v1/namespaces/default/events\": dial tcp 172.31.28.242:6443: connect: connection refused" event="&Event{ObjectMeta:{ip-172-31-28-242.182fe7a583fc7770 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-28-242,UID:ip-172-31-28-242,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-28-242,},FirstTimestamp:2025-03-25 01:31:18.885652336 +0000 UTC m=+0.926590297,LastTimestamp:2025-03-25 01:31:18.885652336 +0000 UTC m=+0.926590297,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-28-242,}" Mar 25 01:31:18.899168 kubelet[2872]: I0325 01:31:18.898901 2872 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 01:31:18.903320 kubelet[2872]: I0325 01:31:18.903280 2872 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:31:18.903655 kubelet[2872]: I0325 01:31:18.903611 2872 server.go:490] "Adding debug handlers to kubelet server" Mar 25 01:31:18.908528 kubelet[2872]: I0325 01:31:18.908421 2872 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 01:31:18.909069 kubelet[2872]: I0325 01:31:18.908979 2872 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:31:18.910497 kubelet[2872]: I0325 01:31:18.910367 2872 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 25 01:31:18.913987 kubelet[2872]: I0325 01:31:18.913935 2872 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 25 01:31:18.914498 kubelet[2872]: E0325 01:31:18.914443 2872 kubelet_node_status.go:467] "Error getting the 
current node from lister" err="node \"ip-172-31-28-242\" not found" Mar 25 01:31:18.914999 kubelet[2872]: I0325 01:31:18.914953 2872 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 25 01:31:18.915116 kubelet[2872]: I0325 01:31:18.915058 2872 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:31:18.919112 kubelet[2872]: W0325 01:31:18.919014 2872 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.28.242:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.28.242:6443: connect: connection refused Mar 25 01:31:18.919505 kubelet[2872]: E0325 01:31:18.919116 2872 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.31.28.242:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.28.242:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:31:18.920464 kubelet[2872]: E0325 01:31:18.920026 2872 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.28.242:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-28-242?timeout=10s\": dial tcp 172.31.28.242:6443: connect: connection refused" interval="200ms" Mar 25 01:31:18.920799 kubelet[2872]: I0325 01:31:18.920741 2872 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:31:18.921267 kubelet[2872]: I0325 01:31:18.920915 2872 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:31:18.924585 kubelet[2872]: I0325 01:31:18.924345 2872 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:31:18.931393 kubelet[2872]: E0325 01:31:18.931047 2872 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 01:31:18.947392 kubelet[2872]: I0325 01:31:18.947329 2872 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 01:31:18.950144 kubelet[2872]: I0325 01:31:18.950055 2872 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 25 01:31:18.950144 kubelet[2872]: I0325 01:31:18.950099 2872 status_manager.go:227] "Starting to sync pod status with apiserver" Mar 25 01:31:18.950364 kubelet[2872]: I0325 01:31:18.950225 2872 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
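Every reflector, lease, and event failure above is the same underlying condition: the kubelet is dialing https://172.31.28.242:6443 before the kube-apiserver static pod has started listening, so the TCP connects are refused. A minimal probe sketch (the address is taken from the log; everything else is assumed) that reproduces that dial:

// apiprobe.go — illustrative only: perform the same TCP dial the failing
// reflectors attempt against the local control-plane endpoint seen above.
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	conn, err := net.DialTimeout("tcp", "172.31.28.242:6443", 2*time.Second)
	if err != nil {
		// Until the kube-apiserver static pod is running this prints
		// "connect: connection refused", matching the reflector errors above.
		fmt.Println("apiserver not reachable:", err)
		return
	}
	defer conn.Close()
	fmt.Println("apiserver port is accepting connections")
}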
Mar 25 01:31:18.950364 kubelet[2872]: I0325 01:31:18.950245 2872 kubelet.go:2388] "Starting kubelet main sync loop" Mar 25 01:31:18.950364 kubelet[2872]: E0325 01:31:18.950314 2872 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:31:18.960903 kubelet[2872]: W0325 01:31:18.960348 2872 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.28.242:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.28.242:6443: connect: connection refused Mar 25 01:31:18.960903 kubelet[2872]: E0325 01:31:18.960470 2872 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.31.28.242:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.28.242:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:31:18.972644 kubelet[2872]: I0325 01:31:18.972508 2872 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 25 01:31:18.972644 kubelet[2872]: I0325 01:31:18.972570 2872 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 25 01:31:18.972644 kubelet[2872]: I0325 01:31:18.972607 2872 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:31:18.975323 kubelet[2872]: I0325 01:31:18.974815 2872 policy_none.go:49] "None policy: Start" Mar 25 01:31:18.975323 kubelet[2872]: I0325 01:31:18.974855 2872 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 25 01:31:18.975323 kubelet[2872]: I0325 01:31:18.974883 2872 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:31:18.986025 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 25 01:31:19.002971 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 25 01:31:19.009984 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 25 01:31:19.015377 kubelet[2872]: E0325 01:31:19.015339 2872 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ip-172-31-28-242\" not found" Mar 25 01:31:19.024236 kubelet[2872]: I0325 01:31:19.023375 2872 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:31:19.024236 kubelet[2872]: I0325 01:31:19.023715 2872 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 25 01:31:19.024236 kubelet[2872]: I0325 01:31:19.023738 2872 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:31:19.024236 kubelet[2872]: I0325 01:31:19.024216 2872 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:31:19.028049 kubelet[2872]: E0325 01:31:19.027811 2872 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 25 01:31:19.028049 kubelet[2872]: E0325 01:31:19.027903 2872 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-172-31-28-242\" not found" Mar 25 01:31:19.072102 systemd[1]: Created slice kubepods-burstable-podf7cae47c4ed49f452b4b6db282f4d1f6.slice - libcontainer container kubepods-burstable-podf7cae47c4ed49f452b4b6db282f4d1f6.slice. 
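The kubepods-burstable.slice and kubepods-besteffort.slice cgroups created above correspond to pod QoS classes; which slice a pod lands in follows the documented QoS rules, restated here as a simplified sketch (not the kubelet's actual qos code, and the example resource values are illustrative):

// qos.go — simplified restatement of the pod QoS rules that decide which
// kubepods-*.slice a pod is placed under.
package main

import "fmt"

type resources struct {
	requests map[string]string // e.g. {"cpu": "200m", "memory": "128Mi"}
	limits   map[string]string
}

func qosClass(containers []resources) string {
	anySet := false
	allGuaranteed := true
	for _, r := range containers {
		if len(r.requests) > 0 || len(r.limits) > 0 {
			anySet = true
		}
		for _, res := range []string{"cpu", "memory"} {
			if r.requests[res] == "" || r.requests[res] != r.limits[res] {
				allGuaranteed = false
			}
		}
	}
	switch {
	case !anySet:
		return "BestEffort" // kubepods-besteffort.slice
	case allGuaranteed:
		return "Guaranteed" // kubepods-pod<uid>.slice directly under kubepods.slice
	default:
		return "Burstable" // kubepods-burstable.slice
	}
}

func main() {
	// Requests-only (values illustrative), as is typical for control-plane static pods,
	// which the log shows under kubepods-burstable-pod<uid>.slice.
	staticPod := []resources{{requests: map[string]string{"cpu": "200m"}}}
	fmt.Println(qosClass(staticPod)) // Burstable
}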
Mar 25 01:31:19.096995 kubelet[2872]: E0325 01:31:19.096780 2872 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-242\" not found" node="ip-172-31-28-242" Mar 25 01:31:19.103003 systemd[1]: Created slice kubepods-burstable-pod92b74ee8afc80a99a6ce1b912146a389.slice - libcontainer container kubepods-burstable-pod92b74ee8afc80a99a6ce1b912146a389.slice. Mar 25 01:31:19.109373 kubelet[2872]: E0325 01:31:19.109325 2872 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-242\" not found" node="ip-172-31-28-242" Mar 25 01:31:19.112814 systemd[1]: Created slice kubepods-burstable-pod338a11a248ff52857bd67f7a49bd0856.slice - libcontainer container kubepods-burstable-pod338a11a248ff52857bd67f7a49bd0856.slice. Mar 25 01:31:19.116508 kubelet[2872]: I0325 01:31:19.116448 2872 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/92b74ee8afc80a99a6ce1b912146a389-kubeconfig\") pod \"kube-scheduler-ip-172-31-28-242\" (UID: \"92b74ee8afc80a99a6ce1b912146a389\") " pod="kube-system/kube-scheduler-ip-172-31-28-242" Mar 25 01:31:19.116508 kubelet[2872]: I0325 01:31:19.116509 2872 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/338a11a248ff52857bd67f7a49bd0856-ca-certs\") pod \"kube-apiserver-ip-172-31-28-242\" (UID: \"338a11a248ff52857bd67f7a49bd0856\") " pod="kube-system/kube-apiserver-ip-172-31-28-242" Mar 25 01:31:19.116703 kubelet[2872]: I0325 01:31:19.116548 2872 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/338a11a248ff52857bd67f7a49bd0856-k8s-certs\") pod \"kube-apiserver-ip-172-31-28-242\" (UID: \"338a11a248ff52857bd67f7a49bd0856\") " pod="kube-system/kube-apiserver-ip-172-31-28-242" Mar 25 01:31:19.116703 kubelet[2872]: I0325 01:31:19.116583 2872 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/338a11a248ff52857bd67f7a49bd0856-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-28-242\" (UID: \"338a11a248ff52857bd67f7a49bd0856\") " pod="kube-system/kube-apiserver-ip-172-31-28-242" Mar 25 01:31:19.116703 kubelet[2872]: I0325 01:31:19.116624 2872 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f7cae47c4ed49f452b4b6db282f4d1f6-ca-certs\") pod \"kube-controller-manager-ip-172-31-28-242\" (UID: \"f7cae47c4ed49f452b4b6db282f4d1f6\") " pod="kube-system/kube-controller-manager-ip-172-31-28-242" Mar 25 01:31:19.116703 kubelet[2872]: I0325 01:31:19.116659 2872 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/f7cae47c4ed49f452b4b6db282f4d1f6-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-28-242\" (UID: \"f7cae47c4ed49f452b4b6db282f4d1f6\") " pod="kube-system/kube-controller-manager-ip-172-31-28-242" Mar 25 01:31:19.116703 kubelet[2872]: I0325 01:31:19.116696 2872 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/f7cae47c4ed49f452b4b6db282f4d1f6-kubeconfig\") pod \"kube-controller-manager-ip-172-31-28-242\" (UID: \"f7cae47c4ed49f452b4b6db282f4d1f6\") " pod="kube-system/kube-controller-manager-ip-172-31-28-242" Mar 25 01:31:19.116942 kubelet[2872]: I0325 01:31:19.116732 2872 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f7cae47c4ed49f452b4b6db282f4d1f6-k8s-certs\") pod \"kube-controller-manager-ip-172-31-28-242\" (UID: \"f7cae47c4ed49f452b4b6db282f4d1f6\") " pod="kube-system/kube-controller-manager-ip-172-31-28-242" Mar 25 01:31:19.116942 kubelet[2872]: I0325 01:31:19.116767 2872 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f7cae47c4ed49f452b4b6db282f4d1f6-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-28-242\" (UID: \"f7cae47c4ed49f452b4b6db282f4d1f6\") " pod="kube-system/kube-controller-manager-ip-172-31-28-242" Mar 25 01:31:19.118084 kubelet[2872]: E0325 01:31:19.117762 2872 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-242\" not found" node="ip-172-31-28-242" Mar 25 01:31:19.121025 kubelet[2872]: E0325 01:31:19.120968 2872 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.28.242:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-28-242?timeout=10s\": dial tcp 172.31.28.242:6443: connect: connection refused" interval="400ms" Mar 25 01:31:19.126841 kubelet[2872]: I0325 01:31:19.126664 2872 kubelet_node_status.go:76] "Attempting to register node" node="ip-172-31-28-242" Mar 25 01:31:19.127595 kubelet[2872]: E0325 01:31:19.127527 2872 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://172.31.28.242:6443/api/v1/nodes\": dial tcp 172.31.28.242:6443: connect: connection refused" node="ip-172-31-28-242" Mar 25 01:31:19.331283 kubelet[2872]: I0325 01:31:19.331088 2872 kubelet_node_status.go:76] "Attempting to register node" node="ip-172-31-28-242" Mar 25 01:31:19.332240 kubelet[2872]: E0325 01:31:19.332148 2872 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://172.31.28.242:6443/api/v1/nodes\": dial tcp 172.31.28.242:6443: connect: connection refused" node="ip-172-31-28-242" Mar 25 01:31:19.398955 containerd[1946]: time="2025-03-25T01:31:19.398887736Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-28-242,Uid:f7cae47c4ed49f452b4b6db282f4d1f6,Namespace:kube-system,Attempt:0,}" Mar 25 01:31:19.411586 containerd[1946]: time="2025-03-25T01:31:19.411461898Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-28-242,Uid:92b74ee8afc80a99a6ce1b912146a389,Namespace:kube-system,Attempt:0,}" Mar 25 01:31:19.418966 containerd[1946]: time="2025-03-25T01:31:19.418891389Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-28-242,Uid:338a11a248ff52857bd67f7a49bd0856,Namespace:kube-system,Attempt:0,}" Mar 25 01:31:19.444166 containerd[1946]: time="2025-03-25T01:31:19.443859491Z" level=info msg="connecting to shim c580487c457f584f7598a644af0116ee4c4fe528896a445c55ccdf470ef21e42" address="unix:///run/containerd/s/7e8543f5cd1a832894a3cf6101ca27943e42a26b95afd5f1f1342b99b0a35958" namespace=k8s.io protocol=ttrpc version=3 Mar 25 
01:31:19.499502 systemd[1]: Started cri-containerd-c580487c457f584f7598a644af0116ee4c4fe528896a445c55ccdf470ef21e42.scope - libcontainer container c580487c457f584f7598a644af0116ee4c4fe528896a445c55ccdf470ef21e42. Mar 25 01:31:19.522707 kubelet[2872]: E0325 01:31:19.522628 2872 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.28.242:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-28-242?timeout=10s\": dial tcp 172.31.28.242:6443: connect: connection refused" interval="800ms" Mar 25 01:31:19.543257 containerd[1946]: time="2025-03-25T01:31:19.542844642Z" level=info msg="connecting to shim 85bb36f8125216ea2857e665e88c52bbdb11c989c5a33fb5c3ea1acb7460c70d" address="unix:///run/containerd/s/0f6d0ed200e8be72c007bc48c6de48b25b534e80063a2e04c71b86721073272e" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:31:19.555364 containerd[1946]: time="2025-03-25T01:31:19.555278594Z" level=info msg="connecting to shim 5df064e84374e595aaf74ffda83fc21a5d93e57aa244c4b4037b51b38e4723b5" address="unix:///run/containerd/s/f57df387c0a5fe00c926d3bbdceb956967472f75174ee638a1388078ca646f86" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:31:19.628689 systemd[1]: Started cri-containerd-5df064e84374e595aaf74ffda83fc21a5d93e57aa244c4b4037b51b38e4723b5.scope - libcontainer container 5df064e84374e595aaf74ffda83fc21a5d93e57aa244c4b4037b51b38e4723b5. Mar 25 01:31:19.646522 systemd[1]: Started cri-containerd-85bb36f8125216ea2857e665e88c52bbdb11c989c5a33fb5c3ea1acb7460c70d.scope - libcontainer container 85bb36f8125216ea2857e665e88c52bbdb11c989c5a33fb5c3ea1acb7460c70d. Mar 25 01:31:19.697632 containerd[1946]: time="2025-03-25T01:31:19.697541479Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-28-242,Uid:f7cae47c4ed49f452b4b6db282f4d1f6,Namespace:kube-system,Attempt:0,} returns sandbox id \"c580487c457f584f7598a644af0116ee4c4fe528896a445c55ccdf470ef21e42\"" Mar 25 01:31:19.710699 containerd[1946]: time="2025-03-25T01:31:19.710640091Z" level=info msg="CreateContainer within sandbox \"c580487c457f584f7598a644af0116ee4c4fe528896a445c55ccdf470ef21e42\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 25 01:31:19.737836 kubelet[2872]: I0325 01:31:19.737780 2872 kubelet_node_status.go:76] "Attempting to register node" node="ip-172-31-28-242" Mar 25 01:31:19.738385 kubelet[2872]: E0325 01:31:19.738334 2872 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://172.31.28.242:6443/api/v1/nodes\": dial tcp 172.31.28.242:6443: connect: connection refused" node="ip-172-31-28-242" Mar 25 01:31:19.744341 containerd[1946]: time="2025-03-25T01:31:19.743651213Z" level=info msg="Container 01a45c8da9f4d80c579622db3220c45f0ed725e47ad4baefd2debddaed56d49f: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:31:19.776440 containerd[1946]: time="2025-03-25T01:31:19.776367930Z" level=info msg="CreateContainer within sandbox \"c580487c457f584f7598a644af0116ee4c4fe528896a445c55ccdf470ef21e42\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"01a45c8da9f4d80c579622db3220c45f0ed725e47ad4baefd2debddaed56d49f\"" Mar 25 01:31:19.779765 containerd[1946]: time="2025-03-25T01:31:19.779699432Z" level=info msg="StartContainer for \"01a45c8da9f4d80c579622db3220c45f0ed725e47ad4baefd2debddaed56d49f\"" Mar 25 01:31:19.782958 containerd[1946]: time="2025-03-25T01:31:19.782896648Z" level=info msg="connecting to shim 
01a45c8da9f4d80c579622db3220c45f0ed725e47ad4baefd2debddaed56d49f" address="unix:///run/containerd/s/7e8543f5cd1a832894a3cf6101ca27943e42a26b95afd5f1f1342b99b0a35958" protocol=ttrpc version=3 Mar 25 01:31:19.806705 containerd[1946]: time="2025-03-25T01:31:19.806624355Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-28-242,Uid:338a11a248ff52857bd67f7a49bd0856,Namespace:kube-system,Attempt:0,} returns sandbox id \"85bb36f8125216ea2857e665e88c52bbdb11c989c5a33fb5c3ea1acb7460c70d\"" Mar 25 01:31:19.810639 containerd[1946]: time="2025-03-25T01:31:19.810562273Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-28-242,Uid:92b74ee8afc80a99a6ce1b912146a389,Namespace:kube-system,Attempt:0,} returns sandbox id \"5df064e84374e595aaf74ffda83fc21a5d93e57aa244c4b4037b51b38e4723b5\"" Mar 25 01:31:19.814209 containerd[1946]: time="2025-03-25T01:31:19.813281720Z" level=info msg="CreateContainer within sandbox \"85bb36f8125216ea2857e665e88c52bbdb11c989c5a33fb5c3ea1acb7460c70d\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 25 01:31:19.816424 containerd[1946]: time="2025-03-25T01:31:19.816338199Z" level=info msg="CreateContainer within sandbox \"5df064e84374e595aaf74ffda83fc21a5d93e57aa244c4b4037b51b38e4723b5\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 25 01:31:19.836496 systemd[1]: Started cri-containerd-01a45c8da9f4d80c579622db3220c45f0ed725e47ad4baefd2debddaed56d49f.scope - libcontainer container 01a45c8da9f4d80c579622db3220c45f0ed725e47ad4baefd2debddaed56d49f. Mar 25 01:31:19.843355 containerd[1946]: time="2025-03-25T01:31:19.843298828Z" level=info msg="Container 217b2d826722019899222dbe7582eac36657c4c63fe4c86d80569e6d80ba3b05: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:31:19.849712 containerd[1946]: time="2025-03-25T01:31:19.849645453Z" level=info msg="Container e839f9ea8c4737c68f1078e660150fe56e5cacc6300afc9ec977655856511fca: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:31:19.863636 containerd[1946]: time="2025-03-25T01:31:19.863134758Z" level=info msg="CreateContainer within sandbox \"85bb36f8125216ea2857e665e88c52bbdb11c989c5a33fb5c3ea1acb7460c70d\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"217b2d826722019899222dbe7582eac36657c4c63fe4c86d80569e6d80ba3b05\"" Mar 25 01:31:19.865612 containerd[1946]: time="2025-03-25T01:31:19.865526817Z" level=info msg="StartContainer for \"217b2d826722019899222dbe7582eac36657c4c63fe4c86d80569e6d80ba3b05\"" Mar 25 01:31:19.869784 containerd[1946]: time="2025-03-25T01:31:19.869390048Z" level=info msg="connecting to shim 217b2d826722019899222dbe7582eac36657c4c63fe4c86d80569e6d80ba3b05" address="unix:///run/containerd/s/0f6d0ed200e8be72c007bc48c6de48b25b534e80063a2e04c71b86721073272e" protocol=ttrpc version=3 Mar 25 01:31:19.877464 containerd[1946]: time="2025-03-25T01:31:19.877236091Z" level=info msg="CreateContainer within sandbox \"5df064e84374e595aaf74ffda83fc21a5d93e57aa244c4b4037b51b38e4723b5\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"e839f9ea8c4737c68f1078e660150fe56e5cacc6300afc9ec977655856511fca\"" Mar 25 01:31:19.879126 containerd[1946]: time="2025-03-25T01:31:19.879032984Z" level=info msg="StartContainer for \"e839f9ea8c4737c68f1078e660150fe56e5cacc6300afc9ec977655856511fca\"" Mar 25 01:31:19.883526 containerd[1946]: time="2025-03-25T01:31:19.882413433Z" level=info msg="connecting to shim 
e839f9ea8c4737c68f1078e660150fe56e5cacc6300afc9ec977655856511fca" address="unix:///run/containerd/s/f57df387c0a5fe00c926d3bbdceb956967472f75174ee638a1388078ca646f86" protocol=ttrpc version=3 Mar 25 01:31:19.931491 systemd[1]: Started cri-containerd-e839f9ea8c4737c68f1078e660150fe56e5cacc6300afc9ec977655856511fca.scope - libcontainer container e839f9ea8c4737c68f1078e660150fe56e5cacc6300afc9ec977655856511fca. Mar 25 01:31:19.944478 systemd[1]: Started cri-containerd-217b2d826722019899222dbe7582eac36657c4c63fe4c86d80569e6d80ba3b05.scope - libcontainer container 217b2d826722019899222dbe7582eac36657c4c63fe4c86d80569e6d80ba3b05. Mar 25 01:31:19.952219 kubelet[2872]: W0325 01:31:19.951972 2872 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.28.242:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.28.242:6443: connect: connection refused Mar 25 01:31:19.952219 kubelet[2872]: E0325 01:31:19.952071 2872 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.31.28.242:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.31.28.242:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:31:20.024419 containerd[1946]: time="2025-03-25T01:31:20.024333914Z" level=info msg="StartContainer for \"01a45c8da9f4d80c579622db3220c45f0ed725e47ad4baefd2debddaed56d49f\" returns successfully" Mar 25 01:31:20.105206 containerd[1946]: time="2025-03-25T01:31:20.104891076Z" level=info msg="StartContainer for \"e839f9ea8c4737c68f1078e660150fe56e5cacc6300afc9ec977655856511fca\" returns successfully" Mar 25 01:31:20.121470 containerd[1946]: time="2025-03-25T01:31:20.121390826Z" level=info msg="StartContainer for \"217b2d826722019899222dbe7582eac36657c4c63fe4c86d80569e6d80ba3b05\" returns successfully" Mar 25 01:31:20.152249 kubelet[2872]: W0325 01:31:20.151857 2872 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.28.242:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.28.242:6443: connect: connection refused Mar 25 01:31:20.152249 kubelet[2872]: E0325 01:31:20.151973 2872 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.31.28.242:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.31.28.242:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:31:20.541169 kubelet[2872]: I0325 01:31:20.541111 2872 kubelet_node_status.go:76] "Attempting to register node" node="ip-172-31-28-242" Mar 25 01:31:21.007768 kubelet[2872]: E0325 01:31:21.007692 2872 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-242\" not found" node="ip-172-31-28-242" Mar 25 01:31:21.019800 kubelet[2872]: E0325 01:31:21.019749 2872 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-242\" not found" node="ip-172-31-28-242" Mar 25 01:31:21.028132 kubelet[2872]: E0325 01:31:21.028084 2872 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-242\" not found" node="ip-172-31-28-242" Mar 25 01:31:22.030711 kubelet[2872]: E0325 01:31:22.030655 2872 
kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-242\" not found" node="ip-172-31-28-242" Mar 25 01:31:22.032367 kubelet[2872]: E0325 01:31:22.032318 2872 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-242\" not found" node="ip-172-31-28-242" Mar 25 01:31:22.032807 kubelet[2872]: E0325 01:31:22.032771 2872 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-242\" not found" node="ip-172-31-28-242" Mar 25 01:31:23.037664 kubelet[2872]: E0325 01:31:23.037485 2872 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-242\" not found" node="ip-172-31-28-242" Mar 25 01:31:23.037664 kubelet[2872]: E0325 01:31:23.037499 2872 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-242\" not found" node="ip-172-31-28-242" Mar 25 01:31:23.628474 update_engine[1930]: I20250325 01:31:23.627266 1930 update_attempter.cc:509] Updating boot flags... Mar 25 01:31:23.791354 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 39 scanned by (udev-worker) (3150) Mar 25 01:31:24.309209 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 39 scanned by (udev-worker) (3152) Mar 25 01:31:24.465764 kubelet[2872]: E0325 01:31:24.463160 2872 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-28-242\" not found" node="ip-172-31-28-242" Mar 25 01:31:25.859615 kubelet[2872]: E0325 01:31:25.859569 2872 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-172-31-28-242\" not found" node="ip-172-31-28-242" Mar 25 01:31:25.886165 kubelet[2872]: I0325 01:31:25.886046 2872 apiserver.go:52] "Watching apiserver" Mar 25 01:31:25.892782 kubelet[2872]: E0325 01:31:25.892628 2872 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ip-172-31-28-242.182fe7a583fc7770 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-28-242,UID:ip-172-31-28-242,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-28-242,},FirstTimestamp:2025-03-25 01:31:18.885652336 +0000 UTC m=+0.926590297,LastTimestamp:2025-03-25 01:31:18.885652336 +0000 UTC m=+0.926590297,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-28-242,}" Mar 25 01:31:25.915801 kubelet[2872]: I0325 01:31:25.915745 2872 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 25 01:31:25.925767 kubelet[2872]: I0325 01:31:25.925436 2872 kubelet_node_status.go:79] "Successfully registered node" node="ip-172-31-28-242" Mar 25 01:31:25.961776 kubelet[2872]: E0325 01:31:25.961389 2872 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ip-172-31-28-242.182fe7a586b0ba0d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-28-242,UID:ip-172-31-28-242,APIVersion:,ResourceVersion:,FieldPath:,},Reason:InvalidDiskCapacity,Message:invalid capacity 0 on image filesystem,Source:EventSource{Component:kubelet,Host:ip-172-31-28-242,},FirstTimestamp:2025-03-25 01:31:18.931020301 +0000 UTC m=+0.971958262,LastTimestamp:2025-03-25 01:31:18.931020301 +0000 UTC m=+0.971958262,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-28-242,}" Mar 25 01:31:25.996364 kubelet[2872]: I0325 01:31:25.996316 2872 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-28-242" Mar 25 01:31:26.014311 kubelet[2872]: E0325 01:31:26.014246 2872 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-ip-172-31-28-242\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ip-172-31-28-242" Mar 25 01:31:26.015370 kubelet[2872]: I0325 01:31:26.015334 2872 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-28-242" Mar 25 01:31:26.021677 kubelet[2872]: E0325 01:31:26.021313 2872 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-ip-172-31-28-242\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ip-172-31-28-242" Mar 25 01:31:26.021677 kubelet[2872]: I0325 01:31:26.021360 2872 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-28-242" Mar 25 01:31:26.025246 kubelet[2872]: E0325 01:31:26.024888 2872 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ip-172-31-28-242\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ip-172-31-28-242" Mar 25 01:31:26.025246 kubelet[2872]: I0325 01:31:26.024936 2872 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-28-242" Mar 25 01:31:26.032212 kubelet[2872]: E0325 01:31:26.032132 2872 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-ip-172-31-28-242\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ip-172-31-28-242" Mar 25 01:31:27.919310 systemd[1]: Reload requested from client PID 3319 ('systemctl') (unit session-7.scope)... Mar 25 01:31:27.919337 systemd[1]: Reloading... Mar 25 01:31:28.138234 zram_generator::config[3373]: No configuration found. Mar 25 01:31:28.372529 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:31:28.589092 kubelet[2872]: I0325 01:31:28.588987 2872 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-28-242" Mar 25 01:31:28.635959 systemd[1]: Reloading finished in 715 ms. Mar 25 01:31:28.676648 kubelet[2872]: I0325 01:31:28.676385 2872 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:31:28.676660 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:31:28.694210 systemd[1]: kubelet.service: Deactivated successfully. 
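The repeated "no PriorityClass with name system-node-critical was found" mirror-pod failures at 01:31:26 are transient: the built-in priority classes are created by the apiserver shortly after it comes up, and the kubelet retries. A hedged client-go sketch to confirm the class exists; the admin kubeconfig path is an assumption, not something shown in the log:

// prioritycheck.go — hedged sketch: check for the built-in PriorityClass the
// mirror-pod errors above complain about. Assumes client-go is available and an
// admin kubeconfig exists at the conventional kubeadm path below.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	pc, err := cs.SchedulingV1().PriorityClasses().Get(context.TODO(), "system-node-critical", metav1.GetOptions{})
	if err != nil {
		// While this fails, static-pod mirror creation is forbidden, as logged above.
		fmt.Println("not created yet:", err)
		return
	}
	fmt.Printf("%s value=%d\n", pc.Name, pc.Value)
}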
Mar 25 01:31:28.694773 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:31:28.694875 systemd[1]: kubelet.service: Consumed 1.786s CPU time, 123.7M memory peak. Mar 25 01:31:28.699764 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:31:29.070440 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:31:29.093206 (kubelet)[3424]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:31:29.202802 kubelet[3424]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:31:29.202802 kubelet[3424]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 25 01:31:29.202802 kubelet[3424]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:31:29.204003 kubelet[3424]: I0325 01:31:29.203203 3424 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:31:29.216207 kubelet[3424]: I0325 01:31:29.215672 3424 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" Mar 25 01:31:29.216207 kubelet[3424]: I0325 01:31:29.215719 3424 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:31:29.216207 kubelet[3424]: I0325 01:31:29.216158 3424 server.go:954] "Client rotation is on, will bootstrap in background" Mar 25 01:31:29.218828 kubelet[3424]: I0325 01:31:29.218784 3424 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 25 01:31:29.229085 kubelet[3424]: I0325 01:31:29.229040 3424 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:31:29.238622 kubelet[3424]: I0325 01:31:29.238233 3424 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 25 01:31:29.245050 kubelet[3424]: I0325 01:31:29.244989 3424 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 25 01:31:29.245671 kubelet[3424]: I0325 01:31:29.245464 3424 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:31:29.245857 kubelet[3424]: I0325 01:31:29.245531 3424 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-28-242","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 25 01:31:29.246025 kubelet[3424]: I0325 01:31:29.245861 3424 topology_manager.go:138] "Creating topology manager with none policy" Mar 25 01:31:29.246025 kubelet[3424]: I0325 01:31:29.245883 3424 container_manager_linux.go:304] "Creating device plugin manager" Mar 25 01:31:29.246025 kubelet[3424]: I0325 01:31:29.245972 3424 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:31:29.246350 kubelet[3424]: I0325 01:31:29.246280 3424 kubelet.go:446] "Attempting to sync node with API server" Mar 25 01:31:29.246350 kubelet[3424]: I0325 01:31:29.246315 3424 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:31:29.246464 kubelet[3424]: I0325 01:31:29.246358 3424 kubelet.go:352] "Adding apiserver pod source" Mar 25 01:31:29.246464 kubelet[3424]: I0325 01:31:29.246405 3424 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:31:29.252565 kubelet[3424]: I0325 01:31:29.252478 3424 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:31:29.253466 kubelet[3424]: I0325 01:31:29.253406 3424 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 25 01:31:29.255170 kubelet[3424]: I0325 01:31:29.254347 3424 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 25 01:31:29.255170 kubelet[3424]: I0325 01:31:29.254401 3424 server.go:1287] "Started kubelet" Mar 25 01:31:29.267203 kubelet[3424]: I0325 01:31:29.267103 3424 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:31:29.274995 kubelet[3424]: I0325 01:31:29.274406 3424 server.go:169] "Starting to 
listen" address="0.0.0.0" port=10250 Mar 25 01:31:29.279132 kubelet[3424]: I0325 01:31:29.279078 3424 server.go:490] "Adding debug handlers to kubelet server" Mar 25 01:31:29.300923 kubelet[3424]: I0325 01:31:29.299480 3424 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 01:31:29.300923 kubelet[3424]: I0325 01:31:29.299874 3424 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:31:29.300923 kubelet[3424]: I0325 01:31:29.300264 3424 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 25 01:31:29.304480 kubelet[3424]: I0325 01:31:29.304432 3424 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 25 01:31:29.305253 kubelet[3424]: E0325 01:31:29.305164 3424 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ip-172-31-28-242\" not found" Mar 25 01:31:29.312345 kubelet[3424]: I0325 01:31:29.312298 3424 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 25 01:31:29.312647 kubelet[3424]: I0325 01:31:29.312614 3424 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:31:29.317148 kubelet[3424]: I0325 01:31:29.317086 3424 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 01:31:29.347992 kubelet[3424]: I0325 01:31:29.345547 3424 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 25 01:31:29.347992 kubelet[3424]: I0325 01:31:29.345623 3424 status_manager.go:227] "Starting to sync pod status with apiserver" Mar 25 01:31:29.347992 kubelet[3424]: I0325 01:31:29.345664 3424 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 25 01:31:29.347992 kubelet[3424]: I0325 01:31:29.345707 3424 kubelet.go:2388] "Starting kubelet main sync loop" Mar 25 01:31:29.347992 kubelet[3424]: E0325 01:31:29.345828 3424 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:31:29.390323 kubelet[3424]: I0325 01:31:29.390081 3424 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:31:29.393697 kubelet[3424]: I0325 01:31:29.393646 3424 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:31:29.420228 kubelet[3424]: E0325 01:31:29.418603 3424 kubelet.go:1561] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 01:31:29.422252 kubelet[3424]: I0325 01:31:29.420812 3424 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:31:29.446228 kubelet[3424]: E0325 01:31:29.446097 3424 kubelet.go:2412] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 25 01:31:29.544127 kubelet[3424]: I0325 01:31:29.542801 3424 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 25 01:31:29.544127 kubelet[3424]: I0325 01:31:29.542856 3424 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 25 01:31:29.544127 kubelet[3424]: I0325 01:31:29.542910 3424 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:31:29.544127 kubelet[3424]: I0325 01:31:29.543365 3424 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 25 01:31:29.544127 kubelet[3424]: I0325 01:31:29.543389 3424 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 25 01:31:29.544127 kubelet[3424]: I0325 01:31:29.543437 3424 policy_none.go:49] "None policy: Start" Mar 25 01:31:29.544127 kubelet[3424]: I0325 01:31:29.543455 3424 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 25 01:31:29.544127 kubelet[3424]: I0325 01:31:29.543475 3424 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:31:29.544127 kubelet[3424]: I0325 01:31:29.543655 3424 state_mem.go:75] "Updated machine memory state" Mar 25 01:31:29.557719 kubelet[3424]: I0325 01:31:29.557684 3424 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:31:29.559700 kubelet[3424]: I0325 01:31:29.559503 3424 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 25 01:31:29.559700 kubelet[3424]: I0325 01:31:29.559563 3424 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:31:29.561020 kubelet[3424]: I0325 01:31:29.560972 3424 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:31:29.574733 kubelet[3424]: E0325 01:31:29.573003 3424 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Mar 25 01:31:29.648372 kubelet[3424]: I0325 01:31:29.647513 3424 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-28-242" Mar 25 01:31:29.649061 kubelet[3424]: I0325 01:31:29.649014 3424 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-28-242" Mar 25 01:31:29.652313 kubelet[3424]: I0325 01:31:29.647529 3424 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-28-242" Mar 25 01:31:29.667436 kubelet[3424]: E0325 01:31:29.667354 3424 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-ip-172-31-28-242\" already exists" pod="kube-system/kube-scheduler-ip-172-31-28-242" Mar 25 01:31:29.680425 kubelet[3424]: I0325 01:31:29.679420 3424 kubelet_node_status.go:76] "Attempting to register node" node="ip-172-31-28-242" Mar 25 01:31:29.691354 kubelet[3424]: I0325 01:31:29.691288 3424 kubelet_node_status.go:125] "Node was previously registered" node="ip-172-31-28-242" Mar 25 01:31:29.691524 kubelet[3424]: I0325 01:31:29.691420 3424 kubelet_node_status.go:79] "Successfully registered node" node="ip-172-31-28-242" Mar 25 01:31:29.717256 kubelet[3424]: I0325 01:31:29.717166 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/338a11a248ff52857bd67f7a49bd0856-ca-certs\") pod \"kube-apiserver-ip-172-31-28-242\" (UID: \"338a11a248ff52857bd67f7a49bd0856\") " pod="kube-system/kube-apiserver-ip-172-31-28-242" Mar 25 01:31:29.717473 kubelet[3424]: I0325 01:31:29.717267 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/f7cae47c4ed49f452b4b6db282f4d1f6-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-28-242\" (UID: \"f7cae47c4ed49f452b4b6db282f4d1f6\") " pod="kube-system/kube-controller-manager-ip-172-31-28-242" Mar 25 01:31:29.717473 kubelet[3424]: I0325 01:31:29.717312 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f7cae47c4ed49f452b4b6db282f4d1f6-kubeconfig\") pod \"kube-controller-manager-ip-172-31-28-242\" (UID: \"f7cae47c4ed49f452b4b6db282f4d1f6\") " pod="kube-system/kube-controller-manager-ip-172-31-28-242" Mar 25 01:31:29.717473 kubelet[3424]: I0325 01:31:29.717351 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/92b74ee8afc80a99a6ce1b912146a389-kubeconfig\") pod \"kube-scheduler-ip-172-31-28-242\" (UID: \"92b74ee8afc80a99a6ce1b912146a389\") " pod="kube-system/kube-scheduler-ip-172-31-28-242" Mar 25 01:31:29.717473 kubelet[3424]: I0325 01:31:29.717389 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/338a11a248ff52857bd67f7a49bd0856-k8s-certs\") pod \"kube-apiserver-ip-172-31-28-242\" (UID: \"338a11a248ff52857bd67f7a49bd0856\") " pod="kube-system/kube-apiserver-ip-172-31-28-242" Mar 25 01:31:29.717473 kubelet[3424]: I0325 01:31:29.717428 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/338a11a248ff52857bd67f7a49bd0856-usr-share-ca-certificates\") pod 
\"kube-apiserver-ip-172-31-28-242\" (UID: \"338a11a248ff52857bd67f7a49bd0856\") " pod="kube-system/kube-apiserver-ip-172-31-28-242" Mar 25 01:31:29.717961 kubelet[3424]: I0325 01:31:29.717466 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f7cae47c4ed49f452b4b6db282f4d1f6-ca-certs\") pod \"kube-controller-manager-ip-172-31-28-242\" (UID: \"f7cae47c4ed49f452b4b6db282f4d1f6\") " pod="kube-system/kube-controller-manager-ip-172-31-28-242" Mar 25 01:31:29.717961 kubelet[3424]: I0325 01:31:29.717502 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f7cae47c4ed49f452b4b6db282f4d1f6-k8s-certs\") pod \"kube-controller-manager-ip-172-31-28-242\" (UID: \"f7cae47c4ed49f452b4b6db282f4d1f6\") " pod="kube-system/kube-controller-manager-ip-172-31-28-242" Mar 25 01:31:29.717961 kubelet[3424]: I0325 01:31:29.717545 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f7cae47c4ed49f452b4b6db282f4d1f6-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-28-242\" (UID: \"f7cae47c4ed49f452b4b6db282f4d1f6\") " pod="kube-system/kube-controller-manager-ip-172-31-28-242" Mar 25 01:31:30.247503 kubelet[3424]: I0325 01:31:30.247099 3424 apiserver.go:52] "Watching apiserver" Mar 25 01:31:30.312765 kubelet[3424]: I0325 01:31:30.312587 3424 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 25 01:31:30.469236 kubelet[3424]: I0325 01:31:30.468601 3424 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-28-242" Mar 25 01:31:30.470869 kubelet[3424]: I0325 01:31:30.469049 3424 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-28-242" Mar 25 01:31:30.485763 kubelet[3424]: E0325 01:31:30.485462 3424 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-ip-172-31-28-242\" already exists" pod="kube-system/kube-scheduler-ip-172-31-28-242" Mar 25 01:31:30.490912 kubelet[3424]: E0325 01:31:30.490871 3424 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-ip-172-31-28-242\" already exists" pod="kube-system/kube-apiserver-ip-172-31-28-242" Mar 25 01:31:30.560200 kubelet[3424]: I0325 01:31:30.559902 3424 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ip-172-31-28-242" podStartSLOduration=1.5598787060000001 podStartE2EDuration="1.559878706s" podCreationTimestamp="2025-03-25 01:31:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:31:30.5287066 +0000 UTC m=+1.425139104" watchObservedRunningTime="2025-03-25 01:31:30.559878706 +0000 UTC m=+1.456311210" Mar 25 01:31:30.610604 kubelet[3424]: I0325 01:31:30.610045 3424 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ip-172-31-28-242" podStartSLOduration=2.610020307 podStartE2EDuration="2.610020307s" podCreationTimestamp="2025-03-25 01:31:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:31:30.564004402 +0000 UTC m=+1.460436918" watchObservedRunningTime="2025-03-25 
01:31:30.610020307 +0000 UTC m=+1.506452799" Mar 25 01:31:30.657526 kubelet[3424]: I0325 01:31:30.657434 3424 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ip-172-31-28-242" podStartSLOduration=1.657412247 podStartE2EDuration="1.657412247s" podCreationTimestamp="2025-03-25 01:31:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:31:30.611105236 +0000 UTC m=+1.507537752" watchObservedRunningTime="2025-03-25 01:31:30.657412247 +0000 UTC m=+1.553844811" Mar 25 01:31:34.959985 kubelet[3424]: I0325 01:31:34.959884 3424 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 25 01:31:34.961172 containerd[1946]: time="2025-03-25T01:31:34.961092395Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 25 01:31:34.963619 kubelet[3424]: I0325 01:31:34.961581 3424 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 25 01:31:35.437832 sudo[2288]: pam_unix(sudo:session): session closed for user root Mar 25 01:31:35.462320 sshd[2287]: Connection closed by 147.75.109.163 port 43302 Mar 25 01:31:35.463493 sshd-session[2285]: pam_unix(sshd:session): session closed for user core Mar 25 01:31:35.468947 systemd[1]: sshd@6-172.31.28.242:22-147.75.109.163:43302.service: Deactivated successfully. Mar 25 01:31:35.475003 systemd[1]: session-7.scope: Deactivated successfully. Mar 25 01:31:35.477345 systemd[1]: session-7.scope: Consumed 11.374s CPU time, 228.7M memory peak. Mar 25 01:31:35.486267 systemd-logind[1929]: Session 7 logged out. Waiting for processes to exit. Mar 25 01:31:35.496871 systemd-logind[1929]: Removed session 7. 
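The pod_startup_latency_tracker values above are simple differences: podStartSLOduration = watchObservedRunningTime − podCreationTimestamp (the pull timestamps are zero for static pods), e.g. 01:31:30.559878706 − 01:31:29.000000000 = 1.559878706 s for kube-apiserver. A short sketch redoing that arithmetic with the timestamps copied from the log:

// slomath.go — redo the pod_startup_latency_tracker arithmetic from the entries above.
package main

import (
	"fmt"
	"time"
)

// layout matches the timestamps the kubelet prints in the entries above.
const layout = "2006-01-02 15:04:05 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-03-25 01:31:29 +0000 UTC")
	observed := mustParse("2025-03-25 01:31:30.559878706 +0000 UTC")
	// Prints the podStartSLOduration logged for kube-apiserver above (≈1.559878706 s).
	fmt.Println(observed.Sub(created).Seconds())
}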
Mar 25 01:31:35.556298 kubelet[3424]: I0325 01:31:35.555592 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/dd7fff82-02cf-427a-b906-4910358c21a1-xtables-lock\") pod \"kube-proxy-dgcp6\" (UID: \"dd7fff82-02cf-427a-b906-4910358c21a1\") " pod="kube-system/kube-proxy-dgcp6" Mar 25 01:31:35.556298 kubelet[3424]: I0325 01:31:35.555666 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w82tg\" (UniqueName: \"kubernetes.io/projected/dd7fff82-02cf-427a-b906-4910358c21a1-kube-api-access-w82tg\") pod \"kube-proxy-dgcp6\" (UID: \"dd7fff82-02cf-427a-b906-4910358c21a1\") " pod="kube-system/kube-proxy-dgcp6" Mar 25 01:31:35.556298 kubelet[3424]: I0325 01:31:35.555719 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/dd7fff82-02cf-427a-b906-4910358c21a1-kube-proxy\") pod \"kube-proxy-dgcp6\" (UID: \"dd7fff82-02cf-427a-b906-4910358c21a1\") " pod="kube-system/kube-proxy-dgcp6" Mar 25 01:31:35.556298 kubelet[3424]: I0325 01:31:35.555758 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd7fff82-02cf-427a-b906-4910358c21a1-lib-modules\") pod \"kube-proxy-dgcp6\" (UID: \"dd7fff82-02cf-427a-b906-4910358c21a1\") " pod="kube-system/kube-proxy-dgcp6" Mar 25 01:31:35.580125 systemd[1]: Created slice kubepods-besteffort-poddd7fff82_02cf_427a_b906_4910358c21a1.slice - libcontainer container kubepods-besteffort-poddd7fff82_02cf_427a_b906_4910358c21a1.slice. Mar 25 01:31:35.892749 containerd[1946]: time="2025-03-25T01:31:35.892563973Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-dgcp6,Uid:dd7fff82-02cf-427a-b906-4910358c21a1,Namespace:kube-system,Attempt:0,}" Mar 25 01:31:35.955618 containerd[1946]: time="2025-03-25T01:31:35.955236186Z" level=info msg="connecting to shim 4637b417fffc1ac301fc2ba69f61cd0ddfe7abb35ccdc31b2b3711d460e0ff3b" address="unix:///run/containerd/s/7d8d63985f2082b7fa4c5a0fc6d74faf8cbb8de6fa846e0f9ac17c0053e5c747" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:31:35.958804 kubelet[3424]: I0325 01:31:35.958393 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1f7cfbe7-e1b0-4d7e-aae9-36d8a4f58c32-var-lib-calico\") pod \"tigera-operator-ccfc44587-f2mn7\" (UID: \"1f7cfbe7-e1b0-4d7e-aae9-36d8a4f58c32\") " pod="tigera-operator/tigera-operator-ccfc44587-f2mn7" Mar 25 01:31:35.958804 kubelet[3424]: I0325 01:31:35.958594 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxb6m\" (UniqueName: \"kubernetes.io/projected/1f7cfbe7-e1b0-4d7e-aae9-36d8a4f58c32-kube-api-access-cxb6m\") pod \"tigera-operator-ccfc44587-f2mn7\" (UID: \"1f7cfbe7-e1b0-4d7e-aae9-36d8a4f58c32\") " pod="tigera-operator/tigera-operator-ccfc44587-f2mn7" Mar 25 01:31:35.966750 systemd[1]: Created slice kubepods-besteffort-pod1f7cfbe7_e1b0_4d7e_aae9_36d8a4f58c32.slice - libcontainer container kubepods-besteffort-pod1f7cfbe7_e1b0_4d7e_aae9_36d8a4f58c32.slice. 
Mar 25 01:31:36.045229 systemd[1]: Started cri-containerd-4637b417fffc1ac301fc2ba69f61cd0ddfe7abb35ccdc31b2b3711d460e0ff3b.scope - libcontainer container 4637b417fffc1ac301fc2ba69f61cd0ddfe7abb35ccdc31b2b3711d460e0ff3b. Mar 25 01:31:36.139339 containerd[1946]: time="2025-03-25T01:31:36.139288326Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-dgcp6,Uid:dd7fff82-02cf-427a-b906-4910358c21a1,Namespace:kube-system,Attempt:0,} returns sandbox id \"4637b417fffc1ac301fc2ba69f61cd0ddfe7abb35ccdc31b2b3711d460e0ff3b\"" Mar 25 01:31:36.146678 containerd[1946]: time="2025-03-25T01:31:36.146528792Z" level=info msg="CreateContainer within sandbox \"4637b417fffc1ac301fc2ba69f61cd0ddfe7abb35ccdc31b2b3711d460e0ff3b\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 25 01:31:36.173698 containerd[1946]: time="2025-03-25T01:31:36.173562009Z" level=info msg="Container 6f1dc812af4183fbb0f2aa6ced4e0f10ad1f88d75da60f54d0ec96eeedc641fa: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:31:36.194381 containerd[1946]: time="2025-03-25T01:31:36.194260582Z" level=info msg="CreateContainer within sandbox \"4637b417fffc1ac301fc2ba69f61cd0ddfe7abb35ccdc31b2b3711d460e0ff3b\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"6f1dc812af4183fbb0f2aa6ced4e0f10ad1f88d75da60f54d0ec96eeedc641fa\"" Mar 25 01:31:36.197975 containerd[1946]: time="2025-03-25T01:31:36.196492990Z" level=info msg="StartContainer for \"6f1dc812af4183fbb0f2aa6ced4e0f10ad1f88d75da60f54d0ec96eeedc641fa\"" Mar 25 01:31:36.199862 containerd[1946]: time="2025-03-25T01:31:36.199769091Z" level=info msg="connecting to shim 6f1dc812af4183fbb0f2aa6ced4e0f10ad1f88d75da60f54d0ec96eeedc641fa" address="unix:///run/containerd/s/7d8d63985f2082b7fa4c5a0fc6d74faf8cbb8de6fa846e0f9ac17c0053e5c747" protocol=ttrpc version=3 Mar 25 01:31:36.245555 systemd[1]: Started cri-containerd-6f1dc812af4183fbb0f2aa6ced4e0f10ad1f88d75da60f54d0ec96eeedc641fa.scope - libcontainer container 6f1dc812af4183fbb0f2aa6ced4e0f10ad1f88d75da60f54d0ec96eeedc641fa. Mar 25 01:31:36.279832 containerd[1946]: time="2025-03-25T01:31:36.279779098Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-ccfc44587-f2mn7,Uid:1f7cfbe7-e1b0-4d7e-aae9-36d8a4f58c32,Namespace:tigera-operator,Attempt:0,}" Mar 25 01:31:36.329408 containerd[1946]: time="2025-03-25T01:31:36.329348201Z" level=info msg="connecting to shim 1dcd41ee3d2bfec0727b398251bf7ab9dbcbab7c93cfd2caba0d85c1fcf0f40e" address="unix:///run/containerd/s/371c5779e91cc7e5da3c74d3ebff377a411752d38d835888bc4a90272e109609" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:31:36.391802 containerd[1946]: time="2025-03-25T01:31:36.391706519Z" level=info msg="StartContainer for \"6f1dc812af4183fbb0f2aa6ced4e0f10ad1f88d75da60f54d0ec96eeedc641fa\" returns successfully" Mar 25 01:31:36.398414 systemd[1]: Started cri-containerd-1dcd41ee3d2bfec0727b398251bf7ab9dbcbab7c93cfd2caba0d85c1fcf0f40e.scope - libcontainer container 1dcd41ee3d2bfec0727b398251bf7ab9dbcbab7c93cfd2caba0d85c1fcf0f40e. 
Mar 25 01:31:36.521737 containerd[1946]: time="2025-03-25T01:31:36.521653173Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-ccfc44587-f2mn7,Uid:1f7cfbe7-e1b0-4d7e-aae9-36d8a4f58c32,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"1dcd41ee3d2bfec0727b398251bf7ab9dbcbab7c93cfd2caba0d85c1fcf0f40e\"" Mar 25 01:31:36.530226 containerd[1946]: time="2025-03-25T01:31:36.529072266Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 25 01:31:36.628654 kubelet[3424]: I0325 01:31:36.628382 3424 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-dgcp6" podStartSLOduration=1.628360265 podStartE2EDuration="1.628360265s" podCreationTimestamp="2025-03-25 01:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:31:36.528406107 +0000 UTC m=+7.424838623" watchObservedRunningTime="2025-03-25 01:31:36.628360265 +0000 UTC m=+7.524792757" Mar 25 01:31:36.683462 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3767722009.mount: Deactivated successfully. Mar 25 01:31:40.036946 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2644729381.mount: Deactivated successfully. Mar 25 01:31:40.751560 containerd[1946]: time="2025-03-25T01:31:40.751505454Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:40.753721 containerd[1946]: time="2025-03-25T01:31:40.753642676Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=19271115" Mar 25 01:31:40.754729 containerd[1946]: time="2025-03-25T01:31:40.754362400Z" level=info msg="ImageCreate event name:\"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:40.758239 containerd[1946]: time="2025-03-25T01:31:40.758045902Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:40.759603 containerd[1946]: time="2025-03-25T01:31:40.759557146Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"19267110\" in 4.230419921s" Mar 25 01:31:40.759846 containerd[1946]: time="2025-03-25T01:31:40.759714831Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\"" Mar 25 01:31:40.765474 containerd[1946]: time="2025-03-25T01:31:40.764505726Z" level=info msg="CreateContainer within sandbox \"1dcd41ee3d2bfec0727b398251bf7ab9dbcbab7c93cfd2caba0d85c1fcf0f40e\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 25 01:31:40.778123 containerd[1946]: time="2025-03-25T01:31:40.778040080Z" level=info msg="Container d039edf158dc7fbecba0cce38998b7a039642a6c3a70b1e8ad9723aab739e836: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:31:40.794337 containerd[1946]: time="2025-03-25T01:31:40.794270492Z" level=info msg="CreateContainer within sandbox 
\"1dcd41ee3d2bfec0727b398251bf7ab9dbcbab7c93cfd2caba0d85c1fcf0f40e\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"d039edf158dc7fbecba0cce38998b7a039642a6c3a70b1e8ad9723aab739e836\"" Mar 25 01:31:40.797747 containerd[1946]: time="2025-03-25T01:31:40.795896483Z" level=info msg="StartContainer for \"d039edf158dc7fbecba0cce38998b7a039642a6c3a70b1e8ad9723aab739e836\"" Mar 25 01:31:40.803875 containerd[1946]: time="2025-03-25T01:31:40.803805399Z" level=info msg="connecting to shim d039edf158dc7fbecba0cce38998b7a039642a6c3a70b1e8ad9723aab739e836" address="unix:///run/containerd/s/371c5779e91cc7e5da3c74d3ebff377a411752d38d835888bc4a90272e109609" protocol=ttrpc version=3 Mar 25 01:31:40.843485 systemd[1]: Started cri-containerd-d039edf158dc7fbecba0cce38998b7a039642a6c3a70b1e8ad9723aab739e836.scope - libcontainer container d039edf158dc7fbecba0cce38998b7a039642a6c3a70b1e8ad9723aab739e836. Mar 25 01:31:40.906454 containerd[1946]: time="2025-03-25T01:31:40.906361871Z" level=info msg="StartContainer for \"d039edf158dc7fbecba0cce38998b7a039642a6c3a70b1e8ad9723aab739e836\" returns successfully" Mar 25 01:31:48.054505 kubelet[3424]: I0325 01:31:48.054400 3424 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-ccfc44587-f2mn7" podStartSLOduration=8.818884882 podStartE2EDuration="13.054376189s" podCreationTimestamp="2025-03-25 01:31:35 +0000 UTC" firstStartedPulling="2025-03-25 01:31:36.526119295 +0000 UTC m=+7.422551787" lastFinishedPulling="2025-03-25 01:31:40.761610614 +0000 UTC m=+11.658043094" observedRunningTime="2025-03-25 01:31:41.542598296 +0000 UTC m=+12.439030800" watchObservedRunningTime="2025-03-25 01:31:48.054376189 +0000 UTC m=+18.950808693" Mar 25 01:31:48.076092 systemd[1]: Created slice kubepods-besteffort-pod639f69a1_50d6_482c_b6a2_a6da1f050cc4.slice - libcontainer container kubepods-besteffort-pod639f69a1_50d6_482c_b6a2_a6da1f050cc4.slice. 
Mar 25 01:31:48.077459 kubelet[3424]: I0325 01:31:48.076980 3424 status_manager.go:890] "Failed to get status for pod" podUID="639f69a1-50d6-482c-b6a2-a6da1f050cc4" pod="calico-system/calico-typha-6bb86d8ff5-92gzk" err="pods \"calico-typha-6bb86d8ff5-92gzk\" is forbidden: User \"system:node:ip-172-31-28-242\" cannot get resource \"pods\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ip-172-31-28-242' and this object" Mar 25 01:31:48.077459 kubelet[3424]: W0325 01:31:48.077078 3424 reflector.go:569] object-"calico-system"/"tigera-ca-bundle": failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User "system:node:ip-172-31-28-242" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ip-172-31-28-242' and this object Mar 25 01:31:48.077459 kubelet[3424]: E0325 01:31:48.077126 3424 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"tigera-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"tigera-ca-bundle\" is forbidden: User \"system:node:ip-172-31-28-242\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ip-172-31-28-242' and this object" logger="UnhandledError" Mar 25 01:31:48.077459 kubelet[3424]: W0325 01:31:48.077083 3424 reflector.go:569] object-"calico-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ip-172-31-28-242" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ip-172-31-28-242' and this object Mar 25 01:31:48.077459 kubelet[3424]: W0325 01:31:48.077160 3424 reflector.go:569] object-"calico-system"/"typha-certs": failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:ip-172-31-28-242" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ip-172-31-28-242' and this object Mar 25 01:31:48.078541 kubelet[3424]: E0325 01:31:48.077169 3424 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-172-31-28-242\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ip-172-31-28-242' and this object" logger="UnhandledError" Mar 25 01:31:48.079355 kubelet[3424]: E0325 01:31:48.079251 3424 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"typha-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"typha-certs\" is forbidden: User \"system:node:ip-172-31-28-242\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ip-172-31-28-242' and this object" logger="UnhandledError" Mar 25 01:31:48.142639 kubelet[3424]: I0325 01:31:48.142121 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/639f69a1-50d6-482c-b6a2-a6da1f050cc4-tigera-ca-bundle\") pod \"calico-typha-6bb86d8ff5-92gzk\" (UID: \"639f69a1-50d6-482c-b6a2-a6da1f050cc4\") " pod="calico-system/calico-typha-6bb86d8ff5-92gzk" Mar 25 01:31:48.143090 kubelet[3424]: I0325 01:31:48.142936 3424 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qmpk\" (UniqueName: \"kubernetes.io/projected/639f69a1-50d6-482c-b6a2-a6da1f050cc4-kube-api-access-4qmpk\") pod \"calico-typha-6bb86d8ff5-92gzk\" (UID: \"639f69a1-50d6-482c-b6a2-a6da1f050cc4\") " pod="calico-system/calico-typha-6bb86d8ff5-92gzk" Mar 25 01:31:48.143090 kubelet[3424]: I0325 01:31:48.143039 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/639f69a1-50d6-482c-b6a2-a6da1f050cc4-typha-certs\") pod \"calico-typha-6bb86d8ff5-92gzk\" (UID: \"639f69a1-50d6-482c-b6a2-a6da1f050cc4\") " pod="calico-system/calico-typha-6bb86d8ff5-92gzk" Mar 25 01:31:48.299275 systemd[1]: Created slice kubepods-besteffort-pod40826026_4ad9_4c82_beae_ff94b5d3fb06.slice - libcontainer container kubepods-besteffort-pod40826026_4ad9_4c82_beae_ff94b5d3fb06.slice. Mar 25 01:31:48.345411 kubelet[3424]: I0325 01:31:48.345100 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/40826026-4ad9-4c82-beae-ff94b5d3fb06-flexvol-driver-host\") pod \"calico-node-6cfvs\" (UID: \"40826026-4ad9-4c82-beae-ff94b5d3fb06\") " pod="calico-system/calico-node-6cfvs" Mar 25 01:31:48.345411 kubelet[3424]: I0325 01:31:48.345227 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbf9j\" (UniqueName: \"kubernetes.io/projected/40826026-4ad9-4c82-beae-ff94b5d3fb06-kube-api-access-qbf9j\") pod \"calico-node-6cfvs\" (UID: \"40826026-4ad9-4c82-beae-ff94b5d3fb06\") " pod="calico-system/calico-node-6cfvs" Mar 25 01:31:48.345411 kubelet[3424]: I0325 01:31:48.345312 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/40826026-4ad9-4c82-beae-ff94b5d3fb06-policysync\") pod \"calico-node-6cfvs\" (UID: \"40826026-4ad9-4c82-beae-ff94b5d3fb06\") " pod="calico-system/calico-node-6cfvs" Mar 25 01:31:48.345411 kubelet[3424]: I0325 01:31:48.345376 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40826026-4ad9-4c82-beae-ff94b5d3fb06-tigera-ca-bundle\") pod \"calico-node-6cfvs\" (UID: \"40826026-4ad9-4c82-beae-ff94b5d3fb06\") " pod="calico-system/calico-node-6cfvs" Mar 25 01:31:48.345735 kubelet[3424]: I0325 01:31:48.345430 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/40826026-4ad9-4c82-beae-ff94b5d3fb06-var-lib-calico\") pod \"calico-node-6cfvs\" (UID: \"40826026-4ad9-4c82-beae-ff94b5d3fb06\") " pod="calico-system/calico-node-6cfvs" Mar 25 01:31:48.345735 kubelet[3424]: I0325 01:31:48.345470 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/40826026-4ad9-4c82-beae-ff94b5d3fb06-cni-net-dir\") pod \"calico-node-6cfvs\" (UID: \"40826026-4ad9-4c82-beae-ff94b5d3fb06\") " pod="calico-system/calico-node-6cfvs" Mar 25 01:31:48.345735 kubelet[3424]: I0325 01:31:48.345510 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/40826026-4ad9-4c82-beae-ff94b5d3fb06-lib-modules\") pod 
\"calico-node-6cfvs\" (UID: \"40826026-4ad9-4c82-beae-ff94b5d3fb06\") " pod="calico-system/calico-node-6cfvs" Mar 25 01:31:48.345735 kubelet[3424]: I0325 01:31:48.345563 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/40826026-4ad9-4c82-beae-ff94b5d3fb06-node-certs\") pod \"calico-node-6cfvs\" (UID: \"40826026-4ad9-4c82-beae-ff94b5d3fb06\") " pod="calico-system/calico-node-6cfvs" Mar 25 01:31:48.345735 kubelet[3424]: I0325 01:31:48.345611 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/40826026-4ad9-4c82-beae-ff94b5d3fb06-cni-log-dir\") pod \"calico-node-6cfvs\" (UID: \"40826026-4ad9-4c82-beae-ff94b5d3fb06\") " pod="calico-system/calico-node-6cfvs" Mar 25 01:31:48.345995 kubelet[3424]: I0325 01:31:48.345651 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/40826026-4ad9-4c82-beae-ff94b5d3fb06-cni-bin-dir\") pod \"calico-node-6cfvs\" (UID: \"40826026-4ad9-4c82-beae-ff94b5d3fb06\") " pod="calico-system/calico-node-6cfvs" Mar 25 01:31:48.345995 kubelet[3424]: I0325 01:31:48.345693 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/40826026-4ad9-4c82-beae-ff94b5d3fb06-xtables-lock\") pod \"calico-node-6cfvs\" (UID: \"40826026-4ad9-4c82-beae-ff94b5d3fb06\") " pod="calico-system/calico-node-6cfvs" Mar 25 01:31:48.345995 kubelet[3424]: I0325 01:31:48.345733 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/40826026-4ad9-4c82-beae-ff94b5d3fb06-var-run-calico\") pod \"calico-node-6cfvs\" (UID: \"40826026-4ad9-4c82-beae-ff94b5d3fb06\") " pod="calico-system/calico-node-6cfvs" Mar 25 01:31:48.448740 kubelet[3424]: E0325 01:31:48.448702 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.448946 kubelet[3424]: W0325 01:31:48.448920 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.449412 kubelet[3424]: E0325 01:31:48.449052 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.451893 kubelet[3424]: E0325 01:31:48.450521 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.452423 kubelet[3424]: W0325 01:31:48.452106 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.452423 kubelet[3424]: E0325 01:31:48.452173 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.452811 kubelet[3424]: E0325 01:31:48.452785 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.452938 kubelet[3424]: W0325 01:31:48.452913 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.453298 kubelet[3424]: E0325 01:31:48.453071 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.453860 kubelet[3424]: E0325 01:31:48.453603 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.453860 kubelet[3424]: W0325 01:31:48.453631 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.455448 kubelet[3424]: E0325 01:31:48.454774 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.455448 kubelet[3424]: E0325 01:31:48.454968 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.455448 kubelet[3424]: W0325 01:31:48.454991 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.455448 kubelet[3424]: E0325 01:31:48.455052 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.456302 kubelet[3424]: E0325 01:31:48.456266 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.456882 kubelet[3424]: W0325 01:31:48.456724 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.457333 kubelet[3424]: E0325 01:31:48.457106 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.458172 kubelet[3424]: E0325 01:31:48.458133 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.458493 kubelet[3424]: W0325 01:31:48.458373 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.458493 kubelet[3424]: E0325 01:31:48.458451 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.459271 kubelet[3424]: E0325 01:31:48.458971 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.459271 kubelet[3424]: W0325 01:31:48.459017 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.459271 kubelet[3424]: E0325 01:31:48.459242 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.459970 kubelet[3424]: E0325 01:31:48.459818 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.459970 kubelet[3424]: W0325 01:31:48.459849 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.460471 kubelet[3424]: E0325 01:31:48.460265 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.460878 kubelet[3424]: E0325 01:31:48.460664 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.460878 kubelet[3424]: W0325 01:31:48.460692 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.461109 kubelet[3424]: E0325 01:31:48.461079 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.461705 kubelet[3424]: E0325 01:31:48.461511 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.461705 kubelet[3424]: W0325 01:31:48.461538 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.461705 kubelet[3424]: E0325 01:31:48.461604 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.462353 kubelet[3424]: E0325 01:31:48.462223 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.462353 kubelet[3424]: W0325 01:31:48.462252 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.462353 kubelet[3424]: E0325 01:31:48.462314 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.463249 kubelet[3424]: E0325 01:31:48.462894 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.463249 kubelet[3424]: W0325 01:31:48.462922 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.463466 kubelet[3424]: E0325 01:31:48.463254 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.463933 kubelet[3424]: E0325 01:31:48.463648 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.463933 kubelet[3424]: W0325 01:31:48.463678 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.463933 kubelet[3424]: E0325 01:31:48.463794 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.464744 kubelet[3424]: E0325 01:31:48.464544 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.464744 kubelet[3424]: W0325 01:31:48.464575 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.465225 kubelet[3424]: E0325 01:31:48.464962 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.465651 kubelet[3424]: E0325 01:31:48.465456 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.465651 kubelet[3424]: W0325 01:31:48.465483 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.465651 kubelet[3424]: E0325 01:31:48.465552 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.466526 kubelet[3424]: E0325 01:31:48.466369 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.466526 kubelet[3424]: W0325 01:31:48.466400 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.467292 kubelet[3424]: E0325 01:31:48.467076 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.467292 kubelet[3424]: W0325 01:31:48.467105 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.467292 kubelet[3424]: E0325 01:31:48.467251 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.467583 kubelet[3424]: E0325 01:31:48.467307 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.468337 kubelet[3424]: E0325 01:31:48.467953 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.468337 kubelet[3424]: W0325 01:31:48.467987 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.468337 kubelet[3424]: E0325 01:31:48.468104 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.469111 kubelet[3424]: E0325 01:31:48.468839 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.469111 kubelet[3424]: W0325 01:31:48.468870 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.469771 kubelet[3424]: E0325 01:31:48.469538 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.470209 kubelet[3424]: E0325 01:31:48.470009 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.470209 kubelet[3424]: W0325 01:31:48.470038 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.470209 kubelet[3424]: E0325 01:31:48.470108 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.473663 kubelet[3424]: E0325 01:31:48.473397 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.473663 kubelet[3424]: W0325 01:31:48.473436 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.474547 kubelet[3424]: E0325 01:31:48.474300 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.474547 kubelet[3424]: W0325 01:31:48.474334 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.475299 kubelet[3424]: E0325 01:31:48.474989 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.475299 kubelet[3424]: W0325 01:31:48.475018 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.475299 kubelet[3424]: E0325 01:31:48.475049 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.475299 kubelet[3424]: E0325 01:31:48.475104 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.475299 kubelet[3424]: E0325 01:31:48.475146 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.480749 kubelet[3424]: E0325 01:31:48.480550 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.480749 kubelet[3424]: W0325 01:31:48.480584 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.480749 kubelet[3424]: E0325 01:31:48.480646 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.560565 kubelet[3424]: E0325 01:31:48.557923 3424 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wsw9h" podUID="7176352e-f194-4c71-92fb-e7f6c7227404" Mar 25 01:31:48.637274 kubelet[3424]: E0325 01:31:48.636988 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.637274 kubelet[3424]: W0325 01:31:48.637028 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.637274 kubelet[3424]: E0325 01:31:48.637063 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.638830 kubelet[3424]: E0325 01:31:48.638560 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.638830 kubelet[3424]: W0325 01:31:48.638598 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.638830 kubelet[3424]: E0325 01:31:48.638672 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.640784 kubelet[3424]: E0325 01:31:48.640622 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.640784 kubelet[3424]: W0325 01:31:48.640658 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.640784 kubelet[3424]: E0325 01:31:48.640704 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.641283 kubelet[3424]: E0325 01:31:48.641084 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.641283 kubelet[3424]: W0325 01:31:48.641104 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.641283 kubelet[3424]: E0325 01:31:48.641145 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.643146 kubelet[3424]: E0325 01:31:48.641823 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.643146 kubelet[3424]: W0325 01:31:48.641857 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.643146 kubelet[3424]: E0325 01:31:48.641884 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.644340 kubelet[3424]: E0325 01:31:48.644297 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.644441 kubelet[3424]: W0325 01:31:48.644339 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.644441 kubelet[3424]: E0325 01:31:48.644373 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.645000 kubelet[3424]: E0325 01:31:48.644955 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.645000 kubelet[3424]: W0325 01:31:48.644991 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.645287 kubelet[3424]: E0325 01:31:48.645022 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.646580 kubelet[3424]: E0325 01:31:48.646515 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.646580 kubelet[3424]: W0325 01:31:48.646555 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.646872 kubelet[3424]: E0325 01:31:48.646587 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.647252 kubelet[3424]: E0325 01:31:48.647172 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.647252 kubelet[3424]: W0325 01:31:48.647244 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.647428 kubelet[3424]: E0325 01:31:48.647273 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.647666 kubelet[3424]: E0325 01:31:48.647629 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.647666 kubelet[3424]: W0325 01:31:48.647658 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.647801 kubelet[3424]: E0325 01:31:48.647682 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.648250 kubelet[3424]: E0325 01:31:48.648160 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.648250 kubelet[3424]: W0325 01:31:48.648242 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.648443 kubelet[3424]: E0325 01:31:48.648271 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.649004 kubelet[3424]: E0325 01:31:48.648878 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.649004 kubelet[3424]: W0325 01:31:48.648913 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.649004 kubelet[3424]: E0325 01:31:48.648942 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.650147 kubelet[3424]: E0325 01:31:48.650102 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.650147 kubelet[3424]: W0325 01:31:48.650140 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.650381 kubelet[3424]: E0325 01:31:48.650172 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.650997 kubelet[3424]: E0325 01:31:48.650948 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.650997 kubelet[3424]: W0325 01:31:48.650985 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.651277 kubelet[3424]: E0325 01:31:48.651015 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.651796 kubelet[3424]: E0325 01:31:48.651750 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.651796 kubelet[3424]: W0325 01:31:48.651786 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.652202 kubelet[3424]: E0325 01:31:48.651816 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.652848 kubelet[3424]: E0325 01:31:48.652576 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.652848 kubelet[3424]: W0325 01:31:48.652610 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.652848 kubelet[3424]: E0325 01:31:48.652640 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.655438 kubelet[3424]: E0325 01:31:48.655383 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.655438 kubelet[3424]: W0325 01:31:48.655431 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.655657 kubelet[3424]: E0325 01:31:48.655465 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.656388 kubelet[3424]: E0325 01:31:48.656338 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.656388 kubelet[3424]: W0325 01:31:48.656377 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.656599 kubelet[3424]: E0325 01:31:48.656409 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.656844 kubelet[3424]: E0325 01:31:48.656809 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.656844 kubelet[3424]: W0325 01:31:48.656839 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.656976 kubelet[3424]: E0325 01:31:48.656864 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.658397 kubelet[3424]: E0325 01:31:48.658345 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.658397 kubelet[3424]: W0325 01:31:48.658385 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.658611 kubelet[3424]: E0325 01:31:48.658423 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.659048 kubelet[3424]: E0325 01:31:48.659008 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.659048 kubelet[3424]: W0325 01:31:48.659040 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.660518 kubelet[3424]: E0325 01:31:48.659067 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.660518 kubelet[3424]: I0325 01:31:48.659119 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/7176352e-f194-4c71-92fb-e7f6c7227404-varrun\") pod \"csi-node-driver-wsw9h\" (UID: \"7176352e-f194-4c71-92fb-e7f6c7227404\") " pod="calico-system/csi-node-driver-wsw9h" Mar 25 01:31:48.660518 kubelet[3424]: E0325 01:31:48.659517 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.660518 kubelet[3424]: W0325 01:31:48.659538 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.660518 kubelet[3424]: E0325 01:31:48.659561 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.660518 kubelet[3424]: I0325 01:31:48.659598 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7176352e-f194-4c71-92fb-e7f6c7227404-socket-dir\") pod \"csi-node-driver-wsw9h\" (UID: \"7176352e-f194-4c71-92fb-e7f6c7227404\") " pod="calico-system/csi-node-driver-wsw9h" Mar 25 01:31:48.660862 kubelet[3424]: E0325 01:31:48.660668 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.660862 kubelet[3424]: W0325 01:31:48.660698 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.660862 kubelet[3424]: E0325 01:31:48.660733 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.660862 kubelet[3424]: I0325 01:31:48.660779 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plxs8\" (UniqueName: \"kubernetes.io/projected/7176352e-f194-4c71-92fb-e7f6c7227404-kube-api-access-plxs8\") pod \"csi-node-driver-wsw9h\" (UID: \"7176352e-f194-4c71-92fb-e7f6c7227404\") " pod="calico-system/csi-node-driver-wsw9h" Mar 25 01:31:48.662142 kubelet[3424]: E0325 01:31:48.662081 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.662142 kubelet[3424]: W0325 01:31:48.662113 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.662142 kubelet[3424]: E0325 01:31:48.662158 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.662142 kubelet[3424]: I0325 01:31:48.662225 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7176352e-f194-4c71-92fb-e7f6c7227404-kubelet-dir\") pod \"csi-node-driver-wsw9h\" (UID: \"7176352e-f194-4c71-92fb-e7f6c7227404\") " pod="calico-system/csi-node-driver-wsw9h" Mar 25 01:31:48.663612 kubelet[3424]: E0325 01:31:48.663563 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.664427 kubelet[3424]: W0325 01:31:48.664302 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.664427 kubelet[3424]: E0325 01:31:48.664381 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.664871 kubelet[3424]: E0325 01:31:48.664826 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.664871 kubelet[3424]: W0325 01:31:48.664865 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.665017 kubelet[3424]: E0325 01:31:48.664911 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.666806 kubelet[3424]: E0325 01:31:48.666758 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.666806 kubelet[3424]: W0325 01:31:48.666799 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.667085 kubelet[3424]: E0325 01:31:48.666847 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.668587 kubelet[3424]: E0325 01:31:48.668534 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.668587 kubelet[3424]: W0325 01:31:48.668576 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.668774 kubelet[3424]: E0325 01:31:48.668625 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.668774 kubelet[3424]: I0325 01:31:48.668679 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7176352e-f194-4c71-92fb-e7f6c7227404-registration-dir\") pod \"csi-node-driver-wsw9h\" (UID: \"7176352e-f194-4c71-92fb-e7f6c7227404\") " pod="calico-system/csi-node-driver-wsw9h" Mar 25 01:31:48.669207 kubelet[3424]: E0325 01:31:48.669153 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.669318 kubelet[3424]: W0325 01:31:48.669209 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.669318 kubelet[3424]: E0325 01:31:48.669254 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.670585 kubelet[3424]: E0325 01:31:48.670534 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.670585 kubelet[3424]: W0325 01:31:48.670574 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.670899 kubelet[3424]: E0325 01:31:48.670608 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.671135 kubelet[3424]: E0325 01:31:48.671096 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.671135 kubelet[3424]: W0325 01:31:48.671129 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.671315 kubelet[3424]: E0325 01:31:48.671171 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.672678 kubelet[3424]: E0325 01:31:48.672626 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.672678 kubelet[3424]: W0325 01:31:48.672666 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.672969 kubelet[3424]: E0325 01:31:48.672700 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.673205 kubelet[3424]: E0325 01:31:48.673156 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.673297 kubelet[3424]: W0325 01:31:48.673208 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.673297 kubelet[3424]: E0325 01:31:48.673237 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.674564 kubelet[3424]: E0325 01:31:48.674511 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.674564 kubelet[3424]: W0325 01:31:48.674551 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.675084 kubelet[3424]: E0325 01:31:48.674585 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.675084 kubelet[3424]: E0325 01:31:48.675051 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.675084 kubelet[3424]: W0325 01:31:48.675078 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.675333 kubelet[3424]: E0325 01:31:48.675109 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.777355 kubelet[3424]: E0325 01:31:48.776881 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.777355 kubelet[3424]: W0325 01:31:48.776916 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.777355 kubelet[3424]: E0325 01:31:48.776947 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.778677 kubelet[3424]: E0325 01:31:48.778448 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.778677 kubelet[3424]: W0325 01:31:48.778500 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.778677 kubelet[3424]: E0325 01:31:48.778553 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.779952 kubelet[3424]: E0325 01:31:48.779899 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.779952 kubelet[3424]: W0325 01:31:48.779939 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.780449 kubelet[3424]: E0325 01:31:48.779988 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.781531 kubelet[3424]: E0325 01:31:48.781465 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.781531 kubelet[3424]: W0325 01:31:48.781504 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.781862 kubelet[3424]: E0325 01:31:48.781662 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.783053 kubelet[3424]: E0325 01:31:48.783001 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.783053 kubelet[3424]: W0325 01:31:48.783041 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.783464 kubelet[3424]: E0325 01:31:48.783218 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.785158 kubelet[3424]: E0325 01:31:48.785098 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.785158 kubelet[3424]: W0325 01:31:48.785147 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.785840 kubelet[3424]: E0325 01:31:48.785614 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.786315 kubelet[3424]: E0325 01:31:48.786074 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.786315 kubelet[3424]: W0325 01:31:48.786106 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.788482 kubelet[3424]: E0325 01:31:48.788234 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.789209 kubelet[3424]: E0325 01:31:48.788926 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.789209 kubelet[3424]: W0325 01:31:48.788958 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.789608 kubelet[3424]: E0325 01:31:48.789505 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.789838 kubelet[3424]: E0325 01:31:48.789810 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.790115 kubelet[3424]: W0325 01:31:48.789971 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.790541 kubelet[3424]: E0325 01:31:48.790349 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.791089 kubelet[3424]: E0325 01:31:48.790813 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.791089 kubelet[3424]: W0325 01:31:48.790844 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.791355 kubelet[3424]: E0325 01:31:48.791323 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.791724 kubelet[3424]: E0325 01:31:48.791580 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.791724 kubelet[3424]: W0325 01:31:48.791608 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.792165 kubelet[3424]: E0325 01:31:48.791970 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.792714 kubelet[3424]: E0325 01:31:48.792481 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.792714 kubelet[3424]: W0325 01:31:48.792513 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.792937 kubelet[3424]: E0325 01:31:48.792898 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.794443 kubelet[3424]: E0325 01:31:48.793403 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.794443 kubelet[3424]: W0325 01:31:48.793433 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.794443 kubelet[3424]: E0325 01:31:48.793899 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.795499 kubelet[3424]: E0325 01:31:48.795273 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.795499 kubelet[3424]: W0325 01:31:48.795307 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.795499 kubelet[3424]: E0325 01:31:48.795371 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.796786 kubelet[3424]: E0325 01:31:48.796633 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.796786 kubelet[3424]: W0325 01:31:48.796669 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.796786 kubelet[3424]: E0325 01:31:48.796730 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.797977 kubelet[3424]: E0325 01:31:48.797469 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.797977 kubelet[3424]: W0325 01:31:48.797500 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.797977 kubelet[3424]: E0325 01:31:48.797731 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.800624 kubelet[3424]: E0325 01:31:48.800369 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.800624 kubelet[3424]: W0325 01:31:48.800409 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.800624 kubelet[3424]: E0325 01:31:48.800485 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.801564 kubelet[3424]: E0325 01:31:48.801359 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.801564 kubelet[3424]: W0325 01:31:48.801391 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.801564 kubelet[3424]: E0325 01:31:48.801463 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.803937 kubelet[3424]: E0325 01:31:48.803673 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.803937 kubelet[3424]: W0325 01:31:48.803710 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.803937 kubelet[3424]: E0325 01:31:48.803779 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.804540 kubelet[3424]: E0325 01:31:48.804415 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.804540 kubelet[3424]: W0325 01:31:48.804445 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.804540 kubelet[3424]: E0325 01:31:48.804501 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.804955 kubelet[3424]: E0325 01:31:48.804913 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.804955 kubelet[3424]: W0325 01:31:48.804947 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.805462 kubelet[3424]: E0325 01:31:48.805170 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:48.806583 kubelet[3424]: E0325 01:31:48.806528 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.806583 kubelet[3424]: W0325 01:31:48.806569 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.806960 kubelet[3424]: E0325 01:31:48.806639 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.807464 kubelet[3424]: E0325 01:31:48.807430 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.807841 kubelet[3424]: W0325 01:31:48.807625 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.807841 kubelet[3424]: E0325 01:31:48.807721 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.809493 kubelet[3424]: E0325 01:31:48.808871 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.809493 kubelet[3424]: W0325 01:31:48.808909 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.809493 kubelet[3424]: E0325 01:31:48.808959 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:48.810820 kubelet[3424]: E0325 01:31:48.810688 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:48.810820 kubelet[3424]: W0325 01:31:48.810725 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:48.810820 kubelet[3424]: E0325 01:31:48.810758 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.058640 kubelet[3424]: E0325 01:31:49.058156 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.058640 kubelet[3424]: W0325 01:31:49.058217 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.058640 kubelet[3424]: E0325 01:31:49.058255 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:49.245083 kubelet[3424]: E0325 01:31:49.244636 3424 configmap.go:193] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 25 01:31:49.245083 kubelet[3424]: E0325 01:31:49.244761 3424 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/639f69a1-50d6-482c-b6a2-a6da1f050cc4-tigera-ca-bundle podName:639f69a1-50d6-482c-b6a2-a6da1f050cc4 nodeName:}" failed. No retries permitted until 2025-03-25 01:31:49.744730618 +0000 UTC m=+20.641163098 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/639f69a1-50d6-482c-b6a2-a6da1f050cc4-tigera-ca-bundle") pod "calico-typha-6bb86d8ff5-92gzk" (UID: "639f69a1-50d6-482c-b6a2-a6da1f050cc4") : failed to sync configmap cache: timed out waiting for the condition Mar 25 01:31:49.264364 kubelet[3424]: E0325 01:31:49.264285 3424 projected.go:288] Couldn't get configMap calico-system/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 25 01:31:49.264364 kubelet[3424]: E0325 01:31:49.264341 3424 projected.go:194] Error preparing data for projected volume kube-api-access-4qmpk for pod calico-system/calico-typha-6bb86d8ff5-92gzk: failed to sync configmap cache: timed out waiting for the condition Mar 25 01:31:49.264585 kubelet[3424]: E0325 01:31:49.264431 3424 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/639f69a1-50d6-482c-b6a2-a6da1f050cc4-kube-api-access-4qmpk podName:639f69a1-50d6-482c-b6a2-a6da1f050cc4 nodeName:}" failed. No retries permitted until 2025-03-25 01:31:49.76440488 +0000 UTC m=+20.660837360 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4qmpk" (UniqueName: "kubernetes.io/projected/639f69a1-50d6-482c-b6a2-a6da1f050cc4-kube-api-access-4qmpk") pod "calico-typha-6bb86d8ff5-92gzk" (UID: "639f69a1-50d6-482c-b6a2-a6da1f050cc4") : failed to sync configmap cache: timed out waiting for the condition Mar 25 01:31:49.297528 kubelet[3424]: E0325 01:31:49.297489 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.297904 kubelet[3424]: W0325 01:31:49.297673 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.297904 kubelet[3424]: E0325 01:31:49.297713 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.298247 kubelet[3424]: E0325 01:31:49.298222 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.298448 kubelet[3424]: W0325 01:31:49.298342 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.298448 kubelet[3424]: E0325 01:31:49.298374 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:49.375339 kubelet[3424]: E0325 01:31:49.374143 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.375663 kubelet[3424]: W0325 01:31:49.375507 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.375663 kubelet[3424]: E0325 01:31:49.375560 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.386485 kubelet[3424]: E0325 01:31:49.386244 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.386485 kubelet[3424]: W0325 01:31:49.386309 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.386485 kubelet[3424]: E0325 01:31:49.386354 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.400146 kubelet[3424]: E0325 01:31:49.399934 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.400146 kubelet[3424]: W0325 01:31:49.399968 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.400146 kubelet[3424]: E0325 01:31:49.399998 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.400638 kubelet[3424]: E0325 01:31:49.400425 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.400638 kubelet[3424]: W0325 01:31:49.400446 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.400638 kubelet[3424]: E0325 01:31:49.400488 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.429820 kubelet[3424]: E0325 01:31:49.429680 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.429820 kubelet[3424]: W0325 01:31:49.429721 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.429820 kubelet[3424]: E0325 01:31:49.429758 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:49.501839 kubelet[3424]: E0325 01:31:49.501776 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.501839 kubelet[3424]: W0325 01:31:49.501814 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.501839 kubelet[3424]: E0325 01:31:49.501847 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.502636 kubelet[3424]: E0325 01:31:49.502262 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.502636 kubelet[3424]: W0325 01:31:49.502284 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.502636 kubelet[3424]: E0325 01:31:49.502308 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.505558 containerd[1946]: time="2025-03-25T01:31:49.505489078Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6cfvs,Uid:40826026-4ad9-4c82-beae-ff94b5d3fb06,Namespace:calico-system,Attempt:0,}" Mar 25 01:31:49.556681 containerd[1946]: time="2025-03-25T01:31:49.555907452Z" level=info msg="connecting to shim a8d68082db0c0c83e7df118d833a03f0fcce548bef1ca862dbe63a8052b19379" address="unix:///run/containerd/s/01a4e72a53269a09dd05d8d0fb362354aa471a3955bda8dd34f57d432bb949cd" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:31:49.606268 kubelet[3424]: E0325 01:31:49.605093 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.606268 kubelet[3424]: W0325 01:31:49.605339 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.606268 kubelet[3424]: E0325 01:31:49.605516 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.607196 kubelet[3424]: E0325 01:31:49.606843 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.607354 kubelet[3424]: W0325 01:31:49.607172 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.607354 kubelet[3424]: E0325 01:31:49.607268 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.633617 systemd[1]: Started cri-containerd-a8d68082db0c0c83e7df118d833a03f0fcce548bef1ca862dbe63a8052b19379.scope - libcontainer container a8d68082db0c0c83e7df118d833a03f0fcce548bef1ca862dbe63a8052b19379. 
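The repeating kubelet triad above and below (driver-call.go:262, driver-call.go:149, plugins.go:695) all trace back to the FlexVolume prober: on each pass it tries to execute /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument init, that binary is not present on this node ("executable file not found in $PATH"), so the captured stdout is empty, and unmarshalling empty output as JSON fails with "unexpected end of JSON input"; the directory is then skipped, so the noise appears harmless for pod startup here. A minimal sketch of the JSON side of that failure, using only Go's standard encoding/json (the struct below is illustrative, not the kubelet's actual driver-status type):

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // driverStatus is an illustrative stand-in for the status object a
    // FlexVolume driver is expected to print on stdout after "init".
    type driverStatus struct {
        Status  string `json:"status"`
        Message string `json:"message,omitempty"`
    }

    func main() {
        var st driverStatus
        // The driver binary is missing, so the output the kubelet captured is "".
        err := json.Unmarshal([]byte(""), &st)
        fmt.Println(err) // prints: unexpected end of JSON input
    }

The probe is re-run on every volume-reconciler pass, which is why the same three lines repeat for each volume of csi-node-driver-wsw9h.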
Mar 25 01:31:49.709247 kubelet[3424]: E0325 01:31:49.709086 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.709247 kubelet[3424]: W0325 01:31:49.709124 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.709247 kubelet[3424]: E0325 01:31:49.709205 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.710841 kubelet[3424]: E0325 01:31:49.710781 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.710841 kubelet[3424]: W0325 01:31:49.710838 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.711020 kubelet[3424]: E0325 01:31:49.710871 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.721378 containerd[1946]: time="2025-03-25T01:31:49.720636215Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6cfvs,Uid:40826026-4ad9-4c82-beae-ff94b5d3fb06,Namespace:calico-system,Attempt:0,} returns sandbox id \"a8d68082db0c0c83e7df118d833a03f0fcce548bef1ca862dbe63a8052b19379\"" Mar 25 01:31:49.732014 containerd[1946]: time="2025-03-25T01:31:49.731958599Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 25 01:31:49.813052 kubelet[3424]: E0325 01:31:49.812970 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.813052 kubelet[3424]: W0325 01:31:49.813005 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.814021 kubelet[3424]: E0325 01:31:49.813704 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.815162 kubelet[3424]: E0325 01:31:49.815012 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.815162 kubelet[3424]: W0325 01:31:49.815232 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.815983 kubelet[3424]: E0325 01:31:49.815284 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:49.815983 kubelet[3424]: E0325 01:31:49.815948 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.815983 kubelet[3424]: W0325 01:31:49.815974 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.816238 kubelet[3424]: E0325 01:31:49.816014 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.817335 kubelet[3424]: E0325 01:31:49.816673 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.817335 kubelet[3424]: W0325 01:31:49.816792 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.817335 kubelet[3424]: E0325 01:31:49.816829 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.818414 kubelet[3424]: E0325 01:31:49.818332 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.818967 kubelet[3424]: W0325 01:31:49.818520 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.819285 kubelet[3424]: E0325 01:31:49.819107 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.820289 kubelet[3424]: E0325 01:31:49.820051 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.820289 kubelet[3424]: W0325 01:31:49.820169 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.820503 kubelet[3424]: E0325 01:31:49.820305 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.821089 kubelet[3424]: E0325 01:31:49.821053 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.821207 kubelet[3424]: W0325 01:31:49.821087 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.821375 kubelet[3424]: E0325 01:31:49.821312 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:49.821580 kubelet[3424]: E0325 01:31:49.821553 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.821655 kubelet[3424]: W0325 01:31:49.821580 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.821655 kubelet[3424]: E0325 01:31:49.821615 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.822751 kubelet[3424]: E0325 01:31:49.822492 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.822751 kubelet[3424]: W0325 01:31:49.822535 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.822751 kubelet[3424]: E0325 01:31:49.822567 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.823771 kubelet[3424]: E0325 01:31:49.823337 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.823771 kubelet[3424]: W0325 01:31:49.823367 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.823771 kubelet[3424]: E0325 01:31:49.823396 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.824010 kubelet[3424]: E0325 01:31:49.823817 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.824010 kubelet[3424]: W0325 01:31:49.823837 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.824010 kubelet[3424]: E0325 01:31:49.823860 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:31:49.835762 kubelet[3424]: E0325 01:31:49.835705 3424 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:31:49.835762 kubelet[3424]: W0325 01:31:49.835754 3424 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:31:49.835965 kubelet[3424]: E0325 01:31:49.835790 3424 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:31:49.885513 containerd[1946]: time="2025-03-25T01:31:49.885226376Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6bb86d8ff5-92gzk,Uid:639f69a1-50d6-482c-b6a2-a6da1f050cc4,Namespace:calico-system,Attempt:0,}" Mar 25 01:31:49.940166 containerd[1946]: time="2025-03-25T01:31:49.940081319Z" level=info msg="connecting to shim 87f16fde3d9d57797cbc9785652d5439d7c9d507d9aa6a6ff98e8d5398e892e5" address="unix:///run/containerd/s/c3d8265e03560752ed4bc20b6339b1e68bd594d71ff6241c7df95ba29ac92e58" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:31:50.008520 systemd[1]: Started cri-containerd-87f16fde3d9d57797cbc9785652d5439d7c9d507d9aa6a6ff98e8d5398e892e5.scope - libcontainer container 87f16fde3d9d57797cbc9785652d5439d7c9d507d9aa6a6ff98e8d5398e892e5. Mar 25 01:31:50.101409 containerd[1946]: time="2025-03-25T01:31:50.101245705Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6bb86d8ff5-92gzk,Uid:639f69a1-50d6-482c-b6a2-a6da1f050cc4,Namespace:calico-system,Attempt:0,} returns sandbox id \"87f16fde3d9d57797cbc9785652d5439d7c9d507d9aa6a6ff98e8d5398e892e5\"" Mar 25 01:31:50.346986 kubelet[3424]: E0325 01:31:50.346814 3424 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wsw9h" podUID="7176352e-f194-4c71-92fb-e7f6c7227404" Mar 25 01:31:51.301045 containerd[1946]: time="2025-03-25T01:31:51.300637360Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:51.304368 containerd[1946]: time="2025-03-25T01:31:51.304209833Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5120152" Mar 25 01:31:51.306679 containerd[1946]: time="2025-03-25T01:31:51.306527578Z" level=info msg="ImageCreate event name:\"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:51.311610 containerd[1946]: time="2025-03-25T01:31:51.311497615Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:51.313886 containerd[1946]: time="2025-03-25T01:31:51.313693297Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6489869\" in 1.581670889s" Mar 25 01:31:51.313886 containerd[1946]: time="2025-03-25T01:31:51.313758221Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\"" Mar 25 01:31:51.319661 containerd[1946]: time="2025-03-25T01:31:51.318342903Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 25 01:31:51.323553 containerd[1946]: time="2025-03-25T01:31:51.321659964Z" level=info msg="CreateContainer within sandbox 
\"a8d68082db0c0c83e7df118d833a03f0fcce548bef1ca862dbe63a8052b19379\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 25 01:31:51.343808 containerd[1946]: time="2025-03-25T01:31:51.343669769Z" level=info msg="Container cf8cb9d2b39b3e984a2de7143caa2aec83cfec14fd9a1a54e8a6af4a337a387a: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:31:51.374386 containerd[1946]: time="2025-03-25T01:31:51.374318865Z" level=info msg="CreateContainer within sandbox \"a8d68082db0c0c83e7df118d833a03f0fcce548bef1ca862dbe63a8052b19379\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"cf8cb9d2b39b3e984a2de7143caa2aec83cfec14fd9a1a54e8a6af4a337a387a\"" Mar 25 01:31:51.377214 containerd[1946]: time="2025-03-25T01:31:51.375522523Z" level=info msg="StartContainer for \"cf8cb9d2b39b3e984a2de7143caa2aec83cfec14fd9a1a54e8a6af4a337a387a\"" Mar 25 01:31:51.378774 containerd[1946]: time="2025-03-25T01:31:51.378718720Z" level=info msg="connecting to shim cf8cb9d2b39b3e984a2de7143caa2aec83cfec14fd9a1a54e8a6af4a337a387a" address="unix:///run/containerd/s/01a4e72a53269a09dd05d8d0fb362354aa471a3955bda8dd34f57d432bb949cd" protocol=ttrpc version=3 Mar 25 01:31:51.435522 systemd[1]: Started cri-containerd-cf8cb9d2b39b3e984a2de7143caa2aec83cfec14fd9a1a54e8a6af4a337a387a.scope - libcontainer container cf8cb9d2b39b3e984a2de7143caa2aec83cfec14fd9a1a54e8a6af4a337a387a. Mar 25 01:31:51.550075 containerd[1946]: time="2025-03-25T01:31:51.549836663Z" level=info msg="StartContainer for \"cf8cb9d2b39b3e984a2de7143caa2aec83cfec14fd9a1a54e8a6af4a337a387a\" returns successfully" Mar 25 01:31:51.575301 systemd[1]: cri-containerd-cf8cb9d2b39b3e984a2de7143caa2aec83cfec14fd9a1a54e8a6af4a337a387a.scope: Deactivated successfully. Mar 25 01:31:51.585577 containerd[1946]: time="2025-03-25T01:31:51.585433704Z" level=info msg="received exit event container_id:\"cf8cb9d2b39b3e984a2de7143caa2aec83cfec14fd9a1a54e8a6af4a337a387a\" id:\"cf8cb9d2b39b3e984a2de7143caa2aec83cfec14fd9a1a54e8a6af4a337a387a\" pid:4042 exited_at:{seconds:1742866311 nanos:583853074}" Mar 25 01:31:51.588624 containerd[1946]: time="2025-03-25T01:31:51.588529019Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cf8cb9d2b39b3e984a2de7143caa2aec83cfec14fd9a1a54e8a6af4a337a387a\" id:\"cf8cb9d2b39b3e984a2de7143caa2aec83cfec14fd9a1a54e8a6af4a337a387a\" pid:4042 exited_at:{seconds:1742866311 nanos:583853074}" Mar 25 01:31:51.659322 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cf8cb9d2b39b3e984a2de7143caa2aec83cfec14fd9a1a54e8a6af4a337a387a-rootfs.mount: Deactivated successfully. 
Mar 25 01:31:52.347286 kubelet[3424]: E0325 01:31:52.346583 3424 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wsw9h" podUID="7176352e-f194-4c71-92fb-e7f6c7227404" Mar 25 01:31:54.182220 containerd[1946]: time="2025-03-25T01:31:54.180959407Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:54.185007 containerd[1946]: time="2025-03-25T01:31:54.184840330Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=28363957" Mar 25 01:31:54.188020 containerd[1946]: time="2025-03-25T01:31:54.187917655Z" level=info msg="ImageCreate event name:\"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:54.196242 containerd[1946]: time="2025-03-25T01:31:54.195510600Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:54.196902 containerd[1946]: time="2025-03-25T01:31:54.196855691Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"29733706\" in 2.878447397s" Mar 25 01:31:54.197081 containerd[1946]: time="2025-03-25T01:31:54.197050438Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\"" Mar 25 01:31:54.200349 containerd[1946]: time="2025-03-25T01:31:54.200267996Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 25 01:31:54.229797 containerd[1946]: time="2025-03-25T01:31:54.229522742Z" level=info msg="CreateContainer within sandbox \"87f16fde3d9d57797cbc9785652d5439d7c9d507d9aa6a6ff98e8d5398e892e5\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 25 01:31:54.246540 containerd[1946]: time="2025-03-25T01:31:54.242462517Z" level=info msg="Container 2092b3586ed99349d10a091eaabb0a5567263e59204974e4f85d89d135b31029: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:31:54.258452 containerd[1946]: time="2025-03-25T01:31:54.258380690Z" level=info msg="CreateContainer within sandbox \"87f16fde3d9d57797cbc9785652d5439d7c9d507d9aa6a6ff98e8d5398e892e5\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"2092b3586ed99349d10a091eaabb0a5567263e59204974e4f85d89d135b31029\"" Mar 25 01:31:54.259345 containerd[1946]: time="2025-03-25T01:31:54.259282015Z" level=info msg="StartContainer for \"2092b3586ed99349d10a091eaabb0a5567263e59204974e4f85d89d135b31029\"" Mar 25 01:31:54.261572 containerd[1946]: time="2025-03-25T01:31:54.261376023Z" level=info msg="connecting to shim 2092b3586ed99349d10a091eaabb0a5567263e59204974e4f85d89d135b31029" address="unix:///run/containerd/s/c3d8265e03560752ed4bc20b6339b1e68bd594d71ff6241c7df95ba29ac92e58" protocol=ttrpc version=3 Mar 25 01:31:54.300621 systemd[1]: Started 
cri-containerd-2092b3586ed99349d10a091eaabb0a5567263e59204974e4f85d89d135b31029.scope - libcontainer container 2092b3586ed99349d10a091eaabb0a5567263e59204974e4f85d89d135b31029. Mar 25 01:31:54.346854 kubelet[3424]: E0325 01:31:54.346752 3424 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wsw9h" podUID="7176352e-f194-4c71-92fb-e7f6c7227404" Mar 25 01:31:54.400207 containerd[1946]: time="2025-03-25T01:31:54.397937957Z" level=info msg="StartContainer for \"2092b3586ed99349d10a091eaabb0a5567263e59204974e4f85d89d135b31029\" returns successfully" Mar 25 01:31:54.629236 kubelet[3424]: I0325 01:31:54.628054 3424 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6bb86d8ff5-92gzk" podStartSLOduration=2.532514647 podStartE2EDuration="6.628030553s" podCreationTimestamp="2025-03-25 01:31:48 +0000 UTC" firstStartedPulling="2025-03-25 01:31:50.104220528 +0000 UTC m=+21.000653020" lastFinishedPulling="2025-03-25 01:31:54.199736446 +0000 UTC m=+25.096168926" observedRunningTime="2025-03-25 01:31:54.62742945 +0000 UTC m=+25.523861978" watchObservedRunningTime="2025-03-25 01:31:54.628030553 +0000 UTC m=+25.524463045" Mar 25 01:31:55.604301 kubelet[3424]: I0325 01:31:55.603976 3424 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:31:56.348680 kubelet[3424]: E0325 01:31:56.348622 3424 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wsw9h" podUID="7176352e-f194-4c71-92fb-e7f6c7227404" Mar 25 01:31:58.347460 kubelet[3424]: E0325 01:31:58.347367 3424 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wsw9h" podUID="7176352e-f194-4c71-92fb-e7f6c7227404" Mar 25 01:31:59.122146 containerd[1946]: time="2025-03-25T01:31:59.122088250Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:59.124017 containerd[1946]: time="2025-03-25T01:31:59.123914636Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=91227396" Mar 25 01:31:59.127249 containerd[1946]: time="2025-03-25T01:31:59.125973430Z" level=info msg="ImageCreate event name:\"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:59.138496 containerd[1946]: time="2025-03-25T01:31:59.138435544Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:31:59.140370 containerd[1946]: time="2025-03-25T01:31:59.140312617Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest 
\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"92597153\" in 4.939656351s" Mar 25 01:31:59.140587 containerd[1946]: time="2025-03-25T01:31:59.140552209Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\"" Mar 25 01:31:59.145722 containerd[1946]: time="2025-03-25T01:31:59.145621401Z" level=info msg="CreateContainer within sandbox \"a8d68082db0c0c83e7df118d833a03f0fcce548bef1ca862dbe63a8052b19379\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 25 01:31:59.165210 containerd[1946]: time="2025-03-25T01:31:59.163417511Z" level=info msg="Container 618cab471bb36dde5aa42e53e31852134367a5dfab5a042f42fc3a306ccc5406: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:31:59.170232 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2480991940.mount: Deactivated successfully. Mar 25 01:31:59.185529 containerd[1946]: time="2025-03-25T01:31:59.185444360Z" level=info msg="CreateContainer within sandbox \"a8d68082db0c0c83e7df118d833a03f0fcce548bef1ca862dbe63a8052b19379\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"618cab471bb36dde5aa42e53e31852134367a5dfab5a042f42fc3a306ccc5406\"" Mar 25 01:31:59.186614 containerd[1946]: time="2025-03-25T01:31:59.186469044Z" level=info msg="StartContainer for \"618cab471bb36dde5aa42e53e31852134367a5dfab5a042f42fc3a306ccc5406\"" Mar 25 01:31:59.191291 containerd[1946]: time="2025-03-25T01:31:59.191212023Z" level=info msg="connecting to shim 618cab471bb36dde5aa42e53e31852134367a5dfab5a042f42fc3a306ccc5406" address="unix:///run/containerd/s/01a4e72a53269a09dd05d8d0fb362354aa471a3955bda8dd34f57d432bb949cd" protocol=ttrpc version=3 Mar 25 01:31:59.238633 systemd[1]: Started cri-containerd-618cab471bb36dde5aa42e53e31852134367a5dfab5a042f42fc3a306ccc5406.scope - libcontainer container 618cab471bb36dde5aa42e53e31852134367a5dfab5a042f42fc3a306ccc5406. Mar 25 01:31:59.335419 containerd[1946]: time="2025-03-25T01:31:59.335324923Z" level=info msg="StartContainer for \"618cab471bb36dde5aa42e53e31852134367a5dfab5a042f42fc3a306ccc5406\" returns successfully" Mar 25 01:32:00.290769 containerd[1946]: time="2025-03-25T01:32:00.290674012Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 25 01:32:00.294580 systemd[1]: cri-containerd-618cab471bb36dde5aa42e53e31852134367a5dfab5a042f42fc3a306ccc5406.scope: Deactivated successfully. Mar 25 01:32:00.295859 systemd[1]: cri-containerd-618cab471bb36dde5aa42e53e31852134367a5dfab5a042f42fc3a306ccc5406.scope: Consumed 957ms CPU time, 170.4M memory peak, 150.3M written to disk. 
Mar 25 01:32:00.300122 containerd[1946]: time="2025-03-25T01:32:00.300065746Z" level=info msg="received exit event container_id:\"618cab471bb36dde5aa42e53e31852134367a5dfab5a042f42fc3a306ccc5406\" id:\"618cab471bb36dde5aa42e53e31852134367a5dfab5a042f42fc3a306ccc5406\" pid:4142 exited_at:{seconds:1742866320 nanos:299708841}" Mar 25 01:32:00.300520 containerd[1946]: time="2025-03-25T01:32:00.300461404Z" level=info msg="TaskExit event in podsandbox handler container_id:\"618cab471bb36dde5aa42e53e31852134367a5dfab5a042f42fc3a306ccc5406\" id:\"618cab471bb36dde5aa42e53e31852134367a5dfab5a042f42fc3a306ccc5406\" pid:4142 exited_at:{seconds:1742866320 nanos:299708841}" Mar 25 01:32:00.342917 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-618cab471bb36dde5aa42e53e31852134367a5dfab5a042f42fc3a306ccc5406-rootfs.mount: Deactivated successfully. Mar 25 01:32:00.349137 kubelet[3424]: E0325 01:32:00.346367 3424 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wsw9h" podUID="7176352e-f194-4c71-92fb-e7f6c7227404" Mar 25 01:32:00.355773 kubelet[3424]: I0325 01:32:00.355542 3424 kubelet_node_status.go:502] "Fast updating node status as it just became ready" Mar 25 01:32:00.473037 kubelet[3424]: I0325 01:32:00.472926 3424 status_manager.go:890] "Failed to get status for pod" podUID="4ae8b6c3-7961-411a-b8a7-db8f68e68f33" pod="kube-system/coredns-668d6bf9bc-cpcqx" err="pods \"coredns-668d6bf9bc-cpcqx\" is forbidden: User \"system:node:ip-172-31-28-242\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-172-31-28-242' and this object" Mar 25 01:32:00.473944 systemd[1]: Created slice kubepods-burstable-pod4ae8b6c3_7961_411a_b8a7_db8f68e68f33.slice - libcontainer container kubepods-burstable-pod4ae8b6c3_7961_411a_b8a7_db8f68e68f33.slice. Mar 25 01:32:00.482319 kubelet[3424]: W0325 01:32:00.479722 3424 reflector.go:569] object-"kube-system"/"coredns": failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:ip-172-31-28-242" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'ip-172-31-28-242' and this object Mar 25 01:32:00.482319 kubelet[3424]: E0325 01:32:00.479783 3424 reflector.go:166] "Unhandled Error" err="object-\"kube-system\"/\"coredns\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"coredns\" is forbidden: User \"system:node:ip-172-31-28-242\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-172-31-28-242' and this object" logger="UnhandledError" Mar 25 01:32:00.496791 systemd[1]: Created slice kubepods-besteffort-pode21ef940_4b30_4e3d_ac52_96c893706884.slice - libcontainer container kubepods-besteffort-pode21ef940_4b30_4e3d_ac52_96c893706884.slice. 
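
The exited_at field in the TaskExit events above is a Unix timestamp with a separate nanosecond component; converting it back shows it is the same instant the journal stamped the entry (Mar 25 01:32:00.300). For example, in Python:

    from datetime import datetime, timezone

    seconds, nanos = 1742866320, 299708841          # exited_at from the TaskExit event
    ts = datetime.fromtimestamp(seconds, tz=timezone.utc)
    print(ts.isoformat(), f"+{nanos}ns")            # 2025-03-25T01:32:00+00:00 +299708841ns
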
Mar 25 01:32:00.499842 kubelet[3424]: I0325 01:32:00.498046 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxtf5\" (UniqueName: \"kubernetes.io/projected/e21ef940-4b30-4e3d-ac52-96c893706884-kube-api-access-dxtf5\") pod \"calico-kube-controllers-755766d89b-p6jm5\" (UID: \"e21ef940-4b30-4e3d-ac52-96c893706884\") " pod="calico-system/calico-kube-controllers-755766d89b-p6jm5" Mar 25 01:32:00.499842 kubelet[3424]: I0325 01:32:00.498112 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5kgb\" (UniqueName: \"kubernetes.io/projected/9cd52328-d1d7-42d0-962c-7cd987215243-kube-api-access-p5kgb\") pod \"coredns-668d6bf9bc-tp7ll\" (UID: \"9cd52328-d1d7-42d0-962c-7cd987215243\") " pod="kube-system/coredns-668d6bf9bc-tp7ll" Mar 25 01:32:00.499842 kubelet[3424]: I0325 01:32:00.498153 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9cd52328-d1d7-42d0-962c-7cd987215243-config-volume\") pod \"coredns-668d6bf9bc-tp7ll\" (UID: \"9cd52328-d1d7-42d0-962c-7cd987215243\") " pod="kube-system/coredns-668d6bf9bc-tp7ll" Mar 25 01:32:00.499842 kubelet[3424]: I0325 01:32:00.498256 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ae8b6c3-7961-411a-b8a7-db8f68e68f33-config-volume\") pod \"coredns-668d6bf9bc-cpcqx\" (UID: \"4ae8b6c3-7961-411a-b8a7-db8f68e68f33\") " pod="kube-system/coredns-668d6bf9bc-cpcqx" Mar 25 01:32:00.499842 kubelet[3424]: I0325 01:32:00.498309 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e21ef940-4b30-4e3d-ac52-96c893706884-tigera-ca-bundle\") pod \"calico-kube-controllers-755766d89b-p6jm5\" (UID: \"e21ef940-4b30-4e3d-ac52-96c893706884\") " pod="calico-system/calico-kube-controllers-755766d89b-p6jm5" Mar 25 01:32:00.500221 kubelet[3424]: I0325 01:32:00.498351 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwx46\" (UniqueName: \"kubernetes.io/projected/4ae8b6c3-7961-411a-b8a7-db8f68e68f33-kube-api-access-wwx46\") pod \"coredns-668d6bf9bc-cpcqx\" (UID: \"4ae8b6c3-7961-411a-b8a7-db8f68e68f33\") " pod="kube-system/coredns-668d6bf9bc-cpcqx" Mar 25 01:32:00.519191 systemd[1]: Created slice kubepods-burstable-pod9cd52328_d1d7_42d0_962c_7cd987215243.slice - libcontainer container kubepods-burstable-pod9cd52328_d1d7_42d0_962c_7cd987215243.slice. Mar 25 01:32:00.554894 systemd[1]: Created slice kubepods-besteffort-poddadf09ef_70e2_4836_a7ba_e5b0a0c652dd.slice - libcontainer container kubepods-besteffort-poddadf09ef_70e2_4836_a7ba_e5b0a0c652dd.slice. Mar 25 01:32:00.575957 systemd[1]: Created slice kubepods-besteffort-podcd448025_947d_4e3d_9cb5_a93ca6dbbf1e.slice - libcontainer container kubepods-besteffort-podcd448025_947d_4e3d_9cb5_a93ca6dbbf1e.slice. 
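
Each VerifyControllerAttachedVolume line above identifies a volume by a UniqueName of the form kubernetes.io/<plugin>/<pod UID>-<volume name>. A small helper, hypothetical and only meant to make that structure visible, which splits the names exactly as they appear in these entries:

    def split_unique_name(unique_name):
        """Split a UniqueName like kubernetes.io/projected/<pod-uid>-kube-api-access-dxtf5."""
        plugin, rest = unique_name.rsplit("/", 1)
        pod_uid, volume = rest[:36], rest[37:]      # a pod UID is a 36-character UUID
        return plugin, pod_uid, volume

    print(split_unique_name(
        "kubernetes.io/projected/e21ef940-4b30-4e3d-ac52-96c893706884-kube-api-access-dxtf5"))
    # ('kubernetes.io/projected', 'e21ef940-4b30-4e3d-ac52-96c893706884', 'kube-api-access-dxtf5')
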
Mar 25 01:32:00.588935 kubelet[3424]: W0325 01:32:00.588818 3424 reflector.go:569] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ip-172-31-28-242" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ip-172-31-28-242' and this object Mar 25 01:32:00.588935 kubelet[3424]: E0325 01:32:00.588879 3424 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:ip-172-31-28-242\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ip-172-31-28-242' and this object" logger="UnhandledError" Mar 25 01:32:00.599076 kubelet[3424]: I0325 01:32:00.598924 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x87xq\" (UniqueName: \"kubernetes.io/projected/dadf09ef-70e2-4836-a7ba-e5b0a0c652dd-kube-api-access-x87xq\") pod \"calico-apiserver-756bdfdf4c-trthd\" (UID: \"dadf09ef-70e2-4836-a7ba-e5b0a0c652dd\") " pod="calico-apiserver/calico-apiserver-756bdfdf4c-trthd" Mar 25 01:32:00.599346 kubelet[3424]: I0325 01:32:00.599022 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/cd448025-947d-4e3d-9cb5-a93ca6dbbf1e-calico-apiserver-certs\") pod \"calico-apiserver-756bdfdf4c-tf9gh\" (UID: \"cd448025-947d-4e3d-9cb5-a93ca6dbbf1e\") " pod="calico-apiserver/calico-apiserver-756bdfdf4c-tf9gh" Mar 25 01:32:00.599719 kubelet[3424]: I0325 01:32:00.599583 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/dadf09ef-70e2-4836-a7ba-e5b0a0c652dd-calico-apiserver-certs\") pod \"calico-apiserver-756bdfdf4c-trthd\" (UID: \"dadf09ef-70e2-4836-a7ba-e5b0a0c652dd\") " pod="calico-apiserver/calico-apiserver-756bdfdf4c-trthd" Mar 25 01:32:00.599719 kubelet[3424]: I0325 01:32:00.599659 3424 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s22m\" (UniqueName: \"kubernetes.io/projected/cd448025-947d-4e3d-9cb5-a93ca6dbbf1e-kube-api-access-5s22m\") pod \"calico-apiserver-756bdfdf4c-tf9gh\" (UID: \"cd448025-947d-4e3d-9cb5-a93ca6dbbf1e\") " pod="calico-apiserver/calico-apiserver-756bdfdf4c-tf9gh" Mar 25 01:32:00.808311 containerd[1946]: time="2025-03-25T01:32:00.808076499Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-755766d89b-p6jm5,Uid:e21ef940-4b30-4e3d-ac52-96c893706884,Namespace:calico-system,Attempt:0,}" Mar 25 01:32:01.056590 containerd[1946]: time="2025-03-25T01:32:01.056461781Z" level=error msg="Failed to destroy network for sandbox \"084fad2d5b903eaa1246ca193e067e01cda488fb2cfe9a256a4cd506493f460e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:01.144823 containerd[1946]: time="2025-03-25T01:32:01.144314101Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-755766d89b-p6jm5,Uid:e21ef940-4b30-4e3d-ac52-96c893706884,Namespace:calico-system,Attempt:0,} failed, error" 
error="rpc error: code = Unknown desc = failed to setup network for sandbox \"084fad2d5b903eaa1246ca193e067e01cda488fb2cfe9a256a4cd506493f460e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:01.145008 kubelet[3424]: E0325 01:32:01.144714 3424 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"084fad2d5b903eaa1246ca193e067e01cda488fb2cfe9a256a4cd506493f460e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:01.145108 kubelet[3424]: E0325 01:32:01.145025 3424 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"084fad2d5b903eaa1246ca193e067e01cda488fb2cfe9a256a4cd506493f460e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-755766d89b-p6jm5" Mar 25 01:32:01.145108 kubelet[3424]: E0325 01:32:01.145069 3424 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"084fad2d5b903eaa1246ca193e067e01cda488fb2cfe9a256a4cd506493f460e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-755766d89b-p6jm5" Mar 25 01:32:01.145283 kubelet[3424]: E0325 01:32:01.145144 3424 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-755766d89b-p6jm5_calico-system(e21ef940-4b30-4e3d-ac52-96c893706884)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-755766d89b-p6jm5_calico-system(e21ef940-4b30-4e3d-ac52-96c893706884)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"084fad2d5b903eaa1246ca193e067e01cda488fb2cfe9a256a4cd506493f460e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-755766d89b-p6jm5" podUID="e21ef940-4b30-4e3d-ac52-96c893706884" Mar 25 01:32:01.600771 kubelet[3424]: E0325 01:32:01.600345 3424 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition Mar 25 01:32:01.600771 kubelet[3424]: E0325 01:32:01.600469 3424 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9cd52328-d1d7-42d0-962c-7cd987215243-config-volume podName:9cd52328-d1d7-42d0-962c-7cd987215243 nodeName:}" failed. No retries permitted until 2025-03-25 01:32:02.100424681 +0000 UTC m=+32.996857173 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/9cd52328-d1d7-42d0-962c-7cd987215243-config-volume") pod "coredns-668d6bf9bc-tp7ll" (UID: "9cd52328-d1d7-42d0-962c-7cd987215243") : failed to sync configmap cache: timed out waiting for the condition Mar 25 01:32:01.600771 kubelet[3424]: E0325 01:32:01.600646 3424 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition Mar 25 01:32:01.600771 kubelet[3424]: E0325 01:32:01.600711 3424 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4ae8b6c3-7961-411a-b8a7-db8f68e68f33-config-volume podName:4ae8b6c3-7961-411a-b8a7-db8f68e68f33 nodeName:}" failed. No retries permitted until 2025-03-25 01:32:02.100688861 +0000 UTC m=+32.997121341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/4ae8b6c3-7961-411a-b8a7-db8f68e68f33-config-volume") pod "coredns-668d6bf9bc-cpcqx" (UID: "4ae8b6c3-7961-411a-b8a7-db8f68e68f33") : failed to sync configmap cache: timed out waiting for the condition Mar 25 01:32:01.703248 kubelet[3424]: E0325 01:32:01.702603 3424 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Mar 25 01:32:01.703248 kubelet[3424]: E0325 01:32:01.702709 3424 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd448025-947d-4e3d-9cb5-a93ca6dbbf1e-calico-apiserver-certs podName:cd448025-947d-4e3d-9cb5-a93ca6dbbf1e nodeName:}" failed. No retries permitted until 2025-03-25 01:32:02.20268399 +0000 UTC m=+33.099116470 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/cd448025-947d-4e3d-9cb5-a93ca6dbbf1e-calico-apiserver-certs") pod "calico-apiserver-756bdfdf4c-tf9gh" (UID: "cd448025-947d-4e3d-9cb5-a93ca6dbbf1e") : failed to sync secret cache: timed out waiting for the condition Mar 25 01:32:01.703248 kubelet[3424]: E0325 01:32:01.702967 3424 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Mar 25 01:32:01.703248 kubelet[3424]: E0325 01:32:01.703022 3424 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dadf09ef-70e2-4836-a7ba-e5b0a0c652dd-calico-apiserver-certs podName:dadf09ef-70e2-4836-a7ba-e5b0a0c652dd nodeName:}" failed. No retries permitted until 2025-03-25 01:32:02.203002035 +0000 UTC m=+33.099434527 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/dadf09ef-70e2-4836-a7ba-e5b0a0c652dd-calico-apiserver-certs") pod "calico-apiserver-756bdfdf4c-trthd" (UID: "dadf09ef-70e2-4836-a7ba-e5b0a0c652dd") : failed to sync secret cache: timed out waiting for the condition Mar 25 01:32:02.287232 containerd[1946]: time="2025-03-25T01:32:02.287021345Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-cpcqx,Uid:4ae8b6c3-7961-411a-b8a7-db8f68e68f33,Namespace:kube-system,Attempt:0,}" Mar 25 01:32:02.328826 containerd[1946]: time="2025-03-25T01:32:02.328394492Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-tp7ll,Uid:9cd52328-d1d7-42d0-962c-7cd987215243,Namespace:kube-system,Attempt:0,}" Mar 25 01:32:02.368264 systemd[1]: Created slice kubepods-besteffort-pod7176352e_f194_4c71_92fb_e7f6c7227404.slice - libcontainer container kubepods-besteffort-pod7176352e_f194_4c71_92fb_e7f6c7227404.slice. Mar 25 01:32:02.372968 containerd[1946]: time="2025-03-25T01:32:02.372872610Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-756bdfdf4c-trthd,Uid:dadf09ef-70e2-4836-a7ba-e5b0a0c652dd,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:32:02.374971 containerd[1946]: time="2025-03-25T01:32:02.374904622Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-wsw9h,Uid:7176352e-f194-4c71-92fb-e7f6c7227404,Namespace:calico-system,Attempt:0,}" Mar 25 01:32:02.389598 containerd[1946]: time="2025-03-25T01:32:02.389544090Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-756bdfdf4c-tf9gh,Uid:cd448025-947d-4e3d-9cb5-a93ca6dbbf1e,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:32:02.566536 containerd[1946]: time="2025-03-25T01:32:02.566299765Z" level=error msg="Failed to destroy network for sandbox \"c72b5bb497d2654966473a246f9a276a2d41ba02afff66121999144acc82c593\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:02.572842 systemd[1]: run-netns-cni\x2dd7d6166e\x2d1271\x2d0948\x2df93c\x2dc6be238e3920.mount: Deactivated successfully. 
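
The MountVolume.SetUp failures above are retried on a delay: each entry notes "No retries permitted until ... (durationBeforeRetry 500ms)", i.e. the retry time is the failure time plus 500ms, and kubelet backs the delay off further if the same operation keeps failing. A rough sketch of that behaviour; the doubling and the cap below are assumptions, only the 500ms initial delay comes from this log:

    from datetime import datetime, timedelta

    INITIAL_DELAY = timedelta(milliseconds=500)     # matches "durationBeforeRetry 500ms"
    MAX_DELAY = timedelta(minutes=2, seconds=2)     # assumed cap, not taken from this log

    def next_retry(failed_at, previous_delay=None):
        delay = INITIAL_DELAY if previous_delay is None else min(previous_delay * 2, MAX_DELAY)
        return failed_at + delay, delay

    failed_at = datetime.fromisoformat("2025-03-25 01:32:01.703002")
    retry_at, delay = next_retry(failed_at)
    print(retry_at, delay)   # 2025-03-25 01:32:02.203002 0:00:00.500000, as in the log entry
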
Mar 25 01:32:02.574433 containerd[1946]: time="2025-03-25T01:32:02.573790905Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-cpcqx,Uid:4ae8b6c3-7961-411a-b8a7-db8f68e68f33,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c72b5bb497d2654966473a246f9a276a2d41ba02afff66121999144acc82c593\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:02.574838 kubelet[3424]: E0325 01:32:02.574143 3424 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c72b5bb497d2654966473a246f9a276a2d41ba02afff66121999144acc82c593\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:02.576738 kubelet[3424]: E0325 01:32:02.575335 3424 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c72b5bb497d2654966473a246f9a276a2d41ba02afff66121999144acc82c593\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-cpcqx" Mar 25 01:32:02.576738 kubelet[3424]: E0325 01:32:02.575416 3424 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c72b5bb497d2654966473a246f9a276a2d41ba02afff66121999144acc82c593\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-cpcqx" Mar 25 01:32:02.576738 kubelet[3424]: E0325 01:32:02.575523 3424 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-cpcqx_kube-system(4ae8b6c3-7961-411a-b8a7-db8f68e68f33)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-cpcqx_kube-system(4ae8b6c3-7961-411a-b8a7-db8f68e68f33)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c72b5bb497d2654966473a246f9a276a2d41ba02afff66121999144acc82c593\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-cpcqx" podUID="4ae8b6c3-7961-411a-b8a7-db8f68e68f33" Mar 25 01:32:02.647142 containerd[1946]: time="2025-03-25T01:32:02.646868153Z" level=error msg="Failed to destroy network for sandbox \"72bac6c7e08fee27f2ff49747f1f8bbd14d87451ec0f2f41fbf94263a1d15155\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:02.654627 systemd[1]: run-netns-cni\x2d86eb5ef4\x2d86ac\x2d8c72\x2d3fa0\x2d1cc59fa9440e.mount: Deactivated successfully. 
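
All of the RunPodSandbox failures in this stretch share one cause: the Calico CNI plugin stats /var/lib/calico/nodename before it will wire up a pod, and that file only appears once the calico/node container (whose image is still being pulled at this point) has started and mounted /var/lib/calico/. A readiness check along those lines, purely as an illustration:

    import os

    NODENAME_FILE = "/var/lib/calico/nodename"   # written by calico-node when it starts up

    def calico_dataplane_ready():
        """Mirror the check behind 'stat /var/lib/calico/nodename: no such file or directory'."""
        try:
            with open(NODENAME_FILE) as f:
                return f.read().strip()           # the Calico node name this host registered as
        except FileNotFoundError:
            return None                           # sandbox setup keeps failing until this appears

    print(calico_dataplane_ready() or "calico/node not running yet")
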
Mar 25 01:32:02.659256 containerd[1946]: time="2025-03-25T01:32:02.658404162Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-tp7ll,Uid:9cd52328-d1d7-42d0-962c-7cd987215243,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"72bac6c7e08fee27f2ff49747f1f8bbd14d87451ec0f2f41fbf94263a1d15155\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:02.664784 kubelet[3424]: E0325 01:32:02.664387 3424 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72bac6c7e08fee27f2ff49747f1f8bbd14d87451ec0f2f41fbf94263a1d15155\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:02.664784 kubelet[3424]: E0325 01:32:02.664464 3424 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72bac6c7e08fee27f2ff49747f1f8bbd14d87451ec0f2f41fbf94263a1d15155\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-tp7ll" Mar 25 01:32:02.664784 kubelet[3424]: E0325 01:32:02.664497 3424 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72bac6c7e08fee27f2ff49747f1f8bbd14d87451ec0f2f41fbf94263a1d15155\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-tp7ll" Mar 25 01:32:02.665893 kubelet[3424]: E0325 01:32:02.664556 3424 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-tp7ll_kube-system(9cd52328-d1d7-42d0-962c-7cd987215243)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-tp7ll_kube-system(9cd52328-d1d7-42d0-962c-7cd987215243)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"72bac6c7e08fee27f2ff49747f1f8bbd14d87451ec0f2f41fbf94263a1d15155\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-tp7ll" podUID="9cd52328-d1d7-42d0-962c-7cd987215243" Mar 25 01:32:02.666050 containerd[1946]: time="2025-03-25T01:32:02.665244532Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 25 01:32:02.714822 containerd[1946]: time="2025-03-25T01:32:02.714076844Z" level=error msg="Failed to destroy network for sandbox \"60e696ac822b9fea6152f0794301dbc81d708e6b53b9f3f200169d72b16066a5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:02.721865 containerd[1946]: time="2025-03-25T01:32:02.720330887Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-wsw9h,Uid:7176352e-f194-4c71-92fb-e7f6c7227404,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"60e696ac822b9fea6152f0794301dbc81d708e6b53b9f3f200169d72b16066a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:02.724976 kubelet[3424]: E0325 01:32:02.724918 3424 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60e696ac822b9fea6152f0794301dbc81d708e6b53b9f3f200169d72b16066a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:02.725579 kubelet[3424]: E0325 01:32:02.725533 3424 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60e696ac822b9fea6152f0794301dbc81d708e6b53b9f3f200169d72b16066a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-wsw9h" Mar 25 01:32:02.727683 kubelet[3424]: E0325 01:32:02.725763 3424 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60e696ac822b9fea6152f0794301dbc81d708e6b53b9f3f200169d72b16066a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-wsw9h" Mar 25 01:32:02.727683 kubelet[3424]: E0325 01:32:02.725982 3424 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-wsw9h_calico-system(7176352e-f194-4c71-92fb-e7f6c7227404)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-wsw9h_calico-system(7176352e-f194-4c71-92fb-e7f6c7227404)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"60e696ac822b9fea6152f0794301dbc81d708e6b53b9f3f200169d72b16066a5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-wsw9h" podUID="7176352e-f194-4c71-92fb-e7f6c7227404" Mar 25 01:32:02.750631 containerd[1946]: time="2025-03-25T01:32:02.750555647Z" level=error msg="Failed to destroy network for sandbox \"d990a357a6f85b4e6e8a1031ab1baa1e8e22a52ec883f2ff9c9a12f055507bdd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:02.754170 containerd[1946]: time="2025-03-25T01:32:02.753636006Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-756bdfdf4c-trthd,Uid:dadf09ef-70e2-4836-a7ba-e5b0a0c652dd,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d990a357a6f85b4e6e8a1031ab1baa1e8e22a52ec883f2ff9c9a12f055507bdd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:02.754949 kubelet[3424]: E0325 01:32:02.754874 3424 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d990a357a6f85b4e6e8a1031ab1baa1e8e22a52ec883f2ff9c9a12f055507bdd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:02.755136 kubelet[3424]: E0325 01:32:02.754971 3424 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d990a357a6f85b4e6e8a1031ab1baa1e8e22a52ec883f2ff9c9a12f055507bdd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-756bdfdf4c-trthd" Mar 25 01:32:02.755136 kubelet[3424]: E0325 01:32:02.755009 3424 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d990a357a6f85b4e6e8a1031ab1baa1e8e22a52ec883f2ff9c9a12f055507bdd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-756bdfdf4c-trthd" Mar 25 01:32:02.755136 kubelet[3424]: E0325 01:32:02.755090 3424 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-756bdfdf4c-trthd_calico-apiserver(dadf09ef-70e2-4836-a7ba-e5b0a0c652dd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-756bdfdf4c-trthd_calico-apiserver(dadf09ef-70e2-4836-a7ba-e5b0a0c652dd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d990a357a6f85b4e6e8a1031ab1baa1e8e22a52ec883f2ff9c9a12f055507bdd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-756bdfdf4c-trthd" podUID="dadf09ef-70e2-4836-a7ba-e5b0a0c652dd" Mar 25 01:32:02.757424 containerd[1946]: time="2025-03-25T01:32:02.757342308Z" level=error msg="Failed to destroy network for sandbox \"e49ef67eae37779b5a01bd283a47a8a4478d174a4591c7bf936ae2c1c4da3c96\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:02.760561 containerd[1946]: time="2025-03-25T01:32:02.760140101Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-756bdfdf4c-tf9gh,Uid:cd448025-947d-4e3d-9cb5-a93ca6dbbf1e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e49ef67eae37779b5a01bd283a47a8a4478d174a4591c7bf936ae2c1c4da3c96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:02.761891 kubelet[3424]: E0325 01:32:02.761628 3424 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"e49ef67eae37779b5a01bd283a47a8a4478d174a4591c7bf936ae2c1c4da3c96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:32:02.761891 kubelet[3424]: E0325 01:32:02.761722 3424 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e49ef67eae37779b5a01bd283a47a8a4478d174a4591c7bf936ae2c1c4da3c96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-756bdfdf4c-tf9gh" Mar 25 01:32:02.761891 kubelet[3424]: E0325 01:32:02.761759 3424 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e49ef67eae37779b5a01bd283a47a8a4478d174a4591c7bf936ae2c1c4da3c96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-756bdfdf4c-tf9gh" Mar 25 01:32:02.763352 kubelet[3424]: E0325 01:32:02.761821 3424 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-756bdfdf4c-tf9gh_calico-apiserver(cd448025-947d-4e3d-9cb5-a93ca6dbbf1e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-756bdfdf4c-tf9gh_calico-apiserver(cd448025-947d-4e3d-9cb5-a93ca6dbbf1e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e49ef67eae37779b5a01bd283a47a8a4478d174a4591c7bf936ae2c1c4da3c96\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-756bdfdf4c-tf9gh" podUID="cd448025-947d-4e3d-9cb5-a93ca6dbbf1e" Mar 25 01:32:03.385394 systemd[1]: run-netns-cni\x2ddb4f794c\x2d193a\x2dc127\x2de5fe\x2d36eef5e1a776.mount: Deactivated successfully. Mar 25 01:32:03.385582 systemd[1]: run-netns-cni\x2df85a91da\x2dfd93\x2d519c\x2d8f8e\x2d65283b43ecab.mount: Deactivated successfully. Mar 25 01:32:03.385747 systemd[1]: run-netns-cni\x2d478386db\x2dc5ab\x2d6458\x2d65ae\x2d2b143ccd592a.mount: Deactivated successfully. Mar 25 01:32:08.792741 kubelet[3424]: I0325 01:32:08.791277 3424 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:32:10.284570 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1335440284.mount: Deactivated successfully. 
Mar 25 01:32:10.346459 containerd[1946]: time="2025-03-25T01:32:10.346371306Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:10.348572 containerd[1946]: time="2025-03-25T01:32:10.348350616Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=137086024" Mar 25 01:32:10.349614 containerd[1946]: time="2025-03-25T01:32:10.349532733Z" level=info msg="ImageCreate event name:\"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:10.353037 containerd[1946]: time="2025-03-25T01:32:10.352961350Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:10.354552 containerd[1946]: time="2025-03-25T01:32:10.354338765Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"137085886\" in 7.689027282s" Mar 25 01:32:10.354552 containerd[1946]: time="2025-03-25T01:32:10.354402417Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\"" Mar 25 01:32:10.388216 containerd[1946]: time="2025-03-25T01:32:10.385530037Z" level=info msg="CreateContainer within sandbox \"a8d68082db0c0c83e7df118d833a03f0fcce548bef1ca862dbe63a8052b19379\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 25 01:32:10.402582 containerd[1946]: time="2025-03-25T01:32:10.402523124Z" level=info msg="Container 052a4d3fa764d66533a2a935fb4ec4f8a8b72b4b3d81c172bdd2540d630d54e0: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:32:10.436658 containerd[1946]: time="2025-03-25T01:32:10.435782978Z" level=info msg="CreateContainer within sandbox \"a8d68082db0c0c83e7df118d833a03f0fcce548bef1ca862dbe63a8052b19379\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"052a4d3fa764d66533a2a935fb4ec4f8a8b72b4b3d81c172bdd2540d630d54e0\"" Mar 25 01:32:10.444804 containerd[1946]: time="2025-03-25T01:32:10.444749105Z" level=info msg="StartContainer for \"052a4d3fa764d66533a2a935fb4ec4f8a8b72b4b3d81c172bdd2540d630d54e0\"" Mar 25 01:32:10.463646 containerd[1946]: time="2025-03-25T01:32:10.463465142Z" level=info msg="connecting to shim 052a4d3fa764d66533a2a935fb4ec4f8a8b72b4b3d81c172bdd2540d630d54e0" address="unix:///run/containerd/s/01a4e72a53269a09dd05d8d0fb362354aa471a3955bda8dd34f57d432bb949cd" protocol=ttrpc version=3 Mar 25 01:32:10.505699 systemd[1]: Started sshd@7-172.31.28.242:22-147.75.109.163:37940.service - OpenSSH per-connection server daemon (147.75.109.163:37940). Mar 25 01:32:10.656549 systemd[1]: Started cri-containerd-052a4d3fa764d66533a2a935fb4ec4f8a8b72b4b3d81c172bdd2540d630d54e0.scope - libcontainer container 052a4d3fa764d66533a2a935fb4ec4f8a8b72b4b3d81c172bdd2540d630d54e0. 
Mar 25 01:32:10.781314 containerd[1946]: time="2025-03-25T01:32:10.781246893Z" level=info msg="StartContainer for \"052a4d3fa764d66533a2a935fb4ec4f8a8b72b4b3d81c172bdd2540d630d54e0\" returns successfully" Mar 25 01:32:10.785778 sshd[4374]: Accepted publickey for core from 147.75.109.163 port 37940 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:32:10.792945 sshd-session[4374]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:10.810905 systemd-logind[1929]: New session 8 of user core. Mar 25 01:32:10.816921 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 25 01:32:10.925781 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 25 01:32:10.925970 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Mar 25 01:32:11.156339 sshd[4407]: Connection closed by 147.75.109.163 port 37940 Mar 25 01:32:11.157317 sshd-session[4374]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:11.162996 systemd-logind[1929]: Session 8 logged out. Waiting for processes to exit. Mar 25 01:32:11.166992 systemd[1]: sshd@7-172.31.28.242:22-147.75.109.163:37940.service: Deactivated successfully. Mar 25 01:32:11.174578 systemd[1]: session-8.scope: Deactivated successfully. Mar 25 01:32:11.179966 systemd-logind[1929]: Removed session 8. Mar 25 01:32:11.791225 kubelet[3424]: I0325 01:32:11.791047 3424 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-6cfvs" podStartSLOduration=3.163094821 podStartE2EDuration="23.791015879s" podCreationTimestamp="2025-03-25 01:31:48 +0000 UTC" firstStartedPulling="2025-03-25 01:31:49.728722138 +0000 UTC m=+20.625154630" lastFinishedPulling="2025-03-25 01:32:10.356643184 +0000 UTC m=+41.253075688" observedRunningTime="2025-03-25 01:32:11.786111208 +0000 UTC m=+42.682543736" watchObservedRunningTime="2025-03-25 01:32:11.791015879 +0000 UTC m=+42.687448371" Mar 25 01:32:11.877214 containerd[1946]: time="2025-03-25T01:32:11.877096769Z" level=info msg="TaskExit event in podsandbox handler container_id:\"052a4d3fa764d66533a2a935fb4ec4f8a8b72b4b3d81c172bdd2540d630d54e0\" id:\"e8e8873246f83d33b683a01c6ef39b6511f1cb37061789e42d624f836f0cf58a\" pid:4464 exit_status:1 exited_at:{seconds:1742866331 nanos:876168853}" Mar 25 01:32:13.305161 containerd[1946]: time="2025-03-25T01:32:13.305085946Z" level=info msg="TaskExit event in podsandbox handler container_id:\"052a4d3fa764d66533a2a935fb4ec4f8a8b72b4b3d81c172bdd2540d630d54e0\" id:\"f3e8ac207b1d51ef17758b04ab17aaaeb3795f24ac60fdae70ea4d1f91110801\" pid:4574 exit_status:1 exited_at:{seconds:1742866333 nanos:303909334}" Mar 25 01:32:13.350479 containerd[1946]: time="2025-03-25T01:32:13.348759062Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-755766d89b-p6jm5,Uid:e21ef940-4b30-4e3d-ac52-96c893706884,Namespace:calico-system,Attempt:0,}" Mar 25 01:32:13.350479 containerd[1946]: time="2025-03-25T01:32:13.349231506Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-wsw9h,Uid:7176352e-f194-4c71-92fb-e7f6c7227404,Namespace:calico-system,Attempt:0,}" Mar 25 01:32:13.397230 kernel: bpftool[4632]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 25 01:32:13.804318 (udev-worker)[4413]: Network interface NamePolicy= disabled on kernel command line. 
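
The "Observed pod startup duration" entry for calico-node-6cfvs reports two figures: podStartE2EDuration is the full time from pod creation to the pod being observed running, while podStartSLOduration is that same interval with the image-pull time taken out. The log's own numbers reproduce this, using the monotonic m=+ offsets kubelet prints alongside each timestamp:

    # Numbers from the "Observed pod startup duration" entry for calico-node-6cfvs.
    e2e_duration       = 23.791015879                 # podStartE2EDuration (seconds)
    first_pull_started = 20.625154630                 # firstStartedPulling, m=+...
    last_pull_finished = 41.253075688                 # lastFinishedPulling, m=+...

    slo_duration = e2e_duration - (last_pull_finished - first_pull_started)
    print(round(slo_duration, 9))                     # 3.163094821, the logged podStartSLOduration
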
Mar 25 01:32:13.804549 systemd-networkd[1793]: cali3d6662d496c: Link UP Mar 25 01:32:13.805859 systemd-networkd[1793]: cali3d6662d496c: Gained carrier Mar 25 01:32:13.856428 containerd[1946]: 2025-03-25 01:32:13.572 [INFO][4619] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--28--242-k8s-calico--kube--controllers--755766d89b--p6jm5-eth0 calico-kube-controllers-755766d89b- calico-system e21ef940-4b30-4e3d-ac52-96c893706884 719 0 2025-03-25 01:31:48 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:755766d89b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ip-172-31-28-242 calico-kube-controllers-755766d89b-p6jm5 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali3d6662d496c [] []}} ContainerID="88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" Namespace="calico-system" Pod="calico-kube-controllers-755766d89b-p6jm5" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--kube--controllers--755766d89b--p6jm5-" Mar 25 01:32:13.856428 containerd[1946]: 2025-03-25 01:32:13.572 [INFO][4619] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" Namespace="calico-system" Pod="calico-kube-controllers-755766d89b-p6jm5" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--kube--controllers--755766d89b--p6jm5-eth0" Mar 25 01:32:13.856428 containerd[1946]: 2025-03-25 01:32:13.693 [INFO][4653] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" HandleID="k8s-pod-network.88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" Workload="ip--172--31--28--242-k8s-calico--kube--controllers--755766d89b--p6jm5-eth0" Mar 25 01:32:13.856808 containerd[1946]: 2025-03-25 01:32:13.720 [INFO][4653] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" HandleID="k8s-pod-network.88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" Workload="ip--172--31--28--242-k8s-calico--kube--controllers--755766d89b--p6jm5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000102830), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-28-242", "pod":"calico-kube-controllers-755766d89b-p6jm5", "timestamp":"2025-03-25 01:32:13.693451943 +0000 UTC"}, Hostname:"ip-172-31-28-242", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:32:13.856808 containerd[1946]: 2025-03-25 01:32:13.720 [INFO][4653] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:32:13.856808 containerd[1946]: 2025-03-25 01:32:13.721 [INFO][4653] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
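
The WorkloadEndpoint names in these CNI entries, such as ip--172--31--28--242-k8s-calico--kube--controllers--755766d89b--p6jm5-eth0, encode the node name, pod name, and interface, with hyphens inside each component doubled so the single-hyphen separators stay unambiguous. A sketch that reproduces the names seen here (the real libcalico naming code may handle more cases):

    def workload_endpoint_name(node, pod, iface="eth0"):
        """Rebuild WorkloadEndpoint names as they appear in the CNI log lines above."""
        def esc(s):
            return s.replace("-", "--")   # double inner hyphens; plain '-' stays the separator
        return f"{esc(node)}-k8s-{esc(pod)}-{iface}"

    print(workload_endpoint_name("ip-172-31-28-242", "calico-kube-controllers-755766d89b-p6jm5"))
    # ip--172--31--28--242-k8s-calico--kube--controllers--755766d89b--p6jm5-eth0
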
Mar 25 01:32:13.856808 containerd[1946]: 2025-03-25 01:32:13.721 [INFO][4653] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-28-242' Mar 25 01:32:13.856808 containerd[1946]: 2025-03-25 01:32:13.726 [INFO][4653] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" host="ip-172-31-28-242" Mar 25 01:32:13.856808 containerd[1946]: 2025-03-25 01:32:13.735 [INFO][4653] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-28-242" Mar 25 01:32:13.856808 containerd[1946]: 2025-03-25 01:32:13.745 [INFO][4653] ipam/ipam.go 489: Trying affinity for 192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:13.856808 containerd[1946]: 2025-03-25 01:32:13.748 [INFO][4653] ipam/ipam.go 155: Attempting to load block cidr=192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:13.856808 containerd[1946]: 2025-03-25 01:32:13.755 [INFO][4653] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:13.857375 containerd[1946]: 2025-03-25 01:32:13.755 [INFO][4653] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.6.64/26 handle="k8s-pod-network.88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" host="ip-172-31-28-242" Mar 25 01:32:13.857375 containerd[1946]: 2025-03-25 01:32:13.759 [INFO][4653] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3 Mar 25 01:32:13.857375 containerd[1946]: 2025-03-25 01:32:13.768 [INFO][4653] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.6.64/26 handle="k8s-pod-network.88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" host="ip-172-31-28-242" Mar 25 01:32:13.857375 containerd[1946]: 2025-03-25 01:32:13.778 [INFO][4653] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.6.65/26] block=192.168.6.64/26 handle="k8s-pod-network.88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" host="ip-172-31-28-242" Mar 25 01:32:13.857375 containerd[1946]: 2025-03-25 01:32:13.778 [INFO][4653] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.6.65/26] handle="k8s-pod-network.88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" host="ip-172-31-28-242" Mar 25 01:32:13.857375 containerd[1946]: 2025-03-25 01:32:13.778 [INFO][4653] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
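
The IPAM walk above (look up the host's block affinity, load block 192.168.6.64/26, claim the next free address) hands 192.168.6.65 to calico-kube-controllers here, and just below the same block yields 192.168.6.66 for csi-node-driver-wsw9h. The /26 itself is easy to inspect with Python's ipaddress module; Calico's allocator works through the block's free addresses rather than the classic hosts() view, so this is only for orientation:

    import ipaddress

    block = ipaddress.ip_network("192.168.6.64/26")   # the node-affine block from the IPAM log
    print(block.num_addresses)                        # 64 addresses in the block
    print(list(block.hosts())[:2])                    # [IPv4Address('192.168.6.65'), IPv4Address('192.168.6.66')]
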
Mar 25 01:32:13.857375 containerd[1946]: 2025-03-25 01:32:13.778 [INFO][4653] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.6.65/26] IPv6=[] ContainerID="88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" HandleID="k8s-pod-network.88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" Workload="ip--172--31--28--242-k8s-calico--kube--controllers--755766d89b--p6jm5-eth0" Mar 25 01:32:13.857732 containerd[1946]: 2025-03-25 01:32:13.786 [INFO][4619] cni-plugin/k8s.go 386: Populated endpoint ContainerID="88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" Namespace="calico-system" Pod="calico-kube-controllers-755766d89b-p6jm5" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--kube--controllers--755766d89b--p6jm5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--242-k8s-calico--kube--controllers--755766d89b--p6jm5-eth0", GenerateName:"calico-kube-controllers-755766d89b-", Namespace:"calico-system", SelfLink:"", UID:"e21ef940-4b30-4e3d-ac52-96c893706884", ResourceVersion:"719", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 31, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"755766d89b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-242", ContainerID:"", Pod:"calico-kube-controllers-755766d89b-p6jm5", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.6.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3d6662d496c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:32:13.857920 containerd[1946]: 2025-03-25 01:32:13.787 [INFO][4619] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.6.65/32] ContainerID="88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" Namespace="calico-system" Pod="calico-kube-controllers-755766d89b-p6jm5" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--kube--controllers--755766d89b--p6jm5-eth0" Mar 25 01:32:13.857920 containerd[1946]: 2025-03-25 01:32:13.787 [INFO][4619] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3d6662d496c ContainerID="88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" Namespace="calico-system" Pod="calico-kube-controllers-755766d89b-p6jm5" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--kube--controllers--755766d89b--p6jm5-eth0" Mar 25 01:32:13.857920 containerd[1946]: 2025-03-25 01:32:13.805 [INFO][4619] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" Namespace="calico-system" Pod="calico-kube-controllers-755766d89b-p6jm5" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--kube--controllers--755766d89b--p6jm5-eth0" Mar 25 01:32:13.861016 containerd[1946]: 2025-03-25 01:32:13.808 [INFO][4619] 
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" Namespace="calico-system" Pod="calico-kube-controllers-755766d89b-p6jm5" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--kube--controllers--755766d89b--p6jm5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--242-k8s-calico--kube--controllers--755766d89b--p6jm5-eth0", GenerateName:"calico-kube-controllers-755766d89b-", Namespace:"calico-system", SelfLink:"", UID:"e21ef940-4b30-4e3d-ac52-96c893706884", ResourceVersion:"719", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 31, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"755766d89b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-242", ContainerID:"88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3", Pod:"calico-kube-controllers-755766d89b-p6jm5", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.6.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3d6662d496c", MAC:"c2:51:ee:aa:c3:3d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:32:13.863156 containerd[1946]: 2025-03-25 01:32:13.844 [INFO][4619] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" Namespace="calico-system" Pod="calico-kube-controllers-755766d89b-p6jm5" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--kube--controllers--755766d89b--p6jm5-eth0" Mar 25 01:32:13.941287 systemd-networkd[1793]: cali0f78a08235f: Link UP Mar 25 01:32:13.946952 systemd-networkd[1793]: cali0f78a08235f: Gained carrier Mar 25 01:32:14.004620 containerd[1946]: 2025-03-25 01:32:13.571 [INFO][4622] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--28--242-k8s-csi--node--driver--wsw9h-eth0 csi-node-driver- calico-system 7176352e-f194-4c71-92fb-e7f6c7227404 636 0 2025-03-25 01:31:48 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:54877d75d5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ip-172-31-28-242 csi-node-driver-wsw9h eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali0f78a08235f [] []}} ContainerID="9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" Namespace="calico-system" Pod="csi-node-driver-wsw9h" WorkloadEndpoint="ip--172--31--28--242-k8s-csi--node--driver--wsw9h-" Mar 25 01:32:14.004620 containerd[1946]: 2025-03-25 01:32:13.572 [INFO][4622] cni-plugin/k8s.go 77: Extracted identifiers 
for CmdAddK8s ContainerID="9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" Namespace="calico-system" Pod="csi-node-driver-wsw9h" WorkloadEndpoint="ip--172--31--28--242-k8s-csi--node--driver--wsw9h-eth0" Mar 25 01:32:14.004620 containerd[1946]: 2025-03-25 01:32:13.697 [INFO][4647] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" HandleID="k8s-pod-network.9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" Workload="ip--172--31--28--242-k8s-csi--node--driver--wsw9h-eth0" Mar 25 01:32:14.005051 containerd[1946]: 2025-03-25 01:32:13.728 [INFO][4647] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" HandleID="k8s-pod-network.9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" Workload="ip--172--31--28--242-k8s-csi--node--driver--wsw9h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c8d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-28-242", "pod":"csi-node-driver-wsw9h", "timestamp":"2025-03-25 01:32:13.697530646 +0000 UTC"}, Hostname:"ip-172-31-28-242", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:32:14.005051 containerd[1946]: 2025-03-25 01:32:13.728 [INFO][4647] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:32:14.005051 containerd[1946]: 2025-03-25 01:32:13.779 [INFO][4647] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:32:14.005051 containerd[1946]: 2025-03-25 01:32:13.779 [INFO][4647] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-28-242' Mar 25 01:32:14.005051 containerd[1946]: 2025-03-25 01:32:13.827 [INFO][4647] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" host="ip-172-31-28-242" Mar 25 01:32:14.005051 containerd[1946]: 2025-03-25 01:32:13.847 [INFO][4647] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-28-242" Mar 25 01:32:14.005051 containerd[1946]: 2025-03-25 01:32:13.871 [INFO][4647] ipam/ipam.go 489: Trying affinity for 192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:14.005051 containerd[1946]: 2025-03-25 01:32:13.880 [INFO][4647] ipam/ipam.go 155: Attempting to load block cidr=192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:14.005051 containerd[1946]: 2025-03-25 01:32:13.888 [INFO][4647] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:14.005051 containerd[1946]: 2025-03-25 01:32:13.888 [INFO][4647] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.6.64/26 handle="k8s-pod-network.9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" host="ip-172-31-28-242" Mar 25 01:32:14.008518 containerd[1946]: 2025-03-25 01:32:13.895 [INFO][4647] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466 Mar 25 01:32:14.008518 containerd[1946]: 2025-03-25 01:32:13.903 [INFO][4647] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.6.64/26 handle="k8s-pod-network.9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" host="ip-172-31-28-242" Mar 25 
01:32:14.008518 containerd[1946]: 2025-03-25 01:32:13.926 [INFO][4647] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.6.66/26] block=192.168.6.64/26 handle="k8s-pod-network.9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" host="ip-172-31-28-242" Mar 25 01:32:14.008518 containerd[1946]: 2025-03-25 01:32:13.926 [INFO][4647] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.6.66/26] handle="k8s-pod-network.9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" host="ip-172-31-28-242" Mar 25 01:32:14.008518 containerd[1946]: 2025-03-25 01:32:13.926 [INFO][4647] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:32:14.008518 containerd[1946]: 2025-03-25 01:32:13.926 [INFO][4647] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.6.66/26] IPv6=[] ContainerID="9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" HandleID="k8s-pod-network.9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" Workload="ip--172--31--28--242-k8s-csi--node--driver--wsw9h-eth0" Mar 25 01:32:14.008789 containerd[1946]: 2025-03-25 01:32:13.936 [INFO][4622] cni-plugin/k8s.go 386: Populated endpoint ContainerID="9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" Namespace="calico-system" Pod="csi-node-driver-wsw9h" WorkloadEndpoint="ip--172--31--28--242-k8s-csi--node--driver--wsw9h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--242-k8s-csi--node--driver--wsw9h-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7176352e-f194-4c71-92fb-e7f6c7227404", ResourceVersion:"636", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 31, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"54877d75d5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-242", ContainerID:"", Pod:"csi-node-driver-wsw9h", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.6.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0f78a08235f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:32:14.008932 containerd[1946]: 2025-03-25 01:32:13.937 [INFO][4622] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.6.66/32] ContainerID="9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" Namespace="calico-system" Pod="csi-node-driver-wsw9h" WorkloadEndpoint="ip--172--31--28--242-k8s-csi--node--driver--wsw9h-eth0" Mar 25 01:32:14.008932 containerd[1946]: 2025-03-25 01:32:13.937 [INFO][4622] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0f78a08235f ContainerID="9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" Namespace="calico-system" Pod="csi-node-driver-wsw9h" 
WorkloadEndpoint="ip--172--31--28--242-k8s-csi--node--driver--wsw9h-eth0" Mar 25 01:32:14.008932 containerd[1946]: 2025-03-25 01:32:13.949 [INFO][4622] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" Namespace="calico-system" Pod="csi-node-driver-wsw9h" WorkloadEndpoint="ip--172--31--28--242-k8s-csi--node--driver--wsw9h-eth0" Mar 25 01:32:14.009085 containerd[1946]: 2025-03-25 01:32:13.953 [INFO][4622] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" Namespace="calico-system" Pod="csi-node-driver-wsw9h" WorkloadEndpoint="ip--172--31--28--242-k8s-csi--node--driver--wsw9h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--242-k8s-csi--node--driver--wsw9h-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7176352e-f194-4c71-92fb-e7f6c7227404", ResourceVersion:"636", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 31, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"54877d75d5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-242", ContainerID:"9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466", Pod:"csi-node-driver-wsw9h", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.6.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0f78a08235f", MAC:"06:47:40:34:5e:3e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:32:14.010673 containerd[1946]: 2025-03-25 01:32:13.995 [INFO][4622] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" Namespace="calico-system" Pod="csi-node-driver-wsw9h" WorkloadEndpoint="ip--172--31--28--242-k8s-csi--node--driver--wsw9h-eth0" Mar 25 01:32:14.039241 systemd-networkd[1793]: vxlan.calico: Link UP Mar 25 01:32:14.039263 systemd-networkd[1793]: vxlan.calico: Gained carrier Mar 25 01:32:14.081593 containerd[1946]: time="2025-03-25T01:32:14.081157838Z" level=info msg="connecting to shim 88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3" address="unix:///run/containerd/s/0a3a1e35d0e1cb706c9fb327e19dd89b7470b1cdf937b70dd0eb03719ca45cef" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:32:14.113240 containerd[1946]: time="2025-03-25T01:32:14.113090736Z" level=info msg="connecting to shim 9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466" address="unix:///run/containerd/s/4a3bd019acdd14b5e8d6746dd2d41f7ae1b5046bd318370b4c767c1bae421d93" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:32:14.160970 (udev-worker)[4414]: Network interface NamePolicy= disabled on 
kernel command line. Mar 25 01:32:14.227802 systemd[1]: Started cri-containerd-88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3.scope - libcontainer container 88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3. Mar 25 01:32:14.261675 systemd[1]: Started cri-containerd-9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466.scope - libcontainer container 9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466. Mar 25 01:32:14.386829 containerd[1946]: time="2025-03-25T01:32:14.386569697Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-wsw9h,Uid:7176352e-f194-4c71-92fb-e7f6c7227404,Namespace:calico-system,Attempt:0,} returns sandbox id \"9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466\"" Mar 25 01:32:14.427978 containerd[1946]: time="2025-03-25T01:32:14.427744127Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 25 01:32:14.441408 containerd[1946]: time="2025-03-25T01:32:14.439349809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-755766d89b-p6jm5,Uid:e21ef940-4b30-4e3d-ac52-96c893706884,Namespace:calico-system,Attempt:0,} returns sandbox id \"88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3\"" Mar 25 01:32:15.087693 systemd-networkd[1793]: vxlan.calico: Gained IPv6LL Mar 25 01:32:15.151574 systemd-networkd[1793]: cali0f78a08235f: Gained IPv6LL Mar 25 01:32:15.215505 systemd-networkd[1793]: cali3d6662d496c: Gained IPv6LL Mar 25 01:32:15.352792 containerd[1946]: time="2025-03-25T01:32:15.352559479Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-tp7ll,Uid:9cd52328-d1d7-42d0-962c-7cd987215243,Namespace:kube-system,Attempt:0,}" Mar 25 01:32:15.617567 systemd-networkd[1793]: cali8a9ec66a81a: Link UP Mar 25 01:32:15.618088 systemd-networkd[1793]: cali8a9ec66a81a: Gained carrier Mar 25 01:32:15.655428 containerd[1946]: 2025-03-25 01:32:15.438 [INFO][4841] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--28--242-k8s-coredns--668d6bf9bc--tp7ll-eth0 coredns-668d6bf9bc- kube-system 9cd52328-d1d7-42d0-962c-7cd987215243 718 0 2025-03-25 01:31:35 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-28-242 coredns-668d6bf9bc-tp7ll eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8a9ec66a81a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" Namespace="kube-system" Pod="coredns-668d6bf9bc-tp7ll" WorkloadEndpoint="ip--172--31--28--242-k8s-coredns--668d6bf9bc--tp7ll-" Mar 25 01:32:15.655428 containerd[1946]: 2025-03-25 01:32:15.438 [INFO][4841] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" Namespace="kube-system" Pod="coredns-668d6bf9bc-tp7ll" WorkloadEndpoint="ip--172--31--28--242-k8s-coredns--668d6bf9bc--tp7ll-eth0" Mar 25 01:32:15.655428 containerd[1946]: 2025-03-25 01:32:15.508 [INFO][4852] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" HandleID="k8s-pod-network.0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" Workload="ip--172--31--28--242-k8s-coredns--668d6bf9bc--tp7ll-eth0" 
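The entry above is the third IPAM request in this section: after 192.168.6.65/26 went to calico-kube-controllers-755766d89b-p6jm5 and 192.168.6.66/26 to csi-node-driver-wsw9h, coredns-668d6bf9bc-tp7ll asks for one IPv4 address from the same host-affine block 192.168.6.64/26. The sketch below is a minimal, standard-library illustration of that "assign 1 address from block" step, not Calico's real allocator; the used set simply mirrors the grants already visible in this log.

```go
package main

import (
	"fmt"
	"net"
)

// inc returns ip + 1 (new slice, big-endian carry).
func inc(ip net.IP) net.IP {
	out := make(net.IP, len(ip))
	copy(out, ip)
	for i := len(out) - 1; i >= 0; i-- {
		out[i]++
		if out[i] != 0 {
			break
		}
	}
	return out
}

// nextFree walks the block in order and returns the first address not yet in
// use, which is conceptually what the log's "Attempting to assign 1 addresses
// from block" step does.
func nextFree(block *net.IPNet, used map[string]bool) (net.IP, bool) {
	for ip := block.IP.Mask(block.Mask).To4(); block.Contains(ip); ip = inc(ip) {
		if !used[ip.String()] {
			return ip, true
		}
	}
	return nil, false
}

func main() {
	_, block, _ := net.ParseCIDR("192.168.6.64/26") // the host-affine block in the log
	used := map[string]bool{
		"192.168.6.64": true, // network address, skipped in this sketch
		"192.168.6.65": true, // calico-kube-controllers-755766d89b-p6jm5
		"192.168.6.66": true, // csi-node-driver-wsw9h
	}
	if ip, ok := nextFree(block, used); ok {
		fmt.Printf("next grant: %s/26\n", ip) // 192.168.6.67, the address the log goes on to claim
	}
}
```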
Mar 25 01:32:15.656314 containerd[1946]: 2025-03-25 01:32:15.533 [INFO][4852] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" HandleID="k8s-pod-network.0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" Workload="ip--172--31--28--242-k8s-coredns--668d6bf9bc--tp7ll-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002209f0), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-28-242", "pod":"coredns-668d6bf9bc-tp7ll", "timestamp":"2025-03-25 01:32:15.508085737 +0000 UTC"}, Hostname:"ip-172-31-28-242", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:32:15.656314 containerd[1946]: 2025-03-25 01:32:15.534 [INFO][4852] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:32:15.656314 containerd[1946]: 2025-03-25 01:32:15.534 [INFO][4852] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:32:15.656314 containerd[1946]: 2025-03-25 01:32:15.534 [INFO][4852] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-28-242' Mar 25 01:32:15.656314 containerd[1946]: 2025-03-25 01:32:15.541 [INFO][4852] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" host="ip-172-31-28-242" Mar 25 01:32:15.656314 containerd[1946]: 2025-03-25 01:32:15.558 [INFO][4852] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-28-242" Mar 25 01:32:15.656314 containerd[1946]: 2025-03-25 01:32:15.573 [INFO][4852] ipam/ipam.go 489: Trying affinity for 192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:15.656314 containerd[1946]: 2025-03-25 01:32:15.576 [INFO][4852] ipam/ipam.go 155: Attempting to load block cidr=192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:15.656314 containerd[1946]: 2025-03-25 01:32:15.580 [INFO][4852] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:15.658118 containerd[1946]: 2025-03-25 01:32:15.580 [INFO][4852] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.6.64/26 handle="k8s-pod-network.0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" host="ip-172-31-28-242" Mar 25 01:32:15.658118 containerd[1946]: 2025-03-25 01:32:15.583 [INFO][4852] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63 Mar 25 01:32:15.658118 containerd[1946]: 2025-03-25 01:32:15.594 [INFO][4852] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.6.64/26 handle="k8s-pod-network.0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" host="ip-172-31-28-242" Mar 25 01:32:15.658118 containerd[1946]: 2025-03-25 01:32:15.606 [INFO][4852] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.6.67/26] block=192.168.6.64/26 handle="k8s-pod-network.0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" host="ip-172-31-28-242" Mar 25 01:32:15.658118 containerd[1946]: 2025-03-25 01:32:15.606 [INFO][4852] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.6.67/26] handle="k8s-pod-network.0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" host="ip-172-31-28-242" Mar 25 01:32:15.658118 containerd[1946]: 2025-03-25 01:32:15.606 [INFO][4852] 
ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:32:15.658118 containerd[1946]: 2025-03-25 01:32:15.607 [INFO][4852] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.6.67/26] IPv6=[] ContainerID="0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" HandleID="k8s-pod-network.0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" Workload="ip--172--31--28--242-k8s-coredns--668d6bf9bc--tp7ll-eth0" Mar 25 01:32:15.659532 containerd[1946]: 2025-03-25 01:32:15.611 [INFO][4841] cni-plugin/k8s.go 386: Populated endpoint ContainerID="0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" Namespace="kube-system" Pod="coredns-668d6bf9bc-tp7ll" WorkloadEndpoint="ip--172--31--28--242-k8s-coredns--668d6bf9bc--tp7ll-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--242-k8s-coredns--668d6bf9bc--tp7ll-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"9cd52328-d1d7-42d0-962c-7cd987215243", ResourceVersion:"718", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 31, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-242", ContainerID:"", Pod:"coredns-668d6bf9bc-tp7ll", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.6.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8a9ec66a81a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:32:15.659820 containerd[1946]: 2025-03-25 01:32:15.611 [INFO][4841] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.6.67/32] ContainerID="0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" Namespace="kube-system" Pod="coredns-668d6bf9bc-tp7ll" WorkloadEndpoint="ip--172--31--28--242-k8s-coredns--668d6bf9bc--tp7ll-eth0" Mar 25 01:32:15.659820 containerd[1946]: 2025-03-25 01:32:15.611 [INFO][4841] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8a9ec66a81a ContainerID="0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" Namespace="kube-system" Pod="coredns-668d6bf9bc-tp7ll" WorkloadEndpoint="ip--172--31--28--242-k8s-coredns--668d6bf9bc--tp7ll-eth0" Mar 25 01:32:15.659820 containerd[1946]: 2025-03-25 01:32:15.616 [INFO][4841] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" Namespace="kube-system" Pod="coredns-668d6bf9bc-tp7ll" 
WorkloadEndpoint="ip--172--31--28--242-k8s-coredns--668d6bf9bc--tp7ll-eth0" Mar 25 01:32:15.660124 containerd[1946]: 2025-03-25 01:32:15.616 [INFO][4841] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" Namespace="kube-system" Pod="coredns-668d6bf9bc-tp7ll" WorkloadEndpoint="ip--172--31--28--242-k8s-coredns--668d6bf9bc--tp7ll-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--242-k8s-coredns--668d6bf9bc--tp7ll-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"9cd52328-d1d7-42d0-962c-7cd987215243", ResourceVersion:"718", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 31, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-242", ContainerID:"0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63", Pod:"coredns-668d6bf9bc-tp7ll", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.6.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8a9ec66a81a", MAC:"82:c0:83:90:c8:57", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:32:15.660124 containerd[1946]: 2025-03-25 01:32:15.645 [INFO][4841] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" Namespace="kube-system" Pod="coredns-668d6bf9bc-tp7ll" WorkloadEndpoint="ip--172--31--28--242-k8s-coredns--668d6bf9bc--tp7ll-eth0" Mar 25 01:32:15.724332 containerd[1946]: time="2025-03-25T01:32:15.724076269Z" level=info msg="connecting to shim 0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63" address="unix:///run/containerd/s/e753e7bda62891cbfe32ff0b9957469b73ab3ed5d8ab12754cde3c811791a2c6" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:32:15.784527 systemd[1]: Started cri-containerd-0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63.scope - libcontainer container 0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63. 
Mar 25 01:32:15.883390 containerd[1946]: time="2025-03-25T01:32:15.882129579Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-tp7ll,Uid:9cd52328-d1d7-42d0-962c-7cd987215243,Namespace:kube-system,Attempt:0,} returns sandbox id \"0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63\"" Mar 25 01:32:15.909499 containerd[1946]: time="2025-03-25T01:32:15.909427683Z" level=info msg="CreateContainer within sandbox \"0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:32:15.936219 containerd[1946]: time="2025-03-25T01:32:15.934738681Z" level=info msg="Container 256c626aa3c500d4d7587f24ed21937f4aa8184f8f74bf1a42e14542cb06dec6: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:32:15.959249 containerd[1946]: time="2025-03-25T01:32:15.958686681Z" level=info msg="CreateContainer within sandbox \"0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"256c626aa3c500d4d7587f24ed21937f4aa8184f8f74bf1a42e14542cb06dec6\"" Mar 25 01:32:15.962850 containerd[1946]: time="2025-03-25T01:32:15.962010507Z" level=info msg="StartContainer for \"256c626aa3c500d4d7587f24ed21937f4aa8184f8f74bf1a42e14542cb06dec6\"" Mar 25 01:32:15.968921 containerd[1946]: time="2025-03-25T01:32:15.968842601Z" level=info msg="connecting to shim 256c626aa3c500d4d7587f24ed21937f4aa8184f8f74bf1a42e14542cb06dec6" address="unix:///run/containerd/s/e753e7bda62891cbfe32ff0b9957469b73ab3ed5d8ab12754cde3c811791a2c6" protocol=ttrpc version=3 Mar 25 01:32:16.021553 systemd[1]: Started cri-containerd-256c626aa3c500d4d7587f24ed21937f4aa8184f8f74bf1a42e14542cb06dec6.scope - libcontainer container 256c626aa3c500d4d7587f24ed21937f4aa8184f8f74bf1a42e14542cb06dec6. Mar 25 01:32:16.114903 containerd[1946]: time="2025-03-25T01:32:16.114355728Z" level=info msg="StartContainer for \"256c626aa3c500d4d7587f24ed21937f4aa8184f8f74bf1a42e14542cb06dec6\" returns successfully" Mar 25 01:32:16.223506 systemd[1]: Started sshd@8-172.31.28.242:22-147.75.109.163:37952.service - OpenSSH per-connection server daemon (147.75.109.163:37952). Mar 25 01:32:16.354400 containerd[1946]: time="2025-03-25T01:32:16.353896299Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-cpcqx,Uid:4ae8b6c3-7961-411a-b8a7-db8f68e68f33,Namespace:kube-system,Attempt:0,}" Mar 25 01:32:16.357086 containerd[1946]: time="2025-03-25T01:32:16.356963369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-756bdfdf4c-trthd,Uid:dadf09ef-70e2-4836-a7ba-e5b0a0c652dd,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:32:16.470673 sshd[4953]: Accepted publickey for core from 147.75.109.163 port 37952 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:32:16.477934 sshd-session[4953]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:16.505977 systemd-logind[1929]: New session 9 of user core. Mar 25 01:32:16.516753 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 25 01:32:16.997704 sshd[4979]: Connection closed by 147.75.109.163 port 37952 Mar 25 01:32:16.997167 sshd-session[4953]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:17.015899 systemd[1]: sshd@8-172.31.28.242:22-147.75.109.163:37952.service: Deactivated successfully. Mar 25 01:32:17.026900 systemd[1]: session-9.scope: Deactivated successfully. 
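The CreateContainer/StartContainer entries just above are the CRI calls the kubelet makes against containerd once the coredns sandbox exists. The sketch below shows those two calls over the CRI socket, assuming k8s.io/cri-api and google.golang.org/grpc; the sandbox id is copied from the log, while the container name, image reference, and pod metadata are placeholders rather than values recorded here.

```go
package main

import (
	"context"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	rt := runtimeapi.NewRuntimeServiceClient(conn)
	ctx := context.Background()

	// Sandbox id returned by RunPodSandbox in the log above.
	sandboxID := "0b18e55a6020f1bc5936b0cca755ec4802a3c4abefb3c2f6932aaacfd812dd63"

	created, err := rt.CreateContainer(ctx, &runtimeapi.CreateContainerRequest{
		PodSandboxId: sandboxID,
		Config: &runtimeapi.ContainerConfig{
			Metadata: &runtimeapi.ContainerMetadata{Name: "coredns"},
			// Placeholder image: the journal does not record the coredns image here.
			Image: &runtimeapi.ImageSpec{Image: "example.invalid/coredns:placeholder"},
		},
		SandboxConfig: &runtimeapi.PodSandboxConfig{
			Metadata: &runtimeapi.PodSandboxMetadata{
				Name:      "coredns-668d6bf9bc-tp7ll",
				Namespace: "kube-system",
			},
		},
	})
	if err != nil {
		log.Fatal(err)
	}

	// Mirrors the log's StartContainer … "returns successfully".
	if _, err := rt.StartContainer(ctx, &runtimeapi.StartContainerRequest{ContainerId: created.ContainerId}); err != nil {
		log.Fatal(err)
	}
	log.Printf("started %s", created.ContainerId)
}
```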
Mar 25 01:32:17.032620 systemd-logind[1929]: Session 9 logged out. Waiting for processes to exit. Mar 25 01:32:17.039115 systemd-logind[1929]: Removed session 9. Mar 25 01:32:17.133816 systemd-networkd[1793]: calie8f00bf9d76: Link UP Mar 25 01:32:17.136116 systemd-networkd[1793]: calie8f00bf9d76: Gained carrier Mar 25 01:32:17.136479 systemd-networkd[1793]: cali8a9ec66a81a: Gained IPv6LL Mar 25 01:32:17.205290 kubelet[3424]: I0325 01:32:17.204906 3424 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-tp7ll" podStartSLOduration=42.204874368 podStartE2EDuration="42.204874368s" podCreationTimestamp="2025-03-25 01:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:32:17.003913806 +0000 UTC m=+47.900346298" watchObservedRunningTime="2025-03-25 01:32:17.204874368 +0000 UTC m=+48.101306860" Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:16.588 [INFO][4966] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--trthd-eth0 calico-apiserver-756bdfdf4c- calico-apiserver dadf09ef-70e2-4836-a7ba-e5b0a0c652dd 721 0 2025-03-25 01:31:47 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:756bdfdf4c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-28-242 calico-apiserver-756bdfdf4c-trthd eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie8f00bf9d76 [] []}} ContainerID="a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" Namespace="calico-apiserver" Pod="calico-apiserver-756bdfdf4c-trthd" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--trthd-" Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:16.589 [INFO][4966] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" Namespace="calico-apiserver" Pod="calico-apiserver-756bdfdf4c-trthd" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--trthd-eth0" Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:16.802 [INFO][4993] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" HandleID="k8s-pod-network.a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" Workload="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--trthd-eth0" Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:16.910 [INFO][4993] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" HandleID="k8s-pod-network.a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" Workload="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--trthd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40000e6fa0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-28-242", "pod":"calico-apiserver-756bdfdf4c-trthd", "timestamp":"2025-03-25 01:32:16.802909429 +0000 UTC"}, Hostname:"ip-172-31-28-242", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:16.910 [INFO][4993] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:16.910 [INFO][4993] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:16.910 [INFO][4993] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-28-242' Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:16.931 [INFO][4993] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" host="ip-172-31-28-242" Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:16.980 [INFO][4993] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-28-242" Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:17.024 [INFO][4993] ipam/ipam.go 489: Trying affinity for 192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:17.034 [INFO][4993] ipam/ipam.go 155: Attempting to load block cidr=192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:17.052 [INFO][4993] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:17.052 [INFO][4993] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.6.64/26 handle="k8s-pod-network.a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" host="ip-172-31-28-242" Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:17.057 [INFO][4993] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:17.069 [INFO][4993] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.6.64/26 handle="k8s-pod-network.a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" host="ip-172-31-28-242" Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:17.088 [INFO][4993] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.6.68/26] block=192.168.6.64/26 handle="k8s-pod-network.a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" host="ip-172-31-28-242" Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:17.088 [INFO][4993] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.6.68/26] handle="k8s-pod-network.a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" host="ip-172-31-28-242" Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:17.089 [INFO][4993] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
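Every allocation in this log is bracketed by "About to acquire host-wide IPAM lock" / "Acquired" / "Released", so CNI ADDs that land within milliseconds of each other (the calico-apiserver and coredns pods above) cannot claim the same address. The toy below illustrates that serialization with a plain mutex; it is only the pattern the log shows, not Calico's implementation.

```go
package main

import (
	"fmt"
	"sync"
)

// hostIPAM serializes grants for one node, mirroring the acquire/release
// lines in the log. Illustrative only.
type hostIPAM struct {
	mu   sync.Mutex // the "host-wide IPAM lock"
	next int        // next free offset inside 192.168.6.64/26
}

func (h *hostIPAM) assign(pod string) string {
	h.mu.Lock()         // "Acquired host-wide IPAM lock."
	defer h.mu.Unlock() // "Released host-wide IPAM lock."
	ip := fmt.Sprintf("192.168.6.%d/26", 64+h.next)
	h.next++
	return ip
}

func main() {
	// .65-.67 were granted earlier in the log, so the next offsets are 4 and 5.
	ipam := &hostIPAM{next: 4}
	var wg sync.WaitGroup
	for _, pod := range []string{"calico-apiserver-756bdfdf4c-trthd", "coredns-668d6bf9bc-cpcqx"} {
		wg.Add(1)
		go func(p string) {
			defer wg.Done()
			// Completion order may vary; the lock only guarantees the two
			// grants are distinct (.68 and .69, as in the log).
			fmt.Println(p, "->", ipam.assign(p))
		}(pod)
	}
	wg.Wait()
}
```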
Mar 25 01:32:17.225780 containerd[1946]: 2025-03-25 01:32:17.089 [INFO][4993] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.6.68/26] IPv6=[] ContainerID="a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" HandleID="k8s-pod-network.a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" Workload="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--trthd-eth0" Mar 25 01:32:17.230977 containerd[1946]: 2025-03-25 01:32:17.113 [INFO][4966] cni-plugin/k8s.go 386: Populated endpoint ContainerID="a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" Namespace="calico-apiserver" Pod="calico-apiserver-756bdfdf4c-trthd" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--trthd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--trthd-eth0", GenerateName:"calico-apiserver-756bdfdf4c-", Namespace:"calico-apiserver", SelfLink:"", UID:"dadf09ef-70e2-4836-a7ba-e5b0a0c652dd", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 31, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"756bdfdf4c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-242", ContainerID:"", Pod:"calico-apiserver-756bdfdf4c-trthd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.6.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie8f00bf9d76", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:32:17.230977 containerd[1946]: 2025-03-25 01:32:17.116 [INFO][4966] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.6.68/32] ContainerID="a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" Namespace="calico-apiserver" Pod="calico-apiserver-756bdfdf4c-trthd" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--trthd-eth0" Mar 25 01:32:17.230977 containerd[1946]: 2025-03-25 01:32:17.116 [INFO][4966] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie8f00bf9d76 ContainerID="a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" Namespace="calico-apiserver" Pod="calico-apiserver-756bdfdf4c-trthd" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--trthd-eth0" Mar 25 01:32:17.230977 containerd[1946]: 2025-03-25 01:32:17.142 [INFO][4966] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" Namespace="calico-apiserver" Pod="calico-apiserver-756bdfdf4c-trthd" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--trthd-eth0" Mar 25 01:32:17.230977 containerd[1946]: 2025-03-25 01:32:17.144 [INFO][4966] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" Namespace="calico-apiserver" Pod="calico-apiserver-756bdfdf4c-trthd" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--trthd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--trthd-eth0", GenerateName:"calico-apiserver-756bdfdf4c-", Namespace:"calico-apiserver", SelfLink:"", UID:"dadf09ef-70e2-4836-a7ba-e5b0a0c652dd", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 31, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"756bdfdf4c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-242", ContainerID:"a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e", Pod:"calico-apiserver-756bdfdf4c-trthd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.6.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie8f00bf9d76", MAC:"82:18:d5:64:00:18", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:32:17.230977 containerd[1946]: 2025-03-25 01:32:17.203 [INFO][4966] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" Namespace="calico-apiserver" Pod="calico-apiserver-756bdfdf4c-trthd" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--trthd-eth0" Mar 25 01:32:17.248416 containerd[1946]: time="2025-03-25T01:32:17.248094915Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:17.255892 containerd[1946]: time="2025-03-25T01:32:17.255282751Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7473801" Mar 25 01:32:17.260698 containerd[1946]: time="2025-03-25T01:32:17.260450774Z" level=info msg="ImageCreate event name:\"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:17.277328 containerd[1946]: time="2025-03-25T01:32:17.277032215Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:17.288417 containerd[1946]: time="2025-03-25T01:32:17.287457160Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"8843558\" in 2.8596427s" Mar 25 
01:32:17.288417 containerd[1946]: time="2025-03-25T01:32:17.287539403Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\"" Mar 25 01:32:17.302833 containerd[1946]: time="2025-03-25T01:32:17.301780479Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 25 01:32:17.331401 systemd-networkd[1793]: cali4d61677e8a9: Link UP Mar 25 01:32:17.335775 containerd[1946]: time="2025-03-25T01:32:17.334147211Z" level=info msg="CreateContainer within sandbox \"9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 25 01:32:17.343713 systemd-networkd[1793]: cali4d61677e8a9: Gained carrier Mar 25 01:32:17.385935 containerd[1946]: time="2025-03-25T01:32:17.385863408Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-756bdfdf4c-tf9gh,Uid:cd448025-947d-4e3d-9cb5-a93ca6dbbf1e,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:32:17.424367 containerd[1946]: time="2025-03-25T01:32:17.424276292Z" level=info msg="Container 053f93cbad301d9c1e2b7d1bb0cee5569736542fcdee33e428450910af31925f: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:16.568 [INFO][4955] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--28--242-k8s-coredns--668d6bf9bc--cpcqx-eth0 coredns-668d6bf9bc- kube-system 4ae8b6c3-7961-411a-b8a7-db8f68e68f33 720 0 2025-03-25 01:31:35 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-28-242 coredns-668d6bf9bc-cpcqx eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4d61677e8a9 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" Namespace="kube-system" Pod="coredns-668d6bf9bc-cpcqx" WorkloadEndpoint="ip--172--31--28--242-k8s-coredns--668d6bf9bc--cpcqx-" Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:16.569 [INFO][4955] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" Namespace="kube-system" Pod="coredns-668d6bf9bc-cpcqx" WorkloadEndpoint="ip--172--31--28--242-k8s-coredns--668d6bf9bc--cpcqx-eth0" Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:16.828 [INFO][4987] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" HandleID="k8s-pod-network.e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" Workload="ip--172--31--28--242-k8s-coredns--668d6bf9bc--cpcqx-eth0" Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:16.937 [INFO][4987] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" HandleID="k8s-pod-network.e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" Workload="ip--172--31--28--242-k8s-coredns--668d6bf9bc--cpcqx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003eb760), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-28-242", "pod":"coredns-668d6bf9bc-cpcqx", "timestamp":"2025-03-25 01:32:16.828414359 +0000 UTC"}, 
Hostname:"ip-172-31-28-242", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:16.938 [INFO][4987] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:17.104 [INFO][4987] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:17.104 [INFO][4987] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-28-242' Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:17.111 [INFO][4987] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" host="ip-172-31-28-242" Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:17.135 [INFO][4987] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-28-242" Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:17.178 [INFO][4987] ipam/ipam.go 489: Trying affinity for 192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:17.186 [INFO][4987] ipam/ipam.go 155: Attempting to load block cidr=192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:17.220 [INFO][4987] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:17.220 [INFO][4987] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.6.64/26 handle="k8s-pod-network.e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" host="ip-172-31-28-242" Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:17.234 [INFO][4987] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:17.249 [INFO][4987] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.6.64/26 handle="k8s-pod-network.e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" host="ip-172-31-28-242" Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:17.283 [INFO][4987] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.6.69/26] block=192.168.6.64/26 handle="k8s-pod-network.e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" host="ip-172-31-28-242" Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:17.284 [INFO][4987] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.6.69/26] handle="k8s-pod-network.e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" host="ip-172-31-28-242" Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:17.284 [INFO][4987] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:32:17.429096 containerd[1946]: 2025-03-25 01:32:17.284 [INFO][4987] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.6.69/26] IPv6=[] ContainerID="e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" HandleID="k8s-pod-network.e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" Workload="ip--172--31--28--242-k8s-coredns--668d6bf9bc--cpcqx-eth0" Mar 25 01:32:17.430728 containerd[1946]: 2025-03-25 01:32:17.292 [INFO][4955] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" Namespace="kube-system" Pod="coredns-668d6bf9bc-cpcqx" WorkloadEndpoint="ip--172--31--28--242-k8s-coredns--668d6bf9bc--cpcqx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--242-k8s-coredns--668d6bf9bc--cpcqx-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"4ae8b6c3-7961-411a-b8a7-db8f68e68f33", ResourceVersion:"720", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 31, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-242", ContainerID:"", Pod:"coredns-668d6bf9bc-cpcqx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.6.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4d61677e8a9", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:32:17.430728 containerd[1946]: 2025-03-25 01:32:17.295 [INFO][4955] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.6.69/32] ContainerID="e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" Namespace="kube-system" Pod="coredns-668d6bf9bc-cpcqx" WorkloadEndpoint="ip--172--31--28--242-k8s-coredns--668d6bf9bc--cpcqx-eth0" Mar 25 01:32:17.430728 containerd[1946]: 2025-03-25 01:32:17.295 [INFO][4955] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4d61677e8a9 ContainerID="e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" Namespace="kube-system" Pod="coredns-668d6bf9bc-cpcqx" WorkloadEndpoint="ip--172--31--28--242-k8s-coredns--668d6bf9bc--cpcqx-eth0" Mar 25 01:32:17.430728 containerd[1946]: 2025-03-25 01:32:17.348 [INFO][4955] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" Namespace="kube-system" Pod="coredns-668d6bf9bc-cpcqx" WorkloadEndpoint="ip--172--31--28--242-k8s-coredns--668d6bf9bc--cpcqx-eth0" 
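The Populated-endpoint dump above prints the coredns-668d6bf9bc-cpcqx WorkloadEndpoint with its ports in Go hex notation (Port:0x35, Port:0x23c1). The small sketch below restates the same endpoint data in decimal; its struct fields are chosen to mirror the log output, not the real projectcalico.org v3 API types.

```go
package main

import "fmt"

// Field names mirror the log dump, not the actual Calico v3 types.
type endpointPort struct {
	Name     string
	Protocol string
	Port     uint16
}

type workloadEndpoint struct {
	Pod           string
	InterfaceName string
	IPNetworks    []string
	Ports         []endpointPort
}

func main() {
	ep := workloadEndpoint{
		Pod:           "coredns-668d6bf9bc-cpcqx",
		InterfaceName: "cali4d61677e8a9",
		IPNetworks:    []string{"192.168.6.69/32"},
		Ports: []endpointPort{
			{Name: "dns", Protocol: "UDP", Port: 0x35},       // 53
			{Name: "dns-tcp", Protocol: "TCP", Port: 0x35},   // 53
			{Name: "metrics", Protocol: "TCP", Port: 0x23c1}, // 9153
		},
	}
	fmt.Println(ep.Pod, "on", ep.InterfaceName, ep.IPNetworks)
	for _, p := range ep.Ports {
		fmt.Printf("  %-8s %s/%d\n", p.Name, p.Protocol, p.Port)
	}
}
```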
Mar 25 01:32:17.430728 containerd[1946]: 2025-03-25 01:32:17.352 [INFO][4955] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" Namespace="kube-system" Pod="coredns-668d6bf9bc-cpcqx" WorkloadEndpoint="ip--172--31--28--242-k8s-coredns--668d6bf9bc--cpcqx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--242-k8s-coredns--668d6bf9bc--cpcqx-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"4ae8b6c3-7961-411a-b8a7-db8f68e68f33", ResourceVersion:"720", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 31, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-242", ContainerID:"e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc", Pod:"coredns-668d6bf9bc-cpcqx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.6.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4d61677e8a9", MAC:"e2:76:43:c5:a2:96", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:32:17.430728 containerd[1946]: 2025-03-25 01:32:17.403 [INFO][4955] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" Namespace="kube-system" Pod="coredns-668d6bf9bc-cpcqx" WorkloadEndpoint="ip--172--31--28--242-k8s-coredns--668d6bf9bc--cpcqx-eth0" Mar 25 01:32:17.440790 containerd[1946]: time="2025-03-25T01:32:17.440682513Z" level=info msg="connecting to shim a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e" address="unix:///run/containerd/s/03afab8335b4196b861c676913fa5b5b79830eb84f2f8e5fd389de5e6f79e64d" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:32:17.513538 containerd[1946]: time="2025-03-25T01:32:17.512317434Z" level=info msg="CreateContainer within sandbox \"9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"053f93cbad301d9c1e2b7d1bb0cee5569736542fcdee33e428450910af31925f\"" Mar 25 01:32:17.517878 containerd[1946]: time="2025-03-25T01:32:17.515850243Z" level=info msg="StartContainer for \"053f93cbad301d9c1e2b7d1bb0cee5569736542fcdee33e428450910af31925f\"" Mar 25 01:32:17.583854 containerd[1946]: time="2025-03-25T01:32:17.583792103Z" level=info msg="connecting to shim 
053f93cbad301d9c1e2b7d1bb0cee5569736542fcdee33e428450910af31925f" address="unix:///run/containerd/s/4a3bd019acdd14b5e8d6746dd2d41f7ae1b5046bd318370b4c767c1bae421d93" protocol=ttrpc version=3 Mar 25 01:32:17.653540 containerd[1946]: time="2025-03-25T01:32:17.653459492Z" level=info msg="connecting to shim e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc" address="unix:///run/containerd/s/9aae09385956cac9031a5b605c87dca354664365cc94a9b98043614b81ca247c" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:32:17.669880 systemd[1]: Started cri-containerd-a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e.scope - libcontainer container a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e. Mar 25 01:32:17.739419 systemd[1]: Started cri-containerd-053f93cbad301d9c1e2b7d1bb0cee5569736542fcdee33e428450910af31925f.scope - libcontainer container 053f93cbad301d9c1e2b7d1bb0cee5569736542fcdee33e428450910af31925f. Mar 25 01:32:17.802512 systemd[1]: Started cri-containerd-e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc.scope - libcontainer container e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc. Mar 25 01:32:18.002874 containerd[1946]: time="2025-03-25T01:32:18.002722866Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-cpcqx,Uid:4ae8b6c3-7961-411a-b8a7-db8f68e68f33,Namespace:kube-system,Attempt:0,} returns sandbox id \"e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc\"" Mar 25 01:32:18.016502 containerd[1946]: time="2025-03-25T01:32:18.013588279Z" level=info msg="CreateContainer within sandbox \"e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:32:18.079519 containerd[1946]: time="2025-03-25T01:32:18.079029882Z" level=info msg="Container a4255e557e4eb36892537a06608d7d09d751ef75bb2ce5f8ab27c198583c8009: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:32:18.106569 containerd[1946]: time="2025-03-25T01:32:18.106480154Z" level=info msg="CreateContainer within sandbox \"e5529ead36154e5841170acc72439ce2289c4d7b1a8fa1556be93a41c53da5fc\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a4255e557e4eb36892537a06608d7d09d751ef75bb2ce5f8ab27c198583c8009\"" Mar 25 01:32:18.111927 containerd[1946]: time="2025-03-25T01:32:18.109569245Z" level=info msg="StartContainer for \"a4255e557e4eb36892537a06608d7d09d751ef75bb2ce5f8ab27c198583c8009\"" Mar 25 01:32:18.113154 containerd[1946]: time="2025-03-25T01:32:18.113015541Z" level=info msg="connecting to shim a4255e557e4eb36892537a06608d7d09d751ef75bb2ce5f8ab27c198583c8009" address="unix:///run/containerd/s/9aae09385956cac9031a5b605c87dca354664365cc94a9b98043614b81ca247c" protocol=ttrpc version=3 Mar 25 01:32:18.239861 systemd[1]: Started cri-containerd-a4255e557e4eb36892537a06608d7d09d751ef75bb2ce5f8ab27c198583c8009.scope - libcontainer container a4255e557e4eb36892537a06608d7d09d751ef75bb2ce5f8ab27c198583c8009. 
Mar 25 01:32:18.277162 systemd-networkd[1793]: calic8a46da42df: Link UP Mar 25 01:32:18.284634 systemd-networkd[1793]: calic8a46da42df: Gained carrier Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:17.786 [INFO][5040] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--tf9gh-eth0 calico-apiserver-756bdfdf4c- calico-apiserver cd448025-947d-4e3d-9cb5-a93ca6dbbf1e 722 0 2025-03-25 01:31:47 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:756bdfdf4c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-28-242 calico-apiserver-756bdfdf4c-tf9gh eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic8a46da42df [] []}} ContainerID="8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" Namespace="calico-apiserver" Pod="calico-apiserver-756bdfdf4c-tf9gh" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--tf9gh-" Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:17.786 [INFO][5040] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" Namespace="calico-apiserver" Pod="calico-apiserver-756bdfdf4c-tf9gh" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--tf9gh-eth0" Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:17.917 [INFO][5138] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" HandleID="k8s-pod-network.8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" Workload="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--tf9gh-eth0" Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:18.076 [INFO][5138] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" HandleID="k8s-pod-network.8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" Workload="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--tf9gh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400043e1b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-28-242", "pod":"calico-apiserver-756bdfdf4c-tf9gh", "timestamp":"2025-03-25 01:32:17.917313904 +0000 UTC"}, Hostname:"ip-172-31-28-242", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:18.081 [INFO][5138] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:18.081 [INFO][5138] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:18.081 [INFO][5138] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-28-242' Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:18.092 [INFO][5138] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" host="ip-172-31-28-242" Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:18.104 [INFO][5138] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-28-242" Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:18.136 [INFO][5138] ipam/ipam.go 489: Trying affinity for 192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:18.167 [INFO][5138] ipam/ipam.go 155: Attempting to load block cidr=192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:18.181 [INFO][5138] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.6.64/26 host="ip-172-31-28-242" Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:18.182 [INFO][5138] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.6.64/26 handle="k8s-pod-network.8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" host="ip-172-31-28-242" Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:18.190 [INFO][5138] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8 Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:18.222 [INFO][5138] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.6.64/26 handle="k8s-pod-network.8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" host="ip-172-31-28-242" Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:18.252 [INFO][5138] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.6.70/26] block=192.168.6.64/26 handle="k8s-pod-network.8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" host="ip-172-31-28-242" Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:18.252 [INFO][5138] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.6.70/26] handle="k8s-pod-network.8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" host="ip-172-31-28-242" Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:18.253 [INFO][5138] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
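The IPAM trace above claims the node's affine block 192.168.6.64/26 and assigns 192.168.6.70 from it (192.168.6.69 went to the coredns pod earlier in this log). A quick check, using only values visible in these entries, that the assigned addresses fall inside that /26:

```python
import ipaddress

# Values taken from the Calico IPAM entries in this log.
block = ipaddress.ip_network("192.168.6.64/26")            # affine block for ip-172-31-28-242
assigned = [ipaddress.ip_address("192.168.6.69"),          # coredns-668d6bf9bc-cpcqx
            ipaddress.ip_address("192.168.6.70")]          # calico-apiserver-756bdfdf4c-tf9gh

print(block.num_addresses)                                  # 64 addresses in a /26
for ip in assigned:
    print(ip, "in", block, "->", ip in block)               # both True
```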
Mar 25 01:32:18.351557 containerd[1946]: 2025-03-25 01:32:18.253 [INFO][5138] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.6.70/26] IPv6=[] ContainerID="8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" HandleID="k8s-pod-network.8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" Workload="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--tf9gh-eth0" Mar 25 01:32:18.357160 containerd[1946]: 2025-03-25 01:32:18.259 [INFO][5040] cni-plugin/k8s.go 386: Populated endpoint ContainerID="8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" Namespace="calico-apiserver" Pod="calico-apiserver-756bdfdf4c-tf9gh" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--tf9gh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--tf9gh-eth0", GenerateName:"calico-apiserver-756bdfdf4c-", Namespace:"calico-apiserver", SelfLink:"", UID:"cd448025-947d-4e3d-9cb5-a93ca6dbbf1e", ResourceVersion:"722", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 31, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"756bdfdf4c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-242", ContainerID:"", Pod:"calico-apiserver-756bdfdf4c-tf9gh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.6.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic8a46da42df", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:32:18.357160 containerd[1946]: 2025-03-25 01:32:18.260 [INFO][5040] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.6.70/32] ContainerID="8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" Namespace="calico-apiserver" Pod="calico-apiserver-756bdfdf4c-tf9gh" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--tf9gh-eth0" Mar 25 01:32:18.357160 containerd[1946]: 2025-03-25 01:32:18.260 [INFO][5040] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic8a46da42df ContainerID="8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" Namespace="calico-apiserver" Pod="calico-apiserver-756bdfdf4c-tf9gh" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--tf9gh-eth0" Mar 25 01:32:18.357160 containerd[1946]: 2025-03-25 01:32:18.283 [INFO][5040] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" Namespace="calico-apiserver" Pod="calico-apiserver-756bdfdf4c-tf9gh" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--tf9gh-eth0" Mar 25 01:32:18.357160 containerd[1946]: 2025-03-25 01:32:18.288 [INFO][5040] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" Namespace="calico-apiserver" Pod="calico-apiserver-756bdfdf4c-tf9gh" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--tf9gh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--tf9gh-eth0", GenerateName:"calico-apiserver-756bdfdf4c-", Namespace:"calico-apiserver", SelfLink:"", UID:"cd448025-947d-4e3d-9cb5-a93ca6dbbf1e", ResourceVersion:"722", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 31, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"756bdfdf4c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-28-242", ContainerID:"8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8", Pod:"calico-apiserver-756bdfdf4c-tf9gh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.6.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic8a46da42df", MAC:"d6:4d:a0:22:c3:7f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:32:18.357160 containerd[1946]: 2025-03-25 01:32:18.337 [INFO][5040] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" Namespace="calico-apiserver" Pod="calico-apiserver-756bdfdf4c-tf9gh" WorkloadEndpoint="ip--172--31--28--242-k8s-calico--apiserver--756bdfdf4c--tf9gh-eth0" Mar 25 01:32:18.411991 containerd[1946]: time="2025-03-25T01:32:18.411897998Z" level=info msg="StartContainer for \"a4255e557e4eb36892537a06608d7d09d751ef75bb2ce5f8ab27c198583c8009\" returns successfully" Mar 25 01:32:18.449370 containerd[1946]: time="2025-03-25T01:32:18.448240429Z" level=info msg="connecting to shim 8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8" address="unix:///run/containerd/s/16893a5b9a82fe894cdceeecc6c5a4aa8d4845c605ee8254b43205dbab271f64" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:32:18.542596 systemd[1]: Started cri-containerd-8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8.scope - libcontainer container 8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8. 
Mar 25 01:32:18.668521 containerd[1946]: time="2025-03-25T01:32:18.666571266Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-756bdfdf4c-trthd,Uid:dadf09ef-70e2-4836-a7ba-e5b0a0c652dd,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e\"" Mar 25 01:32:18.713675 containerd[1946]: time="2025-03-25T01:32:18.711071645Z" level=info msg="StartContainer for \"053f93cbad301d9c1e2b7d1bb0cee5569736542fcdee33e428450910af31925f\" returns successfully" Mar 25 01:32:18.822767 containerd[1946]: time="2025-03-25T01:32:18.821278958Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-756bdfdf4c-tf9gh,Uid:cd448025-947d-4e3d-9cb5-a93ca6dbbf1e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8\"" Mar 25 01:32:18.927444 systemd-networkd[1793]: calie8f00bf9d76: Gained IPv6LL Mar 25 01:32:18.944944 kubelet[3424]: I0325 01:32:18.944466 3424 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-cpcqx" podStartSLOduration=43.944437896 podStartE2EDuration="43.944437896s" podCreationTimestamp="2025-03-25 01:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:32:18.939332638 +0000 UTC m=+49.835765130" watchObservedRunningTime="2025-03-25 01:32:18.944437896 +0000 UTC m=+49.840870400" Mar 25 01:32:19.056628 systemd-networkd[1793]: cali4d61677e8a9: Gained IPv6LL Mar 25 01:32:19.312591 systemd-networkd[1793]: calic8a46da42df: Gained IPv6LL Mar 25 01:32:20.628718 containerd[1946]: time="2025-03-25T01:32:20.627141480Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:20.633721 containerd[1946]: time="2025-03-25T01:32:20.633420411Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=32560257" Mar 25 01:32:20.636515 containerd[1946]: time="2025-03-25T01:32:20.636396938Z" level=info msg="ImageCreate event name:\"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:20.640678 containerd[1946]: time="2025-03-25T01:32:20.640549273Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:20.642139 containerd[1946]: time="2025-03-25T01:32:20.641937039Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"33929982\" in 3.339581747s" Mar 25 01:32:20.642139 containerd[1946]: time="2025-03-25T01:32:20.642055348Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\"" Mar 25 01:32:20.646631 containerd[1946]: time="2025-03-25T01:32:20.646382999Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 
25 01:32:20.683679 containerd[1946]: time="2025-03-25T01:32:20.683585964Z" level=info msg="CreateContainer within sandbox \"88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 25 01:32:20.707275 containerd[1946]: time="2025-03-25T01:32:20.703701509Z" level=info msg="Container cd4ff51f8777fc5d3aff4f1881b5030d0c091d40c4e3df16389226a970da716d: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:32:20.734677 containerd[1946]: time="2025-03-25T01:32:20.734584044Z" level=info msg="CreateContainer within sandbox \"88b47ed3695d67771dee17aaf02c468e219f4aaa25e3d72fa75444ca9a82fec3\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"cd4ff51f8777fc5d3aff4f1881b5030d0c091d40c4e3df16389226a970da716d\"" Mar 25 01:32:20.737868 containerd[1946]: time="2025-03-25T01:32:20.737760343Z" level=info msg="StartContainer for \"cd4ff51f8777fc5d3aff4f1881b5030d0c091d40c4e3df16389226a970da716d\"" Mar 25 01:32:20.741567 containerd[1946]: time="2025-03-25T01:32:20.741436337Z" level=info msg="connecting to shim cd4ff51f8777fc5d3aff4f1881b5030d0c091d40c4e3df16389226a970da716d" address="unix:///run/containerd/s/0a3a1e35d0e1cb706c9fb327e19dd89b7470b1cdf937b70dd0eb03719ca45cef" protocol=ttrpc version=3 Mar 25 01:32:20.803601 systemd[1]: Started cri-containerd-cd4ff51f8777fc5d3aff4f1881b5030d0c091d40c4e3df16389226a970da716d.scope - libcontainer container cd4ff51f8777fc5d3aff4f1881b5030d0c091d40c4e3df16389226a970da716d. Mar 25 01:32:20.961744 containerd[1946]: time="2025-03-25T01:32:20.959285290Z" level=info msg="StartContainer for \"cd4ff51f8777fc5d3aff4f1881b5030d0c091d40c4e3df16389226a970da716d\" returns successfully" Mar 25 01:32:21.798119 ntpd[1923]: Listen normally on 7 vxlan.calico 192.168.6.64:123 Mar 25 01:32:21.798790 ntpd[1923]: 25 Mar 01:32:21 ntpd[1923]: Listen normally on 7 vxlan.calico 192.168.6.64:123 Mar 25 01:32:21.798790 ntpd[1923]: 25 Mar 01:32:21 ntpd[1923]: Listen normally on 8 cali3d6662d496c [fe80::ecee:eeff:feee:eeee%4]:123 Mar 25 01:32:21.798790 ntpd[1923]: 25 Mar 01:32:21 ntpd[1923]: Listen normally on 9 cali0f78a08235f [fe80::ecee:eeff:feee:eeee%5]:123 Mar 25 01:32:21.798790 ntpd[1923]: 25 Mar 01:32:21 ntpd[1923]: Listen normally on 10 vxlan.calico [fe80::64c9:3aff:fe48:a7c%6]:123 Mar 25 01:32:21.798790 ntpd[1923]: 25 Mar 01:32:21 ntpd[1923]: Listen normally on 11 cali8a9ec66a81a [fe80::ecee:eeff:feee:eeee%9]:123 Mar 25 01:32:21.798790 ntpd[1923]: 25 Mar 01:32:21 ntpd[1923]: Listen normally on 12 calie8f00bf9d76 [fe80::ecee:eeff:feee:eeee%10]:123 Mar 25 01:32:21.798285 ntpd[1923]: Listen normally on 8 cali3d6662d496c [fe80::ecee:eeff:feee:eeee%4]:123 Mar 25 01:32:21.799414 ntpd[1923]: 25 Mar 01:32:21 ntpd[1923]: Listen normally on 13 cali4d61677e8a9 [fe80::ecee:eeff:feee:eeee%11]:123 Mar 25 01:32:21.799414 ntpd[1923]: 25 Mar 01:32:21 ntpd[1923]: Listen normally on 14 calic8a46da42df [fe80::ecee:eeff:feee:eeee%12]:123 Mar 25 01:32:21.798382 ntpd[1923]: Listen normally on 9 cali0f78a08235f [fe80::ecee:eeff:feee:eeee%5]:123 Mar 25 01:32:21.798456 ntpd[1923]: Listen normally on 10 vxlan.calico [fe80::64c9:3aff:fe48:a7c%6]:123 Mar 25 01:32:21.798524 ntpd[1923]: Listen normally on 11 cali8a9ec66a81a [fe80::ecee:eeff:feee:eeee%9]:123 Mar 25 01:32:21.798785 ntpd[1923]: Listen normally on 12 calie8f00bf9d76 [fe80::ecee:eeff:feee:eeee%10]:123 Mar 25 01:32:21.798863 ntpd[1923]: Listen normally on 13 cali4d61677e8a9 [fe80::ecee:eeff:feee:eeee%11]:123 Mar 25 01:32:21.798929 
ntpd[1923]: Listen normally on 14 calic8a46da42df [fe80::ecee:eeff:feee:eeee%12]:123 Mar 25 01:32:22.039264 systemd[1]: Started sshd@9-172.31.28.242:22-147.75.109.163:56958.service - OpenSSH per-connection server daemon (147.75.109.163:56958). Mar 25 01:32:22.265347 sshd[5325]: Accepted publickey for core from 147.75.109.163 port 56958 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:32:22.268663 sshd-session[5325]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:22.279524 systemd-logind[1929]: New session 10 of user core. Mar 25 01:32:22.290510 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 25 01:32:22.578452 sshd[5327]: Connection closed by 147.75.109.163 port 56958 Mar 25 01:32:22.578164 sshd-session[5325]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:22.587833 systemd[1]: sshd@9-172.31.28.242:22-147.75.109.163:56958.service: Deactivated successfully. Mar 25 01:32:22.593889 systemd[1]: session-10.scope: Deactivated successfully. Mar 25 01:32:22.596994 systemd-logind[1929]: Session 10 logged out. Waiting for processes to exit. Mar 25 01:32:22.599706 systemd-logind[1929]: Removed session 10. Mar 25 01:32:23.015616 containerd[1946]: time="2025-03-25T01:32:23.015556027Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cd4ff51f8777fc5d3aff4f1881b5030d0c091d40c4e3df16389226a970da716d\" id:\"1f4312e09c5700113b91183264e24a82f128cba4784ccd59584229bc33bc9f61\" pid:5358 exited_at:{seconds:1742866343 nanos:15012806}" Mar 25 01:32:23.041826 kubelet[3424]: I0325 01:32:23.041724 3424 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-755766d89b-p6jm5" podStartSLOduration=28.840849151 podStartE2EDuration="35.041700608s" podCreationTimestamp="2025-03-25 01:31:48 +0000 UTC" firstStartedPulling="2025-03-25 01:32:14.443485317 +0000 UTC m=+45.339917809" lastFinishedPulling="2025-03-25 01:32:20.644336774 +0000 UTC m=+51.540769266" observedRunningTime="2025-03-25 01:32:21.985526296 +0000 UTC m=+52.881958788" watchObservedRunningTime="2025-03-25 01:32:23.041700608 +0000 UTC m=+53.938133100" Mar 25 01:32:24.698535 containerd[1946]: time="2025-03-25T01:32:24.696942322Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:24.700291 containerd[1946]: time="2025-03-25T01:32:24.700109302Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=40253267" Mar 25 01:32:24.703277 containerd[1946]: time="2025-03-25T01:32:24.702998932Z" level=info msg="ImageCreate event name:\"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:24.709945 containerd[1946]: time="2025-03-25T01:32:24.709856934Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:24.712356 containerd[1946]: time="2025-03-25T01:32:24.711864921Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest 
\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 4.065397113s" Mar 25 01:32:24.712356 containerd[1946]: time="2025-03-25T01:32:24.712015997Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 25 01:32:24.716662 containerd[1946]: time="2025-03-25T01:32:24.716498635Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 25 01:32:24.722056 containerd[1946]: time="2025-03-25T01:32:24.720974568Z" level=info msg="CreateContainer within sandbox \"a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 01:32:24.742949 containerd[1946]: time="2025-03-25T01:32:24.742883780Z" level=info msg="Container 744cb1aa2f633527cc4208c632ce2999f207c3b4bbd583895630ccb65bdafb09: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:32:24.763053 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3180570871.mount: Deactivated successfully. Mar 25 01:32:24.774394 containerd[1946]: time="2025-03-25T01:32:24.774165427Z" level=info msg="CreateContainer within sandbox \"a162a9cf2913c2acb31e2d44e7b7f63e0c824259c205911b22a7c20fa735432e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"744cb1aa2f633527cc4208c632ce2999f207c3b4bbd583895630ccb65bdafb09\"" Mar 25 01:32:24.777316 containerd[1946]: time="2025-03-25T01:32:24.775577277Z" level=info msg="StartContainer for \"744cb1aa2f633527cc4208c632ce2999f207c3b4bbd583895630ccb65bdafb09\"" Mar 25 01:32:24.778978 containerd[1946]: time="2025-03-25T01:32:24.778923087Z" level=info msg="connecting to shim 744cb1aa2f633527cc4208c632ce2999f207c3b4bbd583895630ccb65bdafb09" address="unix:///run/containerd/s/03afab8335b4196b861c676913fa5b5b79830eb84f2f8e5fd389de5e6f79e64d" protocol=ttrpc version=3 Mar 25 01:32:24.823536 systemd[1]: Started cri-containerd-744cb1aa2f633527cc4208c632ce2999f207c3b4bbd583895630ccb65bdafb09.scope - libcontainer container 744cb1aa2f633527cc4208c632ce2999f207c3b4bbd583895630ccb65bdafb09. 
Mar 25 01:32:24.917101 containerd[1946]: time="2025-03-25T01:32:24.917028528Z" level=info msg="StartContainer for \"744cb1aa2f633527cc4208c632ce2999f207c3b4bbd583895630ccb65bdafb09\" returns successfully" Mar 25 01:32:24.991677 kubelet[3424]: I0325 01:32:24.991141 3424 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-756bdfdf4c-trthd" podStartSLOduration=31.95284635 podStartE2EDuration="37.991118225s" podCreationTimestamp="2025-03-25 01:31:47 +0000 UTC" firstStartedPulling="2025-03-25 01:32:18.676386664 +0000 UTC m=+49.572819144" lastFinishedPulling="2025-03-25 01:32:24.714658456 +0000 UTC m=+55.611091019" observedRunningTime="2025-03-25 01:32:24.990773722 +0000 UTC m=+55.887206226" watchObservedRunningTime="2025-03-25 01:32:24.991118225 +0000 UTC m=+55.887550717" Mar 25 01:32:26.695214 containerd[1946]: time="2025-03-25T01:32:26.694905235Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:26.698047 containerd[1946]: time="2025-03-25T01:32:26.697447003Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13121717" Mar 25 01:32:26.700484 containerd[1946]: time="2025-03-25T01:32:26.700394373Z" level=info msg="ImageCreate event name:\"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:26.706636 containerd[1946]: time="2025-03-25T01:32:26.706530419Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:26.710240 containerd[1946]: time="2025-03-25T01:32:26.710092973Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"14491426\" in 1.993500389s" Mar 25 01:32:26.710240 containerd[1946]: time="2025-03-25T01:32:26.710161627Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\"" Mar 25 01:32:26.715952 containerd[1946]: time="2025-03-25T01:32:26.715602129Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 01:32:26.720102 containerd[1946]: time="2025-03-25T01:32:26.720023585Z" level=info msg="CreateContainer within sandbox \"9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 25 01:32:26.746562 containerd[1946]: time="2025-03-25T01:32:26.746479735Z" level=info msg="Container 96573c76a543a727427479ad017eae4c7f78d0e127490c87eb5a96e7093b4fc8: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:32:26.773399 containerd[1946]: time="2025-03-25T01:32:26.773156778Z" level=info msg="CreateContainer within sandbox \"9d0cd81d9ea16e477f6f26891e21aa1ea38ff2e9db06969387a4f6feaa89d466\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id 
\"96573c76a543a727427479ad017eae4c7f78d0e127490c87eb5a96e7093b4fc8\"" Mar 25 01:32:26.774815 containerd[1946]: time="2025-03-25T01:32:26.774587818Z" level=info msg="StartContainer for \"96573c76a543a727427479ad017eae4c7f78d0e127490c87eb5a96e7093b4fc8\"" Mar 25 01:32:26.779844 containerd[1946]: time="2025-03-25T01:32:26.779421999Z" level=info msg="connecting to shim 96573c76a543a727427479ad017eae4c7f78d0e127490c87eb5a96e7093b4fc8" address="unix:///run/containerd/s/4a3bd019acdd14b5e8d6746dd2d41f7ae1b5046bd318370b4c767c1bae421d93" protocol=ttrpc version=3 Mar 25 01:32:26.841991 systemd[1]: Started cri-containerd-96573c76a543a727427479ad017eae4c7f78d0e127490c87eb5a96e7093b4fc8.scope - libcontainer container 96573c76a543a727427479ad017eae4c7f78d0e127490c87eb5a96e7093b4fc8. Mar 25 01:32:26.986574 containerd[1946]: time="2025-03-25T01:32:26.986097583Z" level=info msg="StartContainer for \"96573c76a543a727427479ad017eae4c7f78d0e127490c87eb5a96e7093b4fc8\" returns successfully" Mar 25 01:32:27.057909 containerd[1946]: time="2025-03-25T01:32:27.057829163Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:27.061391 containerd[1946]: time="2025-03-25T01:32:27.061281948Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77" Mar 25 01:32:27.067225 containerd[1946]: time="2025-03-25T01:32:27.067100141Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 351.425928ms" Mar 25 01:32:27.067388 containerd[1946]: time="2025-03-25T01:32:27.067238756Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 25 01:32:27.076716 containerd[1946]: time="2025-03-25T01:32:27.076117602Z" level=info msg="CreateContainer within sandbox \"8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 01:32:27.097658 containerd[1946]: time="2025-03-25T01:32:27.097545915Z" level=info msg="Container f1693a42601d6dd8b1a1f85c2d43a3b9d7ae9d33c56c5ae066f132ba9ff0724c: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:32:27.120549 containerd[1946]: time="2025-03-25T01:32:27.120454059Z" level=info msg="CreateContainer within sandbox \"8552c6f9614aa52b370f8e68f5ccea9ba01096f7150f74d61933d30d1180c9b8\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"f1693a42601d6dd8b1a1f85c2d43a3b9d7ae9d33c56c5ae066f132ba9ff0724c\"" Mar 25 01:32:27.122197 containerd[1946]: time="2025-03-25T01:32:27.122113681Z" level=info msg="StartContainer for \"f1693a42601d6dd8b1a1f85c2d43a3b9d7ae9d33c56c5ae066f132ba9ff0724c\"" Mar 25 01:32:27.128712 containerd[1946]: time="2025-03-25T01:32:27.128570315Z" level=info msg="connecting to shim f1693a42601d6dd8b1a1f85c2d43a3b9d7ae9d33c56c5ae066f132ba9ff0724c" address="unix:///run/containerd/s/16893a5b9a82fe894cdceeecc6c5a4aa8d4845c605ee8254b43205dbab271f64" protocol=ttrpc version=3 Mar 25 01:32:27.200386 systemd[1]: Started cri-containerd-f1693a42601d6dd8b1a1f85c2d43a3b9d7ae9d33c56c5ae066f132ba9ff0724c.scope - 
libcontainer container f1693a42601d6dd8b1a1f85c2d43a3b9d7ae9d33c56c5ae066f132ba9ff0724c. Mar 25 01:32:27.410657 containerd[1946]: time="2025-03-25T01:32:27.409241118Z" level=info msg="StartContainer for \"f1693a42601d6dd8b1a1f85c2d43a3b9d7ae9d33c56c5ae066f132ba9ff0724c\" returns successfully" Mar 25 01:32:27.611171 kubelet[3424]: I0325 01:32:27.610576 3424 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 25 01:32:27.611171 kubelet[3424]: I0325 01:32:27.610658 3424 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 25 01:32:27.620156 systemd[1]: Started sshd@10-172.31.28.242:22-147.75.109.163:56960.service - OpenSSH per-connection server daemon (147.75.109.163:56960). Mar 25 01:32:27.863842 sshd[5489]: Accepted publickey for core from 147.75.109.163 port 56960 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:32:27.869959 sshd-session[5489]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:27.883535 systemd-logind[1929]: New session 11 of user core. Mar 25 01:32:27.891533 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 25 01:32:28.077535 kubelet[3424]: I0325 01:32:28.075994 3424 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-wsw9h" podStartSLOduration=27.77030166 podStartE2EDuration="40.075953937s" podCreationTimestamp="2025-03-25 01:31:48 +0000 UTC" firstStartedPulling="2025-03-25 01:32:14.407474363 +0000 UTC m=+45.303906855" lastFinishedPulling="2025-03-25 01:32:26.713126652 +0000 UTC m=+57.609559132" observedRunningTime="2025-03-25 01:32:28.028081529 +0000 UTC m=+58.924514033" watchObservedRunningTime="2025-03-25 01:32:28.075953937 +0000 UTC m=+58.972386441" Mar 25 01:32:28.253561 sshd[5491]: Connection closed by 147.75.109.163 port 56960 Mar 25 01:32:28.252374 sshd-session[5489]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:28.262451 systemd[1]: sshd@10-172.31.28.242:22-147.75.109.163:56960.service: Deactivated successfully. Mar 25 01:32:28.270898 systemd[1]: session-11.scope: Deactivated successfully. Mar 25 01:32:28.278573 systemd-logind[1929]: Session 11 logged out. Waiting for processes to exit. Mar 25 01:32:28.300559 systemd[1]: Started sshd@11-172.31.28.242:22-147.75.109.163:56970.service - OpenSSH per-connection server daemon (147.75.109.163:56970). Mar 25 01:32:28.303547 systemd-logind[1929]: Removed session 11. Mar 25 01:32:28.525648 sshd[5505]: Accepted publickey for core from 147.75.109.163 port 56970 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:32:28.529114 sshd-session[5505]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:28.546289 systemd-logind[1929]: New session 12 of user core. Mar 25 01:32:28.549530 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 25 01:32:29.003026 kubelet[3424]: I0325 01:32:29.002508 3424 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:32:29.044771 sshd[5508]: Connection closed by 147.75.109.163 port 56970 Mar 25 01:32:29.048415 sshd-session[5505]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:29.058085 systemd[1]: sshd@11-172.31.28.242:22-147.75.109.163:56970.service: Deactivated successfully. 
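Several entries above record completed image pulls ("Pulled image ... with image id ... size ... in <duration>"), for example 41623040 bytes in 4.065397113s for the calico/apiserver image. A hypothetical extraction sketch for those lines (the regex is mine; it normalises the journal's escaped quotes first):

```python
import re

PULLED_RE = re.compile(
    r'Pulled image "(?P<ref>[^"]+)" with image id "(?P<id>[^"]+)".*'
    r'size "(?P<size>\d+)" in (?P<dur>[\d.]+(?:ms|s))'
)

def pulled_images(journal_lines):
    """Yield (image_ref, reported_size_bytes, pull_duration) from 'Pulled image' entries."""
    for line in journal_lines:
        # journald shows the inner quotes escaped as \" inside msg="..."
        for m in PULLED_RE.finditer(line.replace('\\"', '"')):
            yield m.group("ref"), int(m.group("size")), m.group("dur")
```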
Mar 25 01:32:29.064530 systemd[1]: session-12.scope: Deactivated successfully. Mar 25 01:32:29.070067 systemd-logind[1929]: Session 12 logged out. Waiting for processes to exit. Mar 25 01:32:29.092233 systemd[1]: Started sshd@12-172.31.28.242:22-147.75.109.163:56980.service - OpenSSH per-connection server daemon (147.75.109.163:56980). Mar 25 01:32:29.094375 systemd-logind[1929]: Removed session 12. Mar 25 01:32:29.314963 sshd[5520]: Accepted publickey for core from 147.75.109.163 port 56980 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:32:29.319109 sshd-session[5520]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:29.328497 systemd-logind[1929]: New session 13 of user core. Mar 25 01:32:29.339504 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 25 01:32:29.654711 sshd[5523]: Connection closed by 147.75.109.163 port 56980 Mar 25 01:32:29.655842 sshd-session[5520]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:29.664780 systemd[1]: sshd@12-172.31.28.242:22-147.75.109.163:56980.service: Deactivated successfully. Mar 25 01:32:29.673905 systemd[1]: session-13.scope: Deactivated successfully. Mar 25 01:32:29.680394 systemd-logind[1929]: Session 13 logged out. Waiting for processes to exit. Mar 25 01:32:29.684157 systemd-logind[1929]: Removed session 13. Mar 25 01:32:34.688707 systemd[1]: Started sshd@13-172.31.28.242:22-147.75.109.163:57542.service - OpenSSH per-connection server daemon (147.75.109.163:57542). Mar 25 01:32:34.901578 sshd[5543]: Accepted publickey for core from 147.75.109.163 port 57542 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:32:34.905381 sshd-session[5543]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:34.915519 systemd-logind[1929]: New session 14 of user core. Mar 25 01:32:34.924455 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 25 01:32:35.195931 sshd[5551]: Connection closed by 147.75.109.163 port 57542 Mar 25 01:32:35.196915 sshd-session[5543]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:35.204589 systemd[1]: sshd@13-172.31.28.242:22-147.75.109.163:57542.service: Deactivated successfully. Mar 25 01:32:35.209004 systemd[1]: session-14.scope: Deactivated successfully. Mar 25 01:32:35.211374 systemd-logind[1929]: Session 14 logged out. Waiting for processes to exit. Mar 25 01:32:35.214211 systemd-logind[1929]: Removed session 14. Mar 25 01:32:38.057747 kubelet[3424]: I0325 01:32:38.057083 3424 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:32:38.095216 kubelet[3424]: I0325 01:32:38.094351 3424 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-756bdfdf4c-tf9gh" podStartSLOduration=42.852290686 podStartE2EDuration="51.092849813s" podCreationTimestamp="2025-03-25 01:31:47 +0000 UTC" firstStartedPulling="2025-03-25 01:32:18.827688 +0000 UTC m=+49.724120504" lastFinishedPulling="2025-03-25 01:32:27.06824714 +0000 UTC m=+57.964679631" observedRunningTime="2025-03-25 01:32:28.08187661 +0000 UTC m=+58.978309126" watchObservedRunningTime="2025-03-25 01:32:38.092849813 +0000 UTC m=+68.989282317" Mar 25 01:32:40.238498 systemd[1]: Started sshd@14-172.31.28.242:22-147.75.109.163:38522.service - OpenSSH per-connection server daemon (147.75.109.163:38522). 
Mar 25 01:32:40.435522 sshd[5572]: Accepted publickey for core from 147.75.109.163 port 38522 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:32:40.438150 sshd-session[5572]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:40.446734 systemd-logind[1929]: New session 15 of user core. Mar 25 01:32:40.455478 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 25 01:32:40.721236 sshd[5574]: Connection closed by 147.75.109.163 port 38522 Mar 25 01:32:40.722108 sshd-session[5572]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:40.729069 systemd[1]: sshd@14-172.31.28.242:22-147.75.109.163:38522.service: Deactivated successfully. Mar 25 01:32:40.734753 systemd[1]: session-15.scope: Deactivated successfully. Mar 25 01:32:40.736556 systemd-logind[1929]: Session 15 logged out. Waiting for processes to exit. Mar 25 01:32:40.738799 systemd-logind[1929]: Removed session 15. Mar 25 01:32:42.881547 containerd[1946]: time="2025-03-25T01:32:42.881462280Z" level=info msg="TaskExit event in podsandbox handler container_id:\"052a4d3fa764d66533a2a935fb4ec4f8a8b72b4b3d81c172bdd2540d630d54e0\" id:\"62db5591402cf29129225d467deee1d2bbcb96faa5c5d05580811ceed6209766\" pid:5596 exited_at:{seconds:1742866362 nanos:880983072}" Mar 25 01:32:45.755843 systemd[1]: Started sshd@15-172.31.28.242:22-147.75.109.163:38536.service - OpenSSH per-connection server daemon (147.75.109.163:38536). Mar 25 01:32:45.965723 sshd[5613]: Accepted publickey for core from 147.75.109.163 port 38536 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:32:45.970526 sshd-session[5613]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:45.980377 systemd-logind[1929]: New session 16 of user core. Mar 25 01:32:45.987647 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 25 01:32:46.323248 sshd[5615]: Connection closed by 147.75.109.163 port 38536 Mar 25 01:32:46.324493 sshd-session[5613]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:46.333973 systemd[1]: sshd@15-172.31.28.242:22-147.75.109.163:38536.service: Deactivated successfully. Mar 25 01:32:46.342418 systemd[1]: session-16.scope: Deactivated successfully. Mar 25 01:32:46.348519 systemd-logind[1929]: Session 16 logged out. Waiting for processes to exit. Mar 25 01:32:46.355835 systemd-logind[1929]: Removed session 16. Mar 25 01:32:51.366663 systemd[1]: Started sshd@16-172.31.28.242:22-147.75.109.163:47066.service - OpenSSH per-connection server daemon (147.75.109.163:47066). Mar 25 01:32:51.575044 sshd[5628]: Accepted publickey for core from 147.75.109.163 port 47066 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:32:51.578643 sshd-session[5628]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:51.590458 systemd-logind[1929]: New session 17 of user core. Mar 25 01:32:51.602550 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 25 01:32:51.895428 sshd[5630]: Connection closed by 147.75.109.163 port 47066 Mar 25 01:32:51.895968 sshd-session[5628]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:51.909783 systemd[1]: sshd@16-172.31.28.242:22-147.75.109.163:47066.service: Deactivated successfully. Mar 25 01:32:51.918243 systemd[1]: session-17.scope: Deactivated successfully. Mar 25 01:32:51.926380 systemd-logind[1929]: Session 17 logged out. Waiting for processes to exit. 
Mar 25 01:32:51.949083 systemd[1]: Started sshd@17-172.31.28.242:22-147.75.109.163:47082.service - OpenSSH per-connection server daemon (147.75.109.163:47082). Mar 25 01:32:51.951047 systemd-logind[1929]: Removed session 17. Mar 25 01:32:52.161672 sshd[5641]: Accepted publickey for core from 147.75.109.163 port 47082 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:32:52.164609 sshd-session[5641]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:52.174689 systemd-logind[1929]: New session 18 of user core. Mar 25 01:32:52.183440 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 25 01:32:52.758443 sshd[5644]: Connection closed by 147.75.109.163 port 47082 Mar 25 01:32:52.759908 sshd-session[5641]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:52.768840 systemd[1]: sshd@17-172.31.28.242:22-147.75.109.163:47082.service: Deactivated successfully. Mar 25 01:32:52.773638 systemd[1]: session-18.scope: Deactivated successfully. Mar 25 01:32:52.775405 systemd-logind[1929]: Session 18 logged out. Waiting for processes to exit. Mar 25 01:32:52.779667 systemd-logind[1929]: Removed session 18. Mar 25 01:32:52.797629 systemd[1]: Started sshd@18-172.31.28.242:22-147.75.109.163:47088.service - OpenSSH per-connection server daemon (147.75.109.163:47088). Mar 25 01:32:53.026056 sshd[5654]: Accepted publickey for core from 147.75.109.163 port 47088 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:32:53.028914 containerd[1946]: time="2025-03-25T01:32:53.028602862Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cd4ff51f8777fc5d3aff4f1881b5030d0c091d40c4e3df16389226a970da716d\" id:\"0d8b4f1abca8272f566579e148e18b5aa5b86538e5c5c4e810d71fea75d9cdbd\" pid:5668 exited_at:{seconds:1742866373 nanos:27563042}" Mar 25 01:32:53.031300 sshd-session[5654]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:53.045928 systemd-logind[1929]: New session 19 of user core. Mar 25 01:32:53.051640 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 25 01:32:54.632601 sshd[5677]: Connection closed by 147.75.109.163 port 47088 Mar 25 01:32:54.633059 sshd-session[5654]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:54.645170 systemd[1]: sshd@18-172.31.28.242:22-147.75.109.163:47088.service: Deactivated successfully. Mar 25 01:32:54.655370 systemd[1]: session-19.scope: Deactivated successfully. Mar 25 01:32:54.666065 systemd-logind[1929]: Session 19 logged out. Waiting for processes to exit. Mar 25 01:32:54.696462 systemd[1]: Started sshd@19-172.31.28.242:22-147.75.109.163:47090.service - OpenSSH per-connection server daemon (147.75.109.163:47090). Mar 25 01:32:54.698363 systemd-logind[1929]: Removed session 19. Mar 25 01:32:54.924693 sshd[5694]: Accepted publickey for core from 147.75.109.163 port 47090 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:32:54.927696 sshd-session[5694]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:54.941709 systemd-logind[1929]: New session 20 of user core. Mar 25 01:32:54.948557 systemd[1]: Started session-20.scope - Session 20 of User core. 
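The SSH activity above follows a fixed cycle: a per-connection sshd@ unit starts, a publickey is accepted, systemd-logind opens "New session N of user core", and later the connection closes and the session is removed. A small sketch pairing those open/remove lines to time each session; the timestamp handling assumes this journal's "Mar 25 HH:MM:SS.ffffff" format and a known year, and the regexes are mine.

```python
import re
from datetime import datetime

STAMP = r'(?P<ts>[A-Z][a-z]{2} \d{2} \d{2}:\d{2}:\d{2}\.\d{6})'
NEW   = re.compile(STAMP + r' systemd-logind\[\d+\]: New session (?P<id>\d+) of user core')
GONE  = re.compile(STAMP + r' systemd-logind\[\d+\]: Removed session (?P<id>\d+)\.')

def parse(ts: str, year: int = 2025) -> datetime:
    return datetime.strptime(f"{year} {ts}", "%Y %b %d %H:%M:%S.%f")

def session_durations(text: str):
    """Pair 'New session N' with 'Removed session N' and report wall-clock seconds."""
    opened = {m.group("id"): parse(m.group("ts")) for m in NEW.finditer(text)}
    for m in GONE.finditer(text):
        sid = m.group("id")
        if sid in opened:
            yield sid, (parse(m.group("ts")) - opened[sid]).total_seconds()
```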
Mar 25 01:32:55.594117 sshd[5702]: Connection closed by 147.75.109.163 port 47090 Mar 25 01:32:55.595583 sshd-session[5694]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:55.608240 systemd[1]: sshd@19-172.31.28.242:22-147.75.109.163:47090.service: Deactivated successfully. Mar 25 01:32:55.617461 systemd[1]: session-20.scope: Deactivated successfully. Mar 25 01:32:55.622746 systemd-logind[1929]: Session 20 logged out. Waiting for processes to exit. Mar 25 01:32:55.643638 systemd[1]: Started sshd@20-172.31.28.242:22-147.75.109.163:47098.service - OpenSSH per-connection server daemon (147.75.109.163:47098). Mar 25 01:32:55.645684 systemd-logind[1929]: Removed session 20. Mar 25 01:32:55.813160 containerd[1946]: time="2025-03-25T01:32:55.813097286Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cd4ff51f8777fc5d3aff4f1881b5030d0c091d40c4e3df16389226a970da716d\" id:\"157d82f791e6099694471883b57a472808fba0320fab43ebfdb2aff27c6c8bd0\" pid:5727 exited_at:{seconds:1742866375 nanos:812481502}" Mar 25 01:32:55.854159 sshd[5711]: Accepted publickey for core from 147.75.109.163 port 47098 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:32:55.860469 sshd-session[5711]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:55.873282 systemd-logind[1929]: New session 21 of user core. Mar 25 01:32:55.885267 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 25 01:32:56.186902 sshd[5736]: Connection closed by 147.75.109.163 port 47098 Mar 25 01:32:56.188417 sshd-session[5711]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:56.198678 systemd[1]: sshd@20-172.31.28.242:22-147.75.109.163:47098.service: Deactivated successfully. Mar 25 01:32:56.205107 systemd[1]: session-21.scope: Deactivated successfully. Mar 25 01:32:56.207351 systemd-logind[1929]: Session 21 logged out. Waiting for processes to exit. Mar 25 01:32:56.209852 systemd-logind[1929]: Removed session 21. Mar 25 01:33:01.229779 systemd[1]: Started sshd@21-172.31.28.242:22-147.75.109.163:47070.service - OpenSSH per-connection server daemon (147.75.109.163:47070). Mar 25 01:33:01.452336 sshd[5750]: Accepted publickey for core from 147.75.109.163 port 47070 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:33:01.456050 sshd-session[5750]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:01.468446 systemd-logind[1929]: New session 22 of user core. Mar 25 01:33:01.480207 systemd[1]: Started session-22.scope - Session 22 of User core. Mar 25 01:33:01.737145 sshd[5752]: Connection closed by 147.75.109.163 port 47070 Mar 25 01:33:01.738475 sshd-session[5750]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:01.745622 systemd[1]: sshd@21-172.31.28.242:22-147.75.109.163:47070.service: Deactivated successfully. Mar 25 01:33:01.752059 systemd[1]: session-22.scope: Deactivated successfully. Mar 25 01:33:01.754306 systemd-logind[1929]: Session 22 logged out. Waiting for processes to exit. Mar 25 01:33:01.757847 systemd-logind[1929]: Removed session 22. Mar 25 01:33:06.774081 systemd[1]: Started sshd@22-172.31.28.242:22-147.75.109.163:47078.service - OpenSSH per-connection server daemon (147.75.109.163:47078). 
Mar 25 01:33:06.978870 sshd[5769]: Accepted publickey for core from 147.75.109.163 port 47078 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:33:06.981644 sshd-session[5769]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:06.990765 systemd-logind[1929]: New session 23 of user core. Mar 25 01:33:06.997568 systemd[1]: Started session-23.scope - Session 23 of User core. Mar 25 01:33:07.261981 sshd[5771]: Connection closed by 147.75.109.163 port 47078 Mar 25 01:33:07.263244 sshd-session[5769]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:07.271676 systemd[1]: sshd@22-172.31.28.242:22-147.75.109.163:47078.service: Deactivated successfully. Mar 25 01:33:07.277751 systemd[1]: session-23.scope: Deactivated successfully. Mar 25 01:33:07.281066 systemd-logind[1929]: Session 23 logged out. Waiting for processes to exit. Mar 25 01:33:07.284488 systemd-logind[1929]: Removed session 23. Mar 25 01:33:12.298319 systemd[1]: Started sshd@23-172.31.28.242:22-147.75.109.163:54614.service - OpenSSH per-connection server daemon (147.75.109.163:54614). Mar 25 01:33:12.504403 sshd[5783]: Accepted publickey for core from 147.75.109.163 port 54614 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:33:12.507801 sshd-session[5783]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:12.520295 systemd-logind[1929]: New session 24 of user core. Mar 25 01:33:12.530600 systemd[1]: Started session-24.scope - Session 24 of User core. Mar 25 01:33:12.813612 sshd[5785]: Connection closed by 147.75.109.163 port 54614 Mar 25 01:33:12.815135 sshd-session[5783]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:12.828308 systemd[1]: sshd@23-172.31.28.242:22-147.75.109.163:54614.service: Deactivated successfully. Mar 25 01:33:12.839976 systemd[1]: session-24.scope: Deactivated successfully. Mar 25 01:33:12.843227 systemd-logind[1929]: Session 24 logged out. Waiting for processes to exit. Mar 25 01:33:12.847579 systemd-logind[1929]: Removed session 24. Mar 25 01:33:12.884983 containerd[1946]: time="2025-03-25T01:33:12.884532205Z" level=info msg="TaskExit event in podsandbox handler container_id:\"052a4d3fa764d66533a2a935fb4ec4f8a8b72b4b3d81c172bdd2540d630d54e0\" id:\"1834927ec3a61320f13ef8ff0ee7e409bc25fc6ac2f12de550a840bd0b9c4275\" pid:5806 exited_at:{seconds:1742866392 nanos:883639648}" Mar 25 01:33:17.856837 systemd[1]: Started sshd@24-172.31.28.242:22-147.75.109.163:54626.service - OpenSSH per-connection server daemon (147.75.109.163:54626). Mar 25 01:33:18.080624 sshd[5823]: Accepted publickey for core from 147.75.109.163 port 54626 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:33:18.084546 sshd-session[5823]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:18.093634 systemd-logind[1929]: New session 25 of user core. Mar 25 01:33:18.099514 systemd[1]: Started session-25.scope - Session 25 of User core. Mar 25 01:33:18.383973 sshd[5825]: Connection closed by 147.75.109.163 port 54626 Mar 25 01:33:18.387745 sshd-session[5823]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:18.398598 systemd[1]: sshd@24-172.31.28.242:22-147.75.109.163:54626.service: Deactivated successfully. Mar 25 01:33:18.404660 systemd[1]: session-25.scope: Deactivated successfully. Mar 25 01:33:18.409801 systemd-logind[1929]: Session 25 logged out. Waiting for processes to exit. 
Mar 25 01:33:18.411776 systemd-logind[1929]: Removed session 25. Mar 25 01:33:23.005100 containerd[1946]: time="2025-03-25T01:33:23.005009801Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cd4ff51f8777fc5d3aff4f1881b5030d0c091d40c4e3df16389226a970da716d\" id:\"5b3c502549e859eaa0c4bc30908107eeb1ddf39deb24ffa11e85d972fa549bde\" pid:5850 exited_at:{seconds:1742866403 nanos:4165412}" Mar 25 01:33:23.423219 systemd[1]: Started sshd@25-172.31.28.242:22-147.75.109.163:44976.service - OpenSSH per-connection server daemon (147.75.109.163:44976). Mar 25 01:33:23.630230 sshd[5860]: Accepted publickey for core from 147.75.109.163 port 44976 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:33:23.633812 sshd-session[5860]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:23.643488 systemd-logind[1929]: New session 26 of user core. Mar 25 01:33:23.653663 systemd[1]: Started session-26.scope - Session 26 of User core. Mar 25 01:33:23.910034 sshd[5862]: Connection closed by 147.75.109.163 port 44976 Mar 25 01:33:23.909312 sshd-session[5860]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:23.915529 systemd[1]: sshd@25-172.31.28.242:22-147.75.109.163:44976.service: Deactivated successfully. Mar 25 01:33:23.920434 systemd[1]: session-26.scope: Deactivated successfully. Mar 25 01:33:23.925946 systemd-logind[1929]: Session 26 logged out. Waiting for processes to exit. Mar 25 01:33:23.929026 systemd-logind[1929]: Removed session 26. Mar 25 01:33:28.960702 systemd[1]: Started sshd@26-172.31.28.242:22-147.75.109.163:44978.service - OpenSSH per-connection server daemon (147.75.109.163:44978). Mar 25 01:33:29.169324 sshd[5873]: Accepted publickey for core from 147.75.109.163 port 44978 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:33:29.171123 sshd-session[5873]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:29.179533 systemd-logind[1929]: New session 27 of user core. Mar 25 01:33:29.192472 systemd[1]: Started session-27.scope - Session 27 of User core. Mar 25 01:33:29.458781 sshd[5875]: Connection closed by 147.75.109.163 port 44978 Mar 25 01:33:29.458650 sshd-session[5873]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:29.464913 systemd[1]: sshd@26-172.31.28.242:22-147.75.109.163:44978.service: Deactivated successfully. Mar 25 01:33:29.470071 systemd[1]: session-27.scope: Deactivated successfully. Mar 25 01:33:29.473868 systemd-logind[1929]: Session 27 logged out. Waiting for processes to exit. Mar 25 01:33:29.477332 systemd-logind[1929]: Removed session 27. Mar 25 01:33:34.498241 systemd[1]: Started sshd@27-172.31.28.242:22-147.75.109.163:53956.service - OpenSSH per-connection server daemon (147.75.109.163:53956). Mar 25 01:33:34.705127 sshd[5889]: Accepted publickey for core from 147.75.109.163 port 53956 ssh2: RSA SHA256:iJgztXIsPt+S3SMjthrbarr+NaYlP7obJF21wQZuFyg Mar 25 01:33:34.707854 sshd-session[5889]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:34.716576 systemd-logind[1929]: New session 28 of user core. Mar 25 01:33:34.725154 systemd[1]: Started session-28.scope - Session 28 of User core. Mar 25 01:33:34.985066 sshd[5896]: Connection closed by 147.75.109.163 port 53956 Mar 25 01:33:34.984300 sshd-session[5889]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:34.989890 systemd-logind[1929]: Session 28 logged out. Waiting for processes to exit. 
Mar 25 01:33:34.990888 systemd[1]: sshd@27-172.31.28.242:22-147.75.109.163:53956.service: Deactivated successfully. Mar 25 01:33:34.996932 systemd[1]: session-28.scope: Deactivated successfully. Mar 25 01:33:35.002445 systemd-logind[1929]: Removed session 28. Mar 25 01:33:42.846835 containerd[1946]: time="2025-03-25T01:33:42.846752846Z" level=info msg="TaskExit event in podsandbox handler container_id:\"052a4d3fa764d66533a2a935fb4ec4f8a8b72b4b3d81c172bdd2540d630d54e0\" id:\"c3b1c03e0c9e36b393e30a7791e4a8df6b337673c785266f33ed33d36209dbe5\" pid:5924 exited_at:{seconds:1742866422 nanos:846106561}" Mar 25 01:33:48.473582 systemd[1]: cri-containerd-d039edf158dc7fbecba0cce38998b7a039642a6c3a70b1e8ad9723aab739e836.scope: Deactivated successfully. Mar 25 01:33:48.475298 systemd[1]: cri-containerd-d039edf158dc7fbecba0cce38998b7a039642a6c3a70b1e8ad9723aab739e836.scope: Consumed 10.516s CPU time, 44.5M memory peak, 328K read from disk. Mar 25 01:33:48.479073 containerd[1946]: time="2025-03-25T01:33:48.479005666Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d039edf158dc7fbecba0cce38998b7a039642a6c3a70b1e8ad9723aab739e836\" id:\"d039edf158dc7fbecba0cce38998b7a039642a6c3a70b1e8ad9723aab739e836\" pid:3775 exit_status:1 exited_at:{seconds:1742866428 nanos:477927289}" Mar 25 01:33:48.480097 containerd[1946]: time="2025-03-25T01:33:48.479321968Z" level=info msg="received exit event container_id:\"d039edf158dc7fbecba0cce38998b7a039642a6c3a70b1e8ad9723aab739e836\" id:\"d039edf158dc7fbecba0cce38998b7a039642a6c3a70b1e8ad9723aab739e836\" pid:3775 exit_status:1 exited_at:{seconds:1742866428 nanos:477927289}" Mar 25 01:33:48.523640 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d039edf158dc7fbecba0cce38998b7a039642a6c3a70b1e8ad9723aab739e836-rootfs.mount: Deactivated successfully. Mar 25 01:33:48.865741 systemd[1]: cri-containerd-01a45c8da9f4d80c579622db3220c45f0ed725e47ad4baefd2debddaed56d49f.scope: Deactivated successfully. Mar 25 01:33:48.866353 systemd[1]: cri-containerd-01a45c8da9f4d80c579622db3220c45f0ed725e47ad4baefd2debddaed56d49f.scope: Consumed 6.588s CPU time, 60.2M memory peak, 152K read from disk. Mar 25 01:33:48.873383 containerd[1946]: time="2025-03-25T01:33:48.872922240Z" level=info msg="received exit event container_id:\"01a45c8da9f4d80c579622db3220c45f0ed725e47ad4baefd2debddaed56d49f\" id:\"01a45c8da9f4d80c579622db3220c45f0ed725e47ad4baefd2debddaed56d49f\" pid:3056 exit_status:1 exited_at:{seconds:1742866428 nanos:871301838}" Mar 25 01:33:48.873383 containerd[1946]: time="2025-03-25T01:33:48.873334568Z" level=info msg="TaskExit event in podsandbox handler container_id:\"01a45c8da9f4d80c579622db3220c45f0ed725e47ad4baefd2debddaed56d49f\" id:\"01a45c8da9f4d80c579622db3220c45f0ed725e47ad4baefd2debddaed56d49f\" pid:3056 exit_status:1 exited_at:{seconds:1742866428 nanos:871301838}" Mar 25 01:33:48.915528 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-01a45c8da9f4d80c579622db3220c45f0ed725e47ad4baefd2debddaed56d49f-rootfs.mount: Deactivated successfully. 
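The TaskExit events above carry exit timestamps as raw epoch seconds and nanoseconds (exited_at:{seconds:1742866428 ...}); converting them back to UTC lines them up with the surrounding journal stamps. A quick sketch:

```python
from datetime import datetime, timezone

def exited_at(seconds: int, nanos: int) -> datetime:
    """Convert a TaskExit exited_at {seconds, nanos} pair to a UTC timestamp."""
    return datetime.fromtimestamp(seconds, tz=timezone.utc).replace(microsecond=nanos // 1000)

# Exit events from the two containers that just stopped above:
print(exited_at(1742866428, 477927289))   # 2025-03-25 01:33:48.477927+00:00
print(exited_at(1742866428, 871301838))   # 2025-03-25 01:33:48.871301+00:00
# Both line up with the 01:33:48 journal entries for those TaskExit events.
```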
Mar 25 01:33:49.286283 kubelet[3424]: I0325 01:33:49.284509 3424 scope.go:117] "RemoveContainer" containerID="01a45c8da9f4d80c579622db3220c45f0ed725e47ad4baefd2debddaed56d49f"
Mar 25 01:33:49.289268 containerd[1946]: time="2025-03-25T01:33:49.288799353Z" level=info msg="CreateContainer within sandbox \"c580487c457f584f7598a644af0116ee4c4fe528896a445c55ccdf470ef21e42\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Mar 25 01:33:49.292161 kubelet[3424]: I0325 01:33:49.292106 3424 scope.go:117] "RemoveContainer" containerID="d039edf158dc7fbecba0cce38998b7a039642a6c3a70b1e8ad9723aab739e836"
Mar 25 01:33:49.296902 containerd[1946]: time="2025-03-25T01:33:49.296666793Z" level=info msg="CreateContainer within sandbox \"1dcd41ee3d2bfec0727b398251bf7ab9dbcbab7c93cfd2caba0d85c1fcf0f40e\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Mar 25 01:33:49.311040 containerd[1946]: time="2025-03-25T01:33:49.308999109Z" level=info msg="Container 0feab55c0884a6c0aeac41db2d74360944490e72c95c03f364db3ec44b687873: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:33:49.332213 containerd[1946]: time="2025-03-25T01:33:49.329800221Z" level=info msg="CreateContainer within sandbox \"c580487c457f584f7598a644af0116ee4c4fe528896a445c55ccdf470ef21e42\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"0feab55c0884a6c0aeac41db2d74360944490e72c95c03f364db3ec44b687873\""
Mar 25 01:33:49.333266 containerd[1946]: time="2025-03-25T01:33:49.333128301Z" level=info msg="StartContainer for \"0feab55c0884a6c0aeac41db2d74360944490e72c95c03f364db3ec44b687873\""
Mar 25 01:33:49.338227 containerd[1946]: time="2025-03-25T01:33:49.336653565Z" level=info msg="Container 14c8eb3683a48cab94775ed50c318f1ce4923271c7f806551147673de23bad30: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:33:49.343942 containerd[1946]: time="2025-03-25T01:33:49.343722045Z" level=info msg="connecting to shim 0feab55c0884a6c0aeac41db2d74360944490e72c95c03f364db3ec44b687873" address="unix:///run/containerd/s/7e8543f5cd1a832894a3cf6101ca27943e42a26b95afd5f1f1342b99b0a35958" protocol=ttrpc version=3
Mar 25 01:33:49.363121 containerd[1946]: time="2025-03-25T01:33:49.363063082Z" level=info msg="CreateContainer within sandbox \"1dcd41ee3d2bfec0727b398251bf7ab9dbcbab7c93cfd2caba0d85c1fcf0f40e\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"14c8eb3683a48cab94775ed50c318f1ce4923271c7f806551147673de23bad30\""
Mar 25 01:33:49.364490 containerd[1946]: time="2025-03-25T01:33:49.364421098Z" level=info msg="StartContainer for \"14c8eb3683a48cab94775ed50c318f1ce4923271c7f806551147673de23bad30\""
Mar 25 01:33:49.366674 containerd[1946]: time="2025-03-25T01:33:49.366618922Z" level=info msg="connecting to shim 14c8eb3683a48cab94775ed50c318f1ce4923271c7f806551147673de23bad30" address="unix:///run/containerd/s/371c5779e91cc7e5da3c74d3ebff377a411752d38d835888bc4a90272e109609" protocol=ttrpc version=3
Mar 25 01:33:49.395973 systemd[1]: Started cri-containerd-0feab55c0884a6c0aeac41db2d74360944490e72c95c03f364db3ec44b687873.scope - libcontainer container 0feab55c0884a6c0aeac41db2d74360944490e72c95c03f364db3ec44b687873.
Mar 25 01:33:49.418804 systemd[1]: Started cri-containerd-14c8eb3683a48cab94775ed50c318f1ce4923271c7f806551147673de23bad30.scope - libcontainer container 14c8eb3683a48cab94775ed50c318f1ce4923271c7f806551147673de23bad30.
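[Editor's note] The sequence above is the kubelet's container-level restart path over CRI: the failed kube-controller-manager and tigera-operator containers are removed, recreated inside their existing pod sandboxes with the metadata Attempt counter bumped to 1, and containerd's runtime v2 shim is reached over a ttrpc socket before each StartContainer. A rough sketch of the equivalent CreateContainer/StartContainer calls against the CRI runtime service, using the k8s.io/cri-api generated client, is below; the socket path matches containerd's common default, while the sandbox ID and image tag are placeholders, not values from this log.

    package main

    import (
    	"context"
    	"log"
    	"time"

    	"google.golang.org/grpc"
    	"google.golang.org/grpc/credentials/insecure"
    	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
    	// containerd's CRI endpoint (common default location).
    	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
    		grpc.WithTransportCredentials(insecure.NewCredentials()))
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer conn.Close()
    	rt := runtimeapi.NewRuntimeServiceClient(conn)

    	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    	defer cancel()

    	// Recreate a container inside an existing sandbox; Attempt is incremented
    	// on each restart, which is what "&ContainerMetadata{...,Attempt:1,}" records.
    	createResp, err := rt.CreateContainer(ctx, &runtimeapi.CreateContainerRequest{
    		PodSandboxId: "<pod-sandbox-id>", // placeholder
    		Config: &runtimeapi.ContainerConfig{
    			Metadata: &runtimeapi.ContainerMetadata{Name: "kube-controller-manager", Attempt: 1},
    			Image:    &runtimeapi.ImageSpec{Image: "registry.k8s.io/kube-controller-manager:<tag>"}, // placeholder
    		},
    		SandboxConfig: &runtimeapi.PodSandboxConfig{}, // the kubelet passes the full sandbox config here
    	})
    	if err != nil {
    		log.Fatal(err)
    	}
    	if _, err := rt.StartContainer(ctx, &runtimeapi.StartContainerRequest{
    		ContainerId: createResp.ContainerId,
    	}); err != nil {
    		log.Fatal(err)
    	}
    }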
Mar 25 01:33:49.537871 containerd[1946]: time="2025-03-25T01:33:49.537473050Z" level=info msg="StartContainer for \"14c8eb3683a48cab94775ed50c318f1ce4923271c7f806551147673de23bad30\" returns successfully"
Mar 25 01:33:49.549061 containerd[1946]: time="2025-03-25T01:33:49.548377810Z" level=info msg="StartContainer for \"0feab55c0884a6c0aeac41db2d74360944490e72c95c03f364db3ec44b687873\" returns successfully"
Mar 25 01:33:51.373439 kubelet[3424]: E0325 01:33:51.373197 3424 controller.go:195] "Failed to update lease" err="Put \"https://172.31.28.242:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-28-242?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 25 01:33:53.011014 containerd[1946]: time="2025-03-25T01:33:53.010951476Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cd4ff51f8777fc5d3aff4f1881b5030d0c091d40c4e3df16389226a970da716d\" id:\"4393f24cae74accaccaa1361b8d23b229c6bd7ab1d789fe70bcebf56eef06a44\" pid:6053 exit_status:1 exited_at:{seconds:1742866433 nanos:10359012}"
Mar 25 01:33:54.667456 systemd[1]: cri-containerd-e839f9ea8c4737c68f1078e660150fe56e5cacc6300afc9ec977655856511fca.scope: Deactivated successfully.
Mar 25 01:33:54.670156 systemd[1]: cri-containerd-e839f9ea8c4737c68f1078e660150fe56e5cacc6300afc9ec977655856511fca.scope: Consumed 4.797s CPU time, 19.5M memory peak, 128K read from disk.
Mar 25 01:33:54.676606 containerd[1946]: time="2025-03-25T01:33:54.675800128Z" level=info msg="received exit event container_id:\"e839f9ea8c4737c68f1078e660150fe56e5cacc6300afc9ec977655856511fca\" id:\"e839f9ea8c4737c68f1078e660150fe56e5cacc6300afc9ec977655856511fca\" pid:3086 exit_status:1 exited_at:{seconds:1742866434 nanos:674448304}"
Mar 25 01:33:54.678681 containerd[1946]: time="2025-03-25T01:33:54.676454236Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e839f9ea8c4737c68f1078e660150fe56e5cacc6300afc9ec977655856511fca\" id:\"e839f9ea8c4737c68f1078e660150fe56e5cacc6300afc9ec977655856511fca\" pid:3086 exit_status:1 exited_at:{seconds:1742866434 nanos:674448304}"
Mar 25 01:33:54.726334 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e839f9ea8c4737c68f1078e660150fe56e5cacc6300afc9ec977655856511fca-rootfs.mount: Deactivated successfully.
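[Editor's note] The kubelet error at 01:33:51 is its node-lease heartbeat failing: roughly every 10 seconds (by default) the kubelet PUTs a coordination.k8s.io/v1 Lease named after the node into the kube-node-lease namespace, and here the request to the local apiserver at 172.31.28.242:6443 timed out while the control-plane containers were being restarted. If the lease goes unrenewed long enough, the node lifecycle controller eventually marks the node NotReady. A small client-go sketch that reads that Lease and its renewTime is below; the node name is the one from this log, while the kubeconfig path is a placeholder assumption.

    package main

    import (
    	"context"
    	"fmt"
    	"log"

    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	// Placeholder kubeconfig location; any admin kubeconfig for the cluster works.
    	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf")
    	if err != nil {
    		log.Fatal(err)
    	}
    	cs, err := kubernetes.NewForConfig(cfg)
    	if err != nil {
    		log.Fatal(err)
    	}

    	// Node leases live in the kube-node-lease namespace, one Lease per node.
    	lease, err := cs.CoordinationV1().Leases("kube-node-lease").
    		Get(context.Background(), "ip-172-31-28-242", metav1.GetOptions{})
    	if err != nil {
    		log.Fatal(err)
    	}
    	if lease.Spec.HolderIdentity != nil && lease.Spec.RenewTime != nil {
    		fmt.Printf("holder=%s renewTime=%s\n",
    			*lease.Spec.HolderIdentity, lease.Spec.RenewTime.Time)
    	}
    }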
Mar 25 01:33:55.325356 kubelet[3424]: I0325 01:33:55.325173 3424 scope.go:117] "RemoveContainer" containerID="e839f9ea8c4737c68f1078e660150fe56e5cacc6300afc9ec977655856511fca"
Mar 25 01:33:55.329962 containerd[1946]: time="2025-03-25T01:33:55.329909391Z" level=info msg="CreateContainer within sandbox \"5df064e84374e595aaf74ffda83fc21a5d93e57aa244c4b4037b51b38e4723b5\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Mar 25 01:33:55.353276 containerd[1946]: time="2025-03-25T01:33:55.348398511Z" level=info msg="Container f3a6d6ab36800fa1543a925581fcc1755f2d9e57d5ad566ec3aae382e558a9b0: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:33:55.379816 containerd[1946]: time="2025-03-25T01:33:55.379761723Z" level=info msg="CreateContainer within sandbox \"5df064e84374e595aaf74ffda83fc21a5d93e57aa244c4b4037b51b38e4723b5\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"f3a6d6ab36800fa1543a925581fcc1755f2d9e57d5ad566ec3aae382e558a9b0\""
Mar 25 01:33:55.380824 containerd[1946]: time="2025-03-25T01:33:55.380725347Z" level=info msg="StartContainer for \"f3a6d6ab36800fa1543a925581fcc1755f2d9e57d5ad566ec3aae382e558a9b0\""
Mar 25 01:33:55.383236 containerd[1946]: time="2025-03-25T01:33:55.383138079Z" level=info msg="connecting to shim f3a6d6ab36800fa1543a925581fcc1755f2d9e57d5ad566ec3aae382e558a9b0" address="unix:///run/containerd/s/f57df387c0a5fe00c926d3bbdceb956967472f75174ee638a1388078ca646f86" protocol=ttrpc version=3
Mar 25 01:33:55.420475 systemd[1]: Started cri-containerd-f3a6d6ab36800fa1543a925581fcc1755f2d9e57d5ad566ec3aae382e558a9b0.scope - libcontainer container f3a6d6ab36800fa1543a925581fcc1755f2d9e57d5ad566ec3aae382e558a9b0.
Mar 25 01:33:55.513645 containerd[1946]: time="2025-03-25T01:33:55.513544888Z" level=info msg="StartContainer for \"f3a6d6ab36800fa1543a925581fcc1755f2d9e57d5ad566ec3aae382e558a9b0\" returns successfully"
Mar 25 01:33:55.799755 containerd[1946]: time="2025-03-25T01:33:55.799559490Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cd4ff51f8777fc5d3aff4f1881b5030d0c091d40c4e3df16389226a970da716d\" id:\"109cdc351b0841bedffeee5be98fdd3c9e69289c2adda91c747d92c1fb54fd08\" pid:6118 exit_status:1 exited_at:{seconds:1742866435 nanos:798157854}"
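[Editor's note] The kube-scheduler container goes through the same remove/recreate cycle as the controller-manager above, ending in a successful StartContainer at 01:33:55. Once the apiserver is reachable again, these restarts surface as incremented restartCount values on the static-pod mirror pods in kube-system; a short client-go sketch to list them follows (the kubeconfig path is again a placeholder assumption).

    package main

    import (
    	"context"
    	"fmt"
    	"log"

    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf") // placeholder path
    	if err != nil {
    		log.Fatal(err)
    	}
    	cs, err := kubernetes.NewForConfig(cfg)
    	if err != nil {
    		log.Fatal(err)
    	}

    	// Control-plane static pods are mirrored into kube-system; their
    	// containerStatuses carry the restart counts the kubelet just bumped.
    	pods, err := cs.CoreV1().Pods("kube-system").List(context.Background(), metav1.ListOptions{})
    	if err != nil {
    		log.Fatal(err)
    	}
    	for _, pod := range pods.Items {
    		for _, st := range pod.Status.ContainerStatuses {
    			fmt.Printf("%s/%s restarts=%d\n", pod.Name, st.Name, st.RestartCount)
    		}
    	}
    }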