Sep 4 17:51:57.091851 kernel: Linux version 6.6.48-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Wed Sep 4 15:54:07 -00 2024
Sep 4 17:51:57.091895 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=ceda2dd706627da8006bcd6ae77ea155b2a7de6732e2c1c7ab4bed271400663d
Sep 4 17:51:57.091908 kernel: BIOS-provided physical RAM map:
Sep 4 17:51:57.091916 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 4 17:51:57.091924 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 4 17:51:57.091931 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 4 17:51:57.091941 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdcfff] usable
Sep 4 17:51:57.091948 kernel: BIOS-e820: [mem 0x000000007ffdd000-0x000000007fffffff] reserved
Sep 4 17:51:57.091956 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 4 17:51:57.091966 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 4 17:51:57.091974 kernel: NX (Execute Disable) protection: active
Sep 4 17:51:57.091982 kernel: APIC: Static calls initialized
Sep 4 17:51:57.091989 kernel: SMBIOS 2.8 present.
Sep 4 17:51:57.091997 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Sep 4 17:51:57.092007 kernel: Hypervisor detected: KVM
Sep 4 17:51:57.092017 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 4 17:51:57.092026 kernel: kvm-clock: using sched offset of 4394883851 cycles
Sep 4 17:51:57.092034 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 4 17:51:57.092043 kernel: tsc: Detected 1996.249 MHz processor
Sep 4 17:51:57.092052 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 4 17:51:57.092061 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 4 17:51:57.092069 kernel: last_pfn = 0x7ffdd max_arch_pfn = 0x400000000
Sep 4 17:51:57.092078 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 4 17:51:57.092086 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 4 17:51:57.092097 kernel: ACPI: Early table checksum verification disabled
Sep 4 17:51:57.092106 kernel: ACPI: RSDP 0x00000000000F5930 000014 (v00 BOCHS )
Sep 4 17:51:57.092114 kernel: ACPI: RSDT 0x000000007FFE1848 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 17:51:57.092123 kernel: ACPI: FACP 0x000000007FFE172C 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 17:51:57.092131 kernel: ACPI: DSDT 0x000000007FFE0040 0016EC (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 17:51:57.092139 kernel: ACPI: FACS 0x000000007FFE0000 000040
Sep 4 17:51:57.092148 kernel: ACPI: APIC 0x000000007FFE17A0 000080 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 17:51:57.092156 kernel: ACPI: WAET 0x000000007FFE1820 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 4 17:51:57.092165 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe172c-0x7ffe179f]
Sep 4 17:51:57.092175 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffe0040-0x7ffe172b]
Sep 4 17:51:57.092184 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffe0000-0x7ffe003f]
Sep 4 17:51:57.092192 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe17a0-0x7ffe181f]
Sep 4 17:51:57.092201 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe1820-0x7ffe1847]
Sep 4 17:51:57.092209 kernel: No NUMA configuration found
Sep 4 17:51:57.092217 kernel: Faking a node at [mem 0x0000000000000000-0x000000007ffdcfff]
Sep 4 17:51:57.092226 kernel: NODE_DATA(0) allocated [mem 0x7ffd7000-0x7ffdcfff]
Sep 4 17:51:57.092238 kernel: Zone ranges:
Sep 4 17:51:57.092249 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 4 17:51:57.094290 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdcfff]
Sep 4 17:51:57.094301 kernel: Normal empty
Sep 4 17:51:57.094310 kernel: Movable zone start for each node
Sep 4 17:51:57.094318 kernel: Early memory node ranges
Sep 4 17:51:57.094327 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Sep 4 17:51:57.094335 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdcfff]
Sep 4 17:51:57.094347 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdcfff]
Sep 4 17:51:57.094356 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 4 17:51:57.094364 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 4 17:51:57.094372 kernel: On node 0, zone DMA32: 35 pages in unavailable ranges
Sep 4 17:51:57.094380 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 4 17:51:57.094388 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 4 17:51:57.094397 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 4 17:51:57.094405 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 4 17:51:57.094413 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 4 17:51:57.094424 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 4 17:51:57.094432 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 4 17:51:57.094441 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 4 17:51:57.094449 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 4 17:51:57.094457 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Sep 4 17:51:57.094465 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 4 17:51:57.094473 kernel: [mem 0x80000000-0xfeffbfff] available for PCI devices
Sep 4 17:51:57.094481 kernel: Booting paravirtualized kernel on KVM
Sep 4 17:51:57.094490 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 4 17:51:57.094500 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Sep 4 17:51:57.094508 kernel: percpu: Embedded 58 pages/cpu s196904 r8192 d32472 u1048576
Sep 4 17:51:57.094517 kernel: pcpu-alloc: s196904 r8192 d32472 u1048576 alloc=1*2097152
Sep 4 17:51:57.094525 kernel: pcpu-alloc: [0] 0 1
Sep 4 17:51:57.094533 kernel: kvm-guest: PV spinlocks disabled, no host support
Sep 4 17:51:57.094543 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=ceda2dd706627da8006bcd6ae77ea155b2a7de6732e2c1c7ab4bed271400663d
Sep 4 17:51:57.094552 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 4 17:51:57.094560 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 4 17:51:57.094570 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 4 17:51:57.094579 kernel: Fallback order for Node 0: 0
Sep 4 17:51:57.094587 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515805
Sep 4 17:51:57.094595 kernel: Policy zone: DMA32
Sep 4 17:51:57.094603 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 4 17:51:57.094611 kernel: Memory: 1971212K/2096620K available (12288K kernel code, 2304K rwdata, 22708K rodata, 42704K init, 2488K bss, 125148K reserved, 0K cma-reserved)
Sep 4 17:51:57.094620 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 4 17:51:57.094628 kernel: ftrace: allocating 37748 entries in 148 pages
Sep 4 17:51:57.094638 kernel: ftrace: allocated 148 pages with 3 groups
Sep 4 17:51:57.094646 kernel: Dynamic Preempt: voluntary
Sep 4 17:51:57.094654 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 4 17:51:57.094663 kernel: rcu: RCU event tracing is enabled.
Sep 4 17:51:57.094683 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 4 17:51:57.094691 kernel: Trampoline variant of Tasks RCU enabled.
Sep 4 17:51:57.094699 kernel: Rude variant of Tasks RCU enabled.
Sep 4 17:51:57.094708 kernel: Tracing variant of Tasks RCU enabled.
Sep 4 17:51:57.094716 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 4 17:51:57.094724 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 4 17:51:57.094735 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Sep 4 17:51:57.094745 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 4 17:51:57.094753 kernel: Console: colour VGA+ 80x25
Sep 4 17:51:57.094762 kernel: printk: console [tty0] enabled
Sep 4 17:51:57.094771 kernel: printk: console [ttyS0] enabled
Sep 4 17:51:57.094779 kernel: ACPI: Core revision 20230628
Sep 4 17:51:57.094788 kernel: APIC: Switch to symmetric I/O mode setup
Sep 4 17:51:57.094797 kernel: x2apic enabled
Sep 4 17:51:57.094806 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 4 17:51:57.094817 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 4 17:51:57.094825 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Sep 4 17:51:57.094834 kernel: Calibrating delay loop (skipped) preset value.. 3992.49 BogoMIPS (lpj=1996249)
Sep 4 17:51:57.094843 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Sep 4 17:51:57.094852 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Sep 4 17:51:57.094861 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 4 17:51:57.094869 kernel: Spectre V2 : Mitigation: Retpolines
Sep 4 17:51:57.094878 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Sep 4 17:51:57.094887 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Sep 4 17:51:57.094898 kernel: Speculative Store Bypass: Vulnerable
Sep 4 17:51:57.094906 kernel: x86/fpu: x87 FPU will use FXSAVE
Sep 4 17:51:57.094915 kernel: Freeing SMP alternatives memory: 32K
Sep 4 17:51:57.094924 kernel: pid_max: default: 32768 minimum: 301
Sep 4 17:51:57.094933 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Sep 4 17:51:57.094941 kernel: landlock: Up and running.
Sep 4 17:51:57.094950 kernel: SELinux: Initializing.
Sep 4 17:51:57.094959 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 4 17:51:57.094976 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 4 17:51:57.094986 kernel: smpboot: CPU0: AMD Intel Core i7 9xx (Nehalem Class Core i7) (family: 0x6, model: 0x1a, stepping: 0x3)
Sep 4 17:51:57.094995 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1.
Sep 4 17:51:57.095004 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1.
Sep 4 17:51:57.095015 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1.
Sep 4 17:51:57.095025 kernel: Performance Events: AMD PMU driver.
Sep 4 17:51:57.095034 kernel: ... version: 0
Sep 4 17:51:57.095043 kernel: ... bit width: 48
Sep 4 17:51:57.095052 kernel: ... generic registers: 4
Sep 4 17:51:57.095064 kernel: ... value mask: 0000ffffffffffff
Sep 4 17:51:57.095073 kernel: ... max period: 00007fffffffffff
Sep 4 17:51:57.095082 kernel: ... fixed-purpose events: 0
Sep 4 17:51:57.095091 kernel: ... event mask: 000000000000000f
Sep 4 17:51:57.095100 kernel: signal: max sigframe size: 1440
Sep 4 17:51:57.095110 kernel: rcu: Hierarchical SRCU implementation.
Sep 4 17:51:57.095119 kernel: rcu: Max phase no-delay instances is 400.
Sep 4 17:51:57.095128 kernel: smp: Bringing up secondary CPUs ...
Sep 4 17:51:57.095137 kernel: smpboot: x86: Booting SMP configuration:
Sep 4 17:51:57.095149 kernel: .... node #0, CPUs: #1
Sep 4 17:51:57.095158 kernel: smp: Brought up 1 node, 2 CPUs
Sep 4 17:51:57.095167 kernel: smpboot: Max logical packages: 2
Sep 4 17:51:57.095177 kernel: smpboot: Total of 2 processors activated (7984.99 BogoMIPS)
Sep 4 17:51:57.095186 kernel: devtmpfs: initialized
Sep 4 17:51:57.095195 kernel: x86/mm: Memory block size: 128MB
Sep 4 17:51:57.095204 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 4 17:51:57.095214 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 4 17:51:57.095223 kernel: pinctrl core: initialized pinctrl subsystem
Sep 4 17:51:57.095234 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 4 17:51:57.095244 kernel: audit: initializing netlink subsys (disabled)
Sep 4 17:51:57.095266 kernel: audit: type=2000 audit(1725472316.918:1): state=initialized audit_enabled=0 res=1
Sep 4 17:51:57.097297 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 4 17:51:57.097309 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 4 17:51:57.097318 kernel: cpuidle: using governor menu
Sep 4 17:51:57.097327 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 4 17:51:57.097335 kernel: dca service started, version 1.12.1
Sep 4 17:51:57.097344 kernel: PCI: Using configuration type 1 for base access
Sep 4 17:51:57.097356 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 4 17:51:57.097365 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 4 17:51:57.097374 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 4 17:51:57.097382 kernel: ACPI: Added _OSI(Module Device)
Sep 4 17:51:57.097391 kernel: ACPI: Added _OSI(Processor Device)
Sep 4 17:51:57.097400 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Sep 4 17:51:57.097409 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 4 17:51:57.097418 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 4 17:51:57.097427 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Sep 4 17:51:57.097438 kernel: ACPI: Interpreter enabled
Sep 4 17:51:57.097447 kernel: ACPI: PM: (supports S0 S3 S5)
Sep 4 17:51:57.097455 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 4 17:51:57.097464 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 4 17:51:57.097473 kernel: PCI: Using E820 reservations for host bridge windows
Sep 4 17:51:57.097481 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Sep 4 17:51:57.097490 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 4 17:51:57.097654 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
Sep 4 17:51:57.097757 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
Sep 4 17:51:57.097847 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
Sep 4 17:51:57.097860 kernel: acpiphp: Slot [3] registered
Sep 4 17:51:57.097869 kernel: acpiphp: Slot [4] registered
Sep 4 17:51:57.097878 kernel: acpiphp: Slot [5] registered
Sep 4 17:51:57.097886 kernel: acpiphp: Slot [6] registered
Sep 4 17:51:57.097895 kernel: acpiphp: Slot [7] registered
Sep 4 17:51:57.097903 kernel: acpiphp: Slot [8] registered
Sep 4 17:51:57.097915 kernel: acpiphp: Slot [9] registered
Sep 4 17:51:57.097923 kernel: acpiphp: Slot [10] registered
Sep 4 17:51:57.097932 kernel: acpiphp: Slot [11] registered
Sep 4 17:51:57.097941 kernel: acpiphp: Slot [12] registered
Sep 4 17:51:57.097949 kernel: acpiphp: Slot [13] registered
Sep 4 17:51:57.097957 kernel: acpiphp: Slot [14] registered
Sep 4 17:51:57.097966 kernel: acpiphp: Slot [15] registered
Sep 4 17:51:57.097974 kernel: acpiphp: Slot [16] registered
Sep 4 17:51:57.097983 kernel: acpiphp: Slot [17] registered
Sep 4 17:51:57.097991 kernel: acpiphp: Slot [18] registered
Sep 4 17:51:57.098002 kernel: acpiphp: Slot [19] registered
Sep 4 17:51:57.098010 kernel: acpiphp: Slot [20] registered
Sep 4 17:51:57.098019 kernel: acpiphp: Slot [21] registered
Sep 4 17:51:57.098027 kernel: acpiphp: Slot [22] registered
Sep 4 17:51:57.098036 kernel: acpiphp: Slot [23] registered
Sep 4 17:51:57.098045 kernel: acpiphp: Slot [24] registered
Sep 4 17:51:57.098053 kernel: acpiphp: Slot [25] registered
Sep 4 17:51:57.098061 kernel: acpiphp: Slot [26] registered
Sep 4 17:51:57.098070 kernel: acpiphp: Slot [27] registered
Sep 4 17:51:57.098080 kernel: acpiphp: Slot [28] registered
Sep 4 17:51:57.098089 kernel: acpiphp: Slot [29] registered
Sep 4 17:51:57.098097 kernel: acpiphp: Slot [30] registered
Sep 4 17:51:57.098106 kernel: acpiphp: Slot [31] registered
Sep 4 17:51:57.098114 kernel: PCI host bridge to bus 0000:00
Sep 4 17:51:57.098206 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 4 17:51:57.098314 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 4 17:51:57.098400 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 4 17:51:57.098489 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xfebfffff window]
Sep 4 17:51:57.098571 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x17fffffff window]
Sep 4 17:51:57.098652 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 4 17:51:57.098797 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Sep 4 17:51:57.098907 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Sep 4 17:51:57.099013 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Sep 4 17:51:57.099117 kernel: pci 0000:00:01.1: reg 0x20: [io 0xc120-0xc12f]
Sep 4 17:51:57.099214 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Sep 4 17:51:57.100354 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Sep 4 17:51:57.100452 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Sep 4 17:51:57.100542 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Sep 4 17:51:57.100642 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Sep 4 17:51:57.100732 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Sep 4 17:51:57.100827 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Sep 4 17:51:57.100927 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Sep 4 17:51:57.101018 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Sep 4 17:51:57.101108 kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Sep 4 17:51:57.101199 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Sep 4 17:51:57.102346 kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Sep 4 17:51:57.102446 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 4 17:51:57.102549 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Sep 4 17:51:57.102639 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf]
Sep 4 17:51:57.102743 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Sep 4 17:51:57.102833 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Sep 4 17:51:57.102922 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Sep 4 17:51:57.103018 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Sep 4 17:51:57.103108 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Sep 4 17:51:57.103205 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Sep 4 17:51:57.108171 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Sep 4 17:51:57.108313 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Sep 4 17:51:57.108416 kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff]
Sep 4 17:51:57.108513 kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Sep 4 17:51:57.108621 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Sep 4 17:51:57.108719 kernel: pci 0000:00:06.0: reg 0x10: [io 0xc100-0xc11f]
Sep 4 17:51:57.108822 kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Sep 4 17:51:57.108836 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 4 17:51:57.108846 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 4 17:51:57.108855 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 4 17:51:57.108865 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 4 17:51:57.108874 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Sep 4 17:51:57.108884 kernel: iommu: Default domain type: Translated
Sep 4 17:51:57.108893 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 4 17:51:57.108906 kernel: PCI: Using ACPI for IRQ routing
Sep 4 17:51:57.108915 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 4 17:51:57.108924 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 4 17:51:57.108933 kernel: e820: reserve RAM buffer [mem 0x7ffdd000-0x7fffffff]
Sep 4 17:51:57.109028 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Sep 4 17:51:57.109123 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Sep 4 17:51:57.109218 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 4 17:51:57.109232 kernel: vgaarb: loaded
Sep 4 17:51:57.109241 kernel: clocksource: Switched to clocksource kvm-clock
Sep 4 17:51:57.109281 kernel: VFS: Disk quotas dquot_6.6.0
Sep 4 17:51:57.109292 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 4 17:51:57.109301 kernel: pnp: PnP ACPI init
Sep 4 17:51:57.109404 kernel: pnp 00:03: [dma 2]
Sep 4 17:51:57.109420 kernel: pnp: PnP ACPI: found 5 devices
Sep 4 17:51:57.109429 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 4 17:51:57.109439 kernel: NET: Registered PF_INET protocol family
Sep 4 17:51:57.109449 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 4 17:51:57.109463 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Sep 4 17:51:57.109473 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 4 17:51:57.109482 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 4 17:51:57.109492 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Sep 4 17:51:57.109502 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Sep 4 17:51:57.109511 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 4 17:51:57.109521 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 4 17:51:57.109530 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 4 17:51:57.109539 kernel: NET: Registered PF_XDP protocol family
Sep 4 17:51:57.109628 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 4 17:51:57.109724 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 4 17:51:57.109810 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 4 17:51:57.109909 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xfebfffff window]
Sep 4 17:51:57.110012 kernel: pci_bus 0000:00: resource 8 [mem 0x100000000-0x17fffffff window]
Sep 4 17:51:57.110110 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Sep 4 17:51:57.110208 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Sep 4 17:51:57.110223 kernel: PCI: CLS 0 bytes, default 64
Sep 4 17:51:57.110237 kernel: Initialise system trusted keyrings
Sep 4 17:51:57.110247 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Sep 4 17:51:57.110290 kernel: Key type asymmetric registered
Sep 4 17:51:57.110301 kernel: Asymmetric key parser 'x509' registered
Sep 4 17:51:57.110310 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Sep 4 17:51:57.110320 kernel: io scheduler mq-deadline registered
Sep 4 17:51:57.110329 kernel: io scheduler kyber registered
Sep 4 17:51:57.110338 kernel: io scheduler bfq registered
Sep 4 17:51:57.110348 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Sep 4 17:51:57.110362 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Sep 4 17:51:57.110371 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Sep 4 17:51:57.110381 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Sep 4 17:51:57.110391 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Sep 4 17:51:57.110400 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 4 17:51:57.110410 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 4 17:51:57.110419 kernel: random: crng init done
Sep 4 17:51:57.110429 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 4 17:51:57.110438 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 4 17:51:57.110450 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 4 17:51:57.110459 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Sep 4 17:51:57.110569 kernel: rtc_cmos 00:04: RTC can wake from S4
Sep 4 17:51:57.110700 kernel: rtc_cmos 00:04: registered as rtc0
Sep 4 17:51:57.110796 kernel: rtc_cmos 00:04: setting system clock to 2024-09-04T17:51:56 UTC (1725472316)
Sep 4 17:51:57.110887 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Sep 4 17:51:57.110901 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Sep 4 17:51:57.110911 kernel: NET: Registered PF_INET6 protocol family
Sep 4 17:51:57.110924 kernel: Segment Routing with IPv6
Sep 4 17:51:57.110934 kernel: In-situ OAM (IOAM) with IPv6
Sep 4 17:51:57.110943 kernel: NET: Registered PF_PACKET protocol family
Sep 4 17:51:57.110953 kernel: Key type dns_resolver registered
Sep 4 17:51:57.110962 kernel: IPI shorthand broadcast: enabled
Sep 4 17:51:57.110972 kernel: sched_clock: Marking stable (859008295, 121755109)->(993075012, -12311608)
Sep 4 17:51:57.110981 kernel: registered taskstats version 1
Sep 4 17:51:57.110991 kernel: Loading compiled-in X.509 certificates
Sep 4 17:51:57.111000 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.48-flatcar: 8669771ab5e11f458b79e6634fe685dacc266b18'
Sep 4 17:51:57.111011 kernel: Key type .fscrypt registered
Sep 4 17:51:57.111021 kernel: Key type fscrypt-provisioning registered
Sep 4 17:51:57.111031 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 4 17:51:57.111040 kernel: ima: Allocated hash algorithm: sha1
Sep 4 17:51:57.111049 kernel: ima: No architecture policies found
Sep 4 17:51:57.111059 kernel: clk: Disabling unused clocks
Sep 4 17:51:57.111068 kernel: Freeing unused kernel image (initmem) memory: 42704K
Sep 4 17:51:57.111078 kernel: Write protecting the kernel read-only data: 36864k
Sep 4 17:51:57.111089 kernel: Freeing unused kernel image (rodata/data gap) memory: 1868K
Sep 4 17:51:57.111099 kernel: Run /init as init process
Sep 4 17:51:57.111108 kernel: with arguments:
Sep 4 17:51:57.111117 kernel: /init
Sep 4 17:51:57.111126 kernel: with environment:
Sep 4 17:51:57.111135 kernel: HOME=/
Sep 4 17:51:57.111144 kernel: TERM=linux
Sep 4 17:51:57.111153 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 4 17:51:57.111173 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Sep 4 17:51:57.111189 systemd[1]: Detected virtualization kvm.
Sep 4 17:51:57.111200 systemd[1]: Detected architecture x86-64.
Sep 4 17:51:57.111209 systemd[1]: Running in initrd.
Sep 4 17:51:57.111220 systemd[1]: No hostname configured, using default hostname.
Sep 4 17:51:57.111229 systemd[1]: Hostname set to .
Sep 4 17:51:57.111240 systemd[1]: Initializing machine ID from VM UUID.
Sep 4 17:51:57.111250 systemd[1]: Queued start job for default target initrd.target.
Sep 4 17:51:57.111552 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 4 17:51:57.111566 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 4 17:51:57.111578 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 4 17:51:57.111590 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 4 17:51:57.111601 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 4 17:51:57.111613 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 4 17:51:57.111627 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 4 17:51:57.111641 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 4 17:51:57.111653 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 4 17:51:57.111664 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 4 17:51:57.111675 systemd[1]: Reached target paths.target - Path Units.
Sep 4 17:51:57.111697 systemd[1]: Reached target slices.target - Slice Units.
Sep 4 17:51:57.111711 systemd[1]: Reached target swap.target - Swaps.
Sep 4 17:51:57.111724 systemd[1]: Reached target timers.target - Timer Units.
Sep 4 17:51:57.111736 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 4 17:51:57.111747 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 4 17:51:57.111759 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 4 17:51:57.111771 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Sep 4 17:51:57.111782 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 4 17:51:57.111794 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 4 17:51:57.111806 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 4 17:51:57.111818 systemd[1]: Reached target sockets.target - Socket Units.
Sep 4 17:51:57.111832 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 4 17:51:57.111844 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 4 17:51:57.111856 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 4 17:51:57.111867 systemd[1]: Starting systemd-fsck-usr.service...
Sep 4 17:51:57.111879 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 4 17:51:57.111890 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 4 17:51:57.111902 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 4 17:51:57.111914 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 4 17:51:57.111927 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 4 17:51:57.111939 systemd[1]: Finished systemd-fsck-usr.service.
Sep 4 17:51:57.111971 systemd-journald[184]: Collecting audit messages is disabled.
Sep 4 17:51:57.112001 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 4 17:51:57.112013 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 4 17:51:57.112027 systemd-journald[184]: Journal started
Sep 4 17:51:57.112053 systemd-journald[184]: Runtime Journal (/run/log/journal/fe0e33dedfb845d2b36d2f8bf276f3fe) is 4.9M, max 39.3M, 34.4M free.
Sep 4 17:51:57.077365 systemd-modules-load[185]: Inserted module 'overlay'
Sep 4 17:51:57.147687 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 4 17:51:57.147745 kernel: Bridge firewalling registered
Sep 4 17:51:57.126958 systemd-modules-load[185]: Inserted module 'br_netfilter'
Sep 4 17:51:57.150277 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 4 17:51:57.150975 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 4 17:51:57.151728 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 4 17:51:57.159498 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 4 17:51:57.162402 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 4 17:51:57.164518 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 4 17:51:57.168891 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 4 17:51:57.185332 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 4 17:51:57.190497 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 4 17:51:57.192395 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 4 17:51:57.207514 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 4 17:51:57.208934 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 4 17:51:57.211414 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 4 17:51:57.228587 dracut-cmdline[221]: dracut-dracut-053
Sep 4 17:51:57.232066 dracut-cmdline[221]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=ceda2dd706627da8006bcd6ae77ea155b2a7de6732e2c1c7ab4bed271400663d
Sep 4 17:51:57.239794 systemd-resolved[218]: Positive Trust Anchors:
Sep 4 17:51:57.239831 systemd-resolved[218]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 4 17:51:57.239878 systemd-resolved[218]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 4 17:51:57.243935 systemd-resolved[218]: Defaulting to hostname 'linux'.
Sep 4 17:51:57.245190 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 4 17:51:57.246098 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 4 17:51:57.330340 kernel: SCSI subsystem initialized
Sep 4 17:51:57.340346 kernel: Loading iSCSI transport class v2.0-870.
Sep 4 17:51:57.353328 kernel: iscsi: registered transport (tcp)
Sep 4 17:51:57.379304 kernel: iscsi: registered transport (qla4xxx)
Sep 4 17:51:57.379624 kernel: QLogic iSCSI HBA Driver
Sep 4 17:51:57.423628 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 4 17:51:57.432459 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 4 17:51:57.458443 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 4 17:51:57.458500 kernel: device-mapper: uevent: version 1.0.3 Sep 4 17:51:57.460243 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Sep 4 17:51:57.523488 kernel: raid6: sse2x4 gen() 5217 MB/s Sep 4 17:51:57.540484 kernel: raid6: sse2x2 gen() 8245 MB/s Sep 4 17:51:57.557606 kernel: raid6: sse2x1 gen() 9638 MB/s Sep 4 17:51:57.557672 kernel: raid6: using algorithm sse2x1 gen() 9638 MB/s Sep 4 17:51:57.575596 kernel: raid6: .... xor() 7333 MB/s, rmw enabled Sep 4 17:51:57.575679 kernel: raid6: using ssse3x2 recovery algorithm Sep 4 17:51:57.598352 kernel: xor: measuring software checksum speed Sep 4 17:51:57.599321 kernel: prefetch64-sse : 18522 MB/sec Sep 4 17:51:57.601765 kernel: generic_sse : 15782 MB/sec Sep 4 17:51:57.601843 kernel: xor: using function: prefetch64-sse (18522 MB/sec) Sep 4 17:51:57.778338 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 4 17:51:57.797312 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 4 17:51:57.807572 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 4 17:51:57.852739 systemd-udevd[403]: Using default interface naming scheme 'v255'. Sep 4 17:51:57.864289 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 4 17:51:57.874556 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 4 17:51:57.906075 dracut-pre-trigger[410]: rd.md=0: removing MD RAID activation Sep 4 17:51:57.946563 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 4 17:51:57.953476 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 4 17:51:58.030758 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 17:51:58.041521 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 4 17:51:58.056949 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. 
Sep 4 17:51:58.074183 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 4 17:51:58.075201 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 4 17:51:58.078678 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 4 17:51:58.089521 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 4 17:51:58.116687 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 4 17:51:58.124445 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues Sep 4 17:51:58.133281 kernel: virtio_blk virtio2: [vda] 41943040 512-byte logical blocks (21.5 GB/20.0 GiB) Sep 4 17:51:58.144905 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 4 17:51:58.144958 kernel: GPT:17805311 != 41943039 Sep 4 17:51:58.144971 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 4 17:51:58.144983 kernel: GPT:17805311 != 41943039 Sep 4 17:51:58.145007 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 4 17:51:58.145019 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 4 17:51:58.147484 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 4 17:51:58.148397 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 4 17:51:58.150735 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 4 17:51:58.151547 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 4 17:51:58.151689 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 17:51:58.158116 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 17:51:58.165669 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 17:51:58.179629 kernel: libata version 3.00 loaded. 
Sep 4 17:51:58.183394 kernel: ata_piix 0000:00:01.1: version 2.13 Sep 4 17:51:58.192469 kernel: scsi host0: ata_piix Sep 4 17:51:58.198285 kernel: BTRFS: device fsid 0dc40443-7f77-4fa7-b5e4-579d4bba0772 devid 1 transid 37 /dev/vda3 scanned by (udev-worker) (454) Sep 4 17:51:58.200643 kernel: scsi host1: ata_piix Sep 4 17:51:58.200805 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc120 irq 14 Sep 4 17:51:58.200819 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc128 irq 15 Sep 4 17:51:58.204294 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/vda6 scanned by (udev-worker) (453) Sep 4 17:51:58.211454 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 4 17:51:58.251382 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 4 17:51:58.252426 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 17:51:58.258092 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 4 17:51:58.258686 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 4 17:51:58.265244 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 4 17:51:58.271389 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 4 17:51:58.274395 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 4 17:51:58.285943 disk-uuid[499]: Primary Header is updated. Sep 4 17:51:58.285943 disk-uuid[499]: Secondary Entries is updated. Sep 4 17:51:58.285943 disk-uuid[499]: Secondary Header is updated. Sep 4 17:51:58.294048 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Sep 4 17:51:58.297291 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 4 17:51:58.302333 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 4 17:51:59.314410 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 4 17:51:59.317146 disk-uuid[505]: The operation has completed successfully. Sep 4 17:51:59.384407 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 4 17:51:59.384702 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 4 17:51:59.417405 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 4 17:51:59.439778 sh[522]: Success Sep 4 17:51:59.460296 kernel: device-mapper: verity: sha256 using implementation "sha256-ssse3" Sep 4 17:51:59.532362 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 4 17:51:59.543472 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 4 17:51:59.549145 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 4 17:51:59.599726 kernel: BTRFS info (device dm-0): first mount of filesystem 0dc40443-7f77-4fa7-b5e4-579d4bba0772 Sep 4 17:51:59.599825 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 4 17:51:59.602359 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Sep 4 17:51:59.604522 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 4 17:51:59.606316 kernel: BTRFS info (device dm-0): using free space tree Sep 4 17:51:59.620759 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 4 17:51:59.621818 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 4 17:51:59.628578 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 4 17:51:59.632426 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Sep 4 17:51:59.653629 kernel: BTRFS info (device vda6): first mount of filesystem b2463ce1-c756-4e78-b7f2-401dad24571d Sep 4 17:51:59.653712 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 4 17:51:59.653741 kernel: BTRFS info (device vda6): using free space tree Sep 4 17:51:59.661317 kernel: BTRFS info (device vda6): auto enabling async discard Sep 4 17:51:59.676500 systemd[1]: mnt-oem.mount: Deactivated successfully. Sep 4 17:51:59.678047 kernel: BTRFS info (device vda6): last unmount of filesystem b2463ce1-c756-4e78-b7f2-401dad24571d Sep 4 17:51:59.692765 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 4 17:51:59.702053 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 4 17:51:59.765114 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 4 17:51:59.770336 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 4 17:51:59.799027 systemd-networkd[704]: lo: Link UP Sep 4 17:51:59.799037 systemd-networkd[704]: lo: Gained carrier Sep 4 17:51:59.800247 systemd-networkd[704]: Enumeration completed Sep 4 17:51:59.800601 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 4 17:51:59.800848 systemd-networkd[704]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 4 17:51:59.800852 systemd-networkd[704]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 4 17:51:59.801416 systemd[1]: Reached target network.target - Network. Sep 4 17:51:59.802910 systemd-networkd[704]: eth0: Link UP Sep 4 17:51:59.802914 systemd-networkd[704]: eth0: Gained carrier Sep 4 17:51:59.802924 systemd-networkd[704]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Sep 4 17:51:59.821499 systemd-networkd[704]: eth0: DHCPv4 address 172.24.4.122/24, gateway 172.24.4.1 acquired from 172.24.4.1 Sep 4 17:51:59.855745 ignition[649]: Ignition 2.19.0 Sep 4 17:51:59.855757 ignition[649]: Stage: fetch-offline Sep 4 17:51:59.857837 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 4 17:51:59.855798 ignition[649]: no configs at "/usr/lib/ignition/base.d" Sep 4 17:51:59.855808 ignition[649]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 4 17:51:59.855918 ignition[649]: parsed url from cmdline: "" Sep 4 17:51:59.855922 ignition[649]: no config URL provided Sep 4 17:51:59.855928 ignition[649]: reading system config file "/usr/lib/ignition/user.ign" Sep 4 17:51:59.855937 ignition[649]: no config at "/usr/lib/ignition/user.ign" Sep 4 17:51:59.855942 ignition[649]: failed to fetch config: resource requires networking Sep 4 17:51:59.856139 ignition[649]: Ignition finished successfully Sep 4 17:51:59.863757 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Sep 4 17:51:59.880385 ignition[714]: Ignition 2.19.0 Sep 4 17:51:59.880401 ignition[714]: Stage: fetch Sep 4 17:51:59.880605 ignition[714]: no configs at "/usr/lib/ignition/base.d" Sep 4 17:51:59.880617 ignition[714]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 4 17:51:59.880727 ignition[714]: parsed url from cmdline: "" Sep 4 17:51:59.880731 ignition[714]: no config URL provided Sep 4 17:51:59.880737 ignition[714]: reading system config file "/usr/lib/ignition/user.ign" Sep 4 17:51:59.880745 ignition[714]: no config at "/usr/lib/ignition/user.ign" Sep 4 17:51:59.880871 ignition[714]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Sep 4 17:51:59.881215 ignition[714]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Sep 4 17:51:59.881285 ignition[714]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... 
Sep 4 17:52:00.215450 ignition[714]: GET result: OK Sep 4 17:52:00.215643 ignition[714]: parsing config with SHA512: 8012bb254c79f056fc5fa29cdda2f46ab176c58ef3e19e31931eabde6181a8c896a1e160fba7d1863a9bd4f8a45a8b8da59ab65273ff715f356f806a61af607e Sep 4 17:52:00.225533 unknown[714]: fetched base config from "system" Sep 4 17:52:00.225561 unknown[714]: fetched base config from "system" Sep 4 17:52:00.226444 ignition[714]: fetch: fetch complete Sep 4 17:52:00.225577 unknown[714]: fetched user config from "openstack" Sep 4 17:52:00.226456 ignition[714]: fetch: fetch passed Sep 4 17:52:00.230349 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 4 17:52:00.226547 ignition[714]: Ignition finished successfully Sep 4 17:52:00.239704 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 4 17:52:00.277387 ignition[720]: Ignition 2.19.0 Sep 4 17:52:00.278999 ignition[720]: Stage: kargs Sep 4 17:52:00.280510 ignition[720]: no configs at "/usr/lib/ignition/base.d" Sep 4 17:52:00.281733 ignition[720]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 4 17:52:00.285453 ignition[720]: kargs: kargs passed Sep 4 17:52:00.286654 ignition[720]: Ignition finished successfully Sep 4 17:52:00.290112 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 4 17:52:00.299516 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 4 17:52:00.331413 ignition[726]: Ignition 2.19.0 Sep 4 17:52:00.332399 ignition[726]: Stage: disks Sep 4 17:52:00.332812 ignition[726]: no configs at "/usr/lib/ignition/base.d" Sep 4 17:52:00.332836 ignition[726]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 4 17:52:00.341682 ignition[726]: disks: disks passed Sep 4 17:52:00.343194 ignition[726]: Ignition finished successfully Sep 4 17:52:00.345392 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 4 17:52:00.347741 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. 
Sep 4 17:52:00.349166 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 4 17:52:00.351129 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 4 17:52:00.353014 systemd[1]: Reached target sysinit.target - System Initialization. Sep 4 17:52:00.354839 systemd[1]: Reached target basic.target - Basic System. Sep 4 17:52:00.362521 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 4 17:52:00.388579 systemd-fsck[734]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Sep 4 17:52:00.400031 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 4 17:52:00.409492 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 4 17:52:00.572122 kernel: EXT4-fs (vda9): mounted filesystem bdbe0f61-2675-40b7-b9ae-5653402e9b23 r/w with ordered data mode. Quota mode: none. Sep 4 17:52:00.572966 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 4 17:52:00.574050 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 4 17:52:00.582347 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 4 17:52:00.589427 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 4 17:52:00.591353 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 4 17:52:00.595210 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Sep 4 17:52:00.596633 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 4 17:52:00.597542 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 4 17:52:00.601479 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. 
Sep 4 17:52:00.611311 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/vda6 scanned by mount (742) Sep 4 17:52:00.612467 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 4 17:52:00.623582 kernel: BTRFS info (device vda6): first mount of filesystem b2463ce1-c756-4e78-b7f2-401dad24571d Sep 4 17:52:00.623607 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 4 17:52:00.623620 kernel: BTRFS info (device vda6): using free space tree Sep 4 17:52:00.637295 kernel: BTRFS info (device vda6): auto enabling async discard Sep 4 17:52:00.640969 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 4 17:52:00.717859 initrd-setup-root[769]: cut: /sysroot/etc/passwd: No such file or directory Sep 4 17:52:00.728015 initrd-setup-root[776]: cut: /sysroot/etc/group: No such file or directory Sep 4 17:52:00.737165 initrd-setup-root[784]: cut: /sysroot/etc/shadow: No such file or directory Sep 4 17:52:00.748893 initrd-setup-root[791]: cut: /sysroot/etc/gshadow: No such file or directory Sep 4 17:52:00.852656 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 4 17:52:00.861339 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 4 17:52:00.864417 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 4 17:52:00.873293 kernel: BTRFS info (device vda6): last unmount of filesystem b2463ce1-c756-4e78-b7f2-401dad24571d Sep 4 17:52:00.874104 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Sep 4 17:52:00.905885 ignition[858]: INFO : Ignition 2.19.0 Sep 4 17:52:00.906901 ignition[858]: INFO : Stage: mount Sep 4 17:52:00.906901 ignition[858]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 17:52:00.906901 ignition[858]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 4 17:52:00.909766 ignition[858]: INFO : mount: mount passed Sep 4 17:52:00.909766 ignition[858]: INFO : Ignition finished successfully Sep 4 17:52:00.910213 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 4 17:52:00.911085 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 4 17:52:01.752861 systemd-networkd[704]: eth0: Gained IPv6LL Sep 4 17:52:07.828427 coreos-metadata[744]: Sep 04 17:52:07.828 WARN failed to locate config-drive, using the metadata service API instead Sep 4 17:52:07.869387 coreos-metadata[744]: Sep 04 17:52:07.869 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Sep 4 17:52:07.885788 coreos-metadata[744]: Sep 04 17:52:07.885 INFO Fetch successful Sep 4 17:52:07.887251 coreos-metadata[744]: Sep 04 17:52:07.886 INFO wrote hostname ci-4054-1-0-c-33e05803e0.novalocal to /sysroot/etc/hostname Sep 4 17:52:07.889954 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Sep 4 17:52:07.890206 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Sep 4 17:52:07.902502 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 4 17:52:07.938743 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Sep 4 17:52:07.955330 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by mount (875) Sep 4 17:52:07.964007 kernel: BTRFS info (device vda6): first mount of filesystem b2463ce1-c756-4e78-b7f2-401dad24571d Sep 4 17:52:07.964071 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 4 17:52:07.967011 kernel: BTRFS info (device vda6): using free space tree Sep 4 17:52:07.976340 kernel: BTRFS info (device vda6): auto enabling async discard Sep 4 17:52:07.981719 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 4 17:52:08.026118 ignition[893]: INFO : Ignition 2.19.0 Sep 4 17:52:08.026118 ignition[893]: INFO : Stage: files Sep 4 17:52:08.029018 ignition[893]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 17:52:08.029018 ignition[893]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Sep 4 17:52:08.029018 ignition[893]: DEBUG : files: compiled without relabeling support, skipping Sep 4 17:52:08.035367 ignition[893]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 4 17:52:08.035367 ignition[893]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 4 17:52:08.040743 ignition[893]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 4 17:52:08.040743 ignition[893]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 4 17:52:08.044504 ignition[893]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 4 17:52:08.040923 unknown[893]: wrote ssh authorized keys file for user: core Sep 4 17:52:08.048420 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 4 17:52:08.048420 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Sep 4 17:52:08.130978 ignition[893]: 
INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 4 17:52:08.446849 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 4 17:52:08.446849 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 4 17:52:08.450341 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 4 17:52:08.450341 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 4 17:52:08.450341 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 4 17:52:08.450341 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 4 17:52:08.450341 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 4 17:52:08.450341 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 4 17:52:08.450341 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 4 17:52:08.450341 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 4 17:52:08.450341 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 4 17:52:08.450341 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.28.7-x86-64.raw" Sep 4 
17:52:08.450341 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.28.7-x86-64.raw" Sep 4 17:52:08.450341 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.28.7-x86-64.raw" Sep 4 17:52:08.450341 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.28.7-x86-64.raw: attempt #1 Sep 4 17:52:08.984036 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 4 17:52:10.516548 ignition[893]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.28.7-x86-64.raw" Sep 4 17:52:10.516548 ignition[893]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 4 17:52:10.521929 ignition[893]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 4 17:52:10.521929 ignition[893]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 4 17:52:10.521929 ignition[893]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 4 17:52:10.521929 ignition[893]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 4 17:52:10.521929 ignition[893]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 4 17:52:10.521929 ignition[893]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 4 17:52:10.521929 ignition[893]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 4 
17:52:10.521929 ignition[893]: INFO : files: files passed Sep 4 17:52:10.521929 ignition[893]: INFO : Ignition finished successfully Sep 4 17:52:10.521545 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 4 17:52:10.533550 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 4 17:52:10.537420 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 4 17:52:10.545224 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 4 17:52:10.545505 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 4 17:52:10.554244 initrd-setup-root-after-ignition[922]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 4 17:52:10.554244 initrd-setup-root-after-ignition[922]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 4 17:52:10.557898 initrd-setup-root-after-ignition[926]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 4 17:52:10.560661 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 4 17:52:10.563614 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 4 17:52:10.572569 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 4 17:52:10.604136 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 4 17:52:10.604440 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 4 17:52:10.606532 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 4 17:52:10.607994 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 4 17:52:10.609985 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 4 17:52:10.614498 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... 
Sep 4 17:52:10.641605 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 4 17:52:10.650505 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 4 17:52:10.669149 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 4 17:52:10.670430 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 4 17:52:10.673463 systemd[1]: Stopped target timers.target - Timer Units. Sep 4 17:52:10.676070 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 4 17:52:10.676400 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 4 17:52:10.679469 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 4 17:52:10.681305 systemd[1]: Stopped target basic.target - Basic System. Sep 4 17:52:10.684030 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 4 17:52:10.686567 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 4 17:52:10.688981 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 4 17:52:10.691797 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 4 17:52:10.694637 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 4 17:52:10.697482 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 4 17:52:10.700150 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 4 17:52:10.702817 systemd[1]: Stopped target swap.target - Swaps. Sep 4 17:52:10.705304 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 4 17:52:10.705596 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 4 17:52:10.708634 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 4 17:52:10.710540 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
Sep 4 17:52:10.712992 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 4 17:52:10.715305 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 4 17:52:10.717504 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 4 17:52:10.717801 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 4 17:52:10.721186 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 4 17:52:10.721551 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 4 17:52:10.724857 systemd[1]: ignition-files.service: Deactivated successfully. Sep 4 17:52:10.725127 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 4 17:52:10.735736 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 4 17:52:10.736818 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 4 17:52:10.737148 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 4 17:52:10.743540 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 4 17:52:10.744489 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 4 17:52:10.744673 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 17:52:10.752649 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 4 17:52:10.752794 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 4 17:52:10.761002 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 4 17:52:10.761094 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. 
Sep 4 17:52:10.769619 ignition[946]: INFO : Ignition 2.19.0
Sep 4 17:52:10.769619 ignition[946]: INFO : Stage: umount
Sep 4 17:52:10.772058 ignition[946]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 4 17:52:10.772058 ignition[946]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Sep 4 17:52:10.772058 ignition[946]: INFO : umount: umount passed
Sep 4 17:52:10.772058 ignition[946]: INFO : Ignition finished successfully
Sep 4 17:52:10.771920 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 4 17:52:10.772495 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 4 17:52:10.774677 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 4 17:52:10.775467 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 4 17:52:10.775542 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 4 17:52:10.776627 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 4 17:52:10.776673 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 4 17:52:10.777176 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 4 17:52:10.777216 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 4 17:52:10.779013 systemd[1]: Stopped target network.target - Network.
Sep 4 17:52:10.780944 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 4 17:52:10.781013 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 4 17:52:10.781952 systemd[1]: Stopped target paths.target - Path Units.
Sep 4 17:52:10.782847 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 4 17:52:10.784313 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 4 17:52:10.784889 systemd[1]: Stopped target slices.target - Slice Units.
Sep 4 17:52:10.785808 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 4 17:52:10.786921 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 4 17:52:10.786960 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 4 17:52:10.787980 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 4 17:52:10.788014 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 4 17:52:10.788913 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 4 17:52:10.788954 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 4 17:52:10.789980 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 4 17:52:10.790022 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 4 17:52:10.791431 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 4 17:52:10.792874 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 4 17:52:10.794006 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 4 17:52:10.794086 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 4 17:52:10.795127 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 4 17:52:10.795197 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 4 17:52:10.796665 systemd-networkd[704]: eth0: DHCPv6 lease lost
Sep 4 17:52:10.798233 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 4 17:52:10.798398 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 4 17:52:10.799383 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 4 17:52:10.799431 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 4 17:52:10.808440 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 4 17:52:10.809456 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 4 17:52:10.809516 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 4 17:52:10.810139 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 4 17:52:10.811056 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 4 17:52:10.811170 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 4 17:52:10.825643 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 4 17:52:10.825801 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 4 17:52:10.833091 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 4 17:52:10.833199 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 4 17:52:10.835687 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 4 17:52:10.835733 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 4 17:52:10.836416 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 4 17:52:10.836449 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 4 17:52:10.837487 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 4 17:52:10.837533 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 4 17:52:10.839072 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 4 17:52:10.839114 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 4 17:52:10.840217 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 4 17:52:10.840276 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 4 17:52:10.851434 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 4 17:52:10.852596 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 4 17:52:10.852651 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 4 17:52:10.853754 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 4 17:52:10.853796 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 4 17:52:10.854930 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 4 17:52:10.854972 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 4 17:52:10.856152 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 4 17:52:10.856192 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 4 17:52:10.857409 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 4 17:52:10.857467 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 4 17:52:10.859026 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 4 17:52:10.859123 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 4 17:52:10.860205 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 4 17:52:10.866667 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 4 17:52:10.873712 systemd[1]: Switching root.
Sep 4 17:52:10.907919 systemd-journald[184]: Journal stopped
Sep 4 17:52:12.831695 systemd-journald[184]: Received SIGTERM from PID 1 (systemd).
Sep 4 17:52:12.831770 kernel: SELinux: policy capability network_peer_controls=1
Sep 4 17:52:12.831785 kernel: SELinux: policy capability open_perms=1
Sep 4 17:52:12.831797 kernel: SELinux: policy capability extended_socket_class=1
Sep 4 17:52:12.831814 kernel: SELinux: policy capability always_check_network=0
Sep 4 17:52:12.831826 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 4 17:52:12.831845 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 4 17:52:12.831857 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 4 17:52:12.831870 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 4 17:52:12.831882 kernel: audit: type=1403 audit(1725472331.776:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 4 17:52:12.831895 systemd[1]: Successfully loaded SELinux policy in 73.132ms.
Sep 4 17:52:12.831909 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 10.455ms.
Sep 4 17:52:12.831926 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Sep 4 17:52:12.831939 systemd[1]: Detected virtualization kvm.
Sep 4 17:52:12.831955 systemd[1]: Detected architecture x86-64.
Sep 4 17:52:12.831968 systemd[1]: Detected first boot.
Sep 4 17:52:12.831980 systemd[1]: Hostname set to .
Sep 4 17:52:12.831993 systemd[1]: Initializing machine ID from VM UUID.
Sep 4 17:52:12.832005 zram_generator::config[988]: No configuration found.
Sep 4 17:52:12.832024 systemd[1]: Populated /etc with preset unit settings.
Sep 4 17:52:12.832037 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 4 17:52:12.832049 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 4 17:52:12.832064 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 4 17:52:12.832077 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 4 17:52:12.832090 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 4 17:52:12.832102 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 4 17:52:12.832115 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 4 17:52:12.832127 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 4 17:52:12.832144 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 4 17:52:12.832157 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 4 17:52:12.832169 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 4 17:52:12.832184 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 4 17:52:12.832197 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 4 17:52:12.832210 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 4 17:52:12.832223 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 4 17:52:12.832236 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 4 17:52:12.832249 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 4 17:52:12.832839 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 4 17:52:12.832856 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 4 17:52:12.832868 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 4 17:52:12.832884 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 4 17:52:12.832896 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 4 17:52:12.832908 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 4 17:52:12.832920 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 4 17:52:12.832932 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 4 17:52:12.832944 systemd[1]: Reached target slices.target - Slice Units.
Sep 4 17:52:12.832957 systemd[1]: Reached target swap.target - Swaps.
Sep 4 17:52:12.832969 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 4 17:52:12.832981 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 4 17:52:12.832993 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 4 17:52:12.833005 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 4 17:52:12.833019 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 4 17:52:12.833031 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 4 17:52:12.833043 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 4 17:52:12.833055 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 4 17:52:12.833068 systemd[1]: Mounting media.mount - External Media Directory...
Sep 4 17:52:12.833081 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 17:52:12.833092 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 4 17:52:12.833104 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 4 17:52:12.833116 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 4 17:52:12.833128 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 4 17:52:12.833140 systemd[1]: Reached target machines.target - Containers.
Sep 4 17:52:12.833152 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 4 17:52:12.833164 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 4 17:52:12.833178 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 4 17:52:12.833190 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 4 17:52:12.833202 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 4 17:52:12.833214 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 4 17:52:12.833226 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 4 17:52:12.833237 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 4 17:52:12.833249 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 4 17:52:12.834044 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 4 17:52:12.834065 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 4 17:52:12.834080 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 4 17:52:12.834092 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 4 17:52:12.834104 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 4 17:52:12.834116 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 4 17:52:12.834127 kernel: loop: module loaded
Sep 4 17:52:12.834139 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 4 17:52:12.834151 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 4 17:52:12.834163 kernel: fuse: init (API version 7.39)
Sep 4 17:52:12.834190 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 4 17:52:12.834203 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 4 17:52:12.834215 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 4 17:52:12.834227 systemd[1]: Stopped verity-setup.service.
Sep 4 17:52:12.834239 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 17:52:12.834251 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 4 17:52:12.834330 systemd-journald[1080]: Collecting audit messages is disabled.
Sep 4 17:52:12.834355 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 4 17:52:12.834370 kernel: ACPI: bus type drm_connector registered
Sep 4 17:52:12.834382 systemd[1]: Mounted media.mount - External Media Directory.
Sep 4 17:52:12.834394 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 4 17:52:12.834406 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 4 17:52:12.834420 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 4 17:52:12.834433 systemd-journald[1080]: Journal started
Sep 4 17:52:12.834458 systemd-journald[1080]: Runtime Journal (/run/log/journal/fe0e33dedfb845d2b36d2f8bf276f3fe) is 4.9M, max 39.3M, 34.4M free.
Sep 4 17:52:12.479207 systemd[1]: Queued start job for default target multi-user.target.
Sep 4 17:52:12.509540 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Sep 4 17:52:12.509961 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 4 17:52:12.838325 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 4 17:52:12.840291 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 4 17:52:12.840906 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 4 17:52:12.841074 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 4 17:52:12.841811 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 4 17:52:12.842307 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 4 17:52:12.843054 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 4 17:52:12.843186 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 4 17:52:12.845192 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 4 17:52:12.845974 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 4 17:52:12.846109 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 4 17:52:12.847191 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 4 17:52:12.847486 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 4 17:52:12.848484 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 4 17:52:12.848656 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 4 17:52:12.849764 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 4 17:52:12.850803 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 4 17:52:12.851784 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 4 17:52:12.862985 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 4 17:52:12.869057 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 4 17:52:12.872947 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 4 17:52:12.873616 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 4 17:52:12.873711 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 4 17:52:12.875429 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Sep 4 17:52:12.881304 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 4 17:52:12.883394 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 4 17:52:12.885083 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 4 17:52:12.892450 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 4 17:52:12.903442 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 4 17:52:12.904061 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 4 17:52:12.911452 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 4 17:52:12.912029 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 4 17:52:12.916522 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 4 17:52:12.923489 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 4 17:52:12.926147 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 4 17:52:12.934219 systemd-journald[1080]: Time spent on flushing to /var/log/journal/fe0e33dedfb845d2b36d2f8bf276f3fe is 69.409ms for 933 entries.
Sep 4 17:52:12.934219 systemd-journald[1080]: System Journal (/var/log/journal/fe0e33dedfb845d2b36d2f8bf276f3fe) is 8.0M, max 584.8M, 576.8M free.
Sep 4 17:52:13.070238 systemd-journald[1080]: Received client request to flush runtime journal.
Sep 4 17:52:13.070331 kernel: loop0: detected capacity change from 0 to 89336
Sep 4 17:52:13.070358 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 4 17:52:12.929561 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 4 17:52:12.930375 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 4 17:52:12.931404 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 4 17:52:12.932553 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 4 17:52:12.946501 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Sep 4 17:52:12.956932 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 4 17:52:12.957898 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 4 17:52:12.961510 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Sep 4 17:52:13.008876 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 4 17:52:13.011893 udevadm[1126]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
Sep 4 17:52:13.074548 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 4 17:52:13.086483 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 4 17:52:13.088574 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Sep 4 17:52:13.092287 kernel: loop1: detected capacity change from 0 to 8
Sep 4 17:52:13.093447 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 4 17:52:13.102304 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 4 17:52:13.120303 kernel: loop2: detected capacity change from 0 to 209816
Sep 4 17:52:13.159489 systemd-tmpfiles[1141]: ACLs are not supported, ignoring.
Sep 4 17:52:13.159512 systemd-tmpfiles[1141]: ACLs are not supported, ignoring.
Sep 4 17:52:13.164950 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 4 17:52:13.186299 kernel: loop3: detected capacity change from 0 to 140728
Sep 4 17:52:13.258282 kernel: loop4: detected capacity change from 0 to 89336
Sep 4 17:52:13.295299 kernel: loop5: detected capacity change from 0 to 8
Sep 4 17:52:13.307319 kernel: loop6: detected capacity change from 0 to 209816
Sep 4 17:52:13.361796 kernel: loop7: detected capacity change from 0 to 140728
Sep 4 17:52:13.421008 (sd-merge)[1146]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'.
Sep 4 17:52:13.422719 (sd-merge)[1146]: Merged extensions into '/usr'.
Sep 4 17:52:13.426418 systemd[1]: Reloading requested from client PID 1120 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 4 17:52:13.426435 systemd[1]: Reloading...
Sep 4 17:52:13.511712 zram_generator::config[1167]: No configuration found.
Sep 4 17:52:13.753325 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 4 17:52:13.854796 systemd[1]: Reloading finished in 427 ms.
Sep 4 17:52:13.886888 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 4 17:52:13.888731 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 4 17:52:13.897779 systemd[1]: Starting ensure-sysext.service...
Sep 4 17:52:13.899661 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 4 17:52:13.910485 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 4 17:52:13.915565 systemd[1]: Reloading requested from client PID 1226 ('systemctl') (unit ensure-sysext.service)...
Sep 4 17:52:13.915586 systemd[1]: Reloading...
Sep 4 17:52:13.916999 ldconfig[1115]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 4 17:52:13.947448 systemd-tmpfiles[1227]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 4 17:52:13.947863 systemd-tmpfiles[1227]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 4 17:52:13.948803 systemd-tmpfiles[1227]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 4 17:52:13.949142 systemd-tmpfiles[1227]: ACLs are not supported, ignoring.
Sep 4 17:52:13.949217 systemd-tmpfiles[1227]: ACLs are not supported, ignoring.
Sep 4 17:52:13.953470 systemd-tmpfiles[1227]: Detected autofs mount point /boot during canonicalization of boot.
Sep 4 17:52:13.953483 systemd-tmpfiles[1227]: Skipping /boot
Sep 4 17:52:13.958661 systemd-udevd[1228]: Using default interface naming scheme 'v255'.
Sep 4 17:52:13.969704 systemd-tmpfiles[1227]: Detected autofs mount point /boot during canonicalization of boot.
Sep 4 17:52:13.969721 systemd-tmpfiles[1227]: Skipping /boot
Sep 4 17:52:14.021306 zram_generator::config[1255]: No configuration found.
Sep 4 17:52:14.064553 kernel: BTRFS info: devid 1 device path /dev/mapper/usr changed to /dev/dm-0 scanned by (udev-worker) (1260)
Sep 4 17:52:14.078352 kernel: BTRFS info: devid 1 device path /dev/dm-0 changed to /dev/mapper/usr scanned by (udev-worker) (1260)
Sep 4 17:52:14.114860 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 37 scanned by (udev-worker) (1272)
Sep 4 17:52:14.173330 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Sep 4 17:52:14.202286 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2
Sep 4 17:52:14.212285 kernel: ACPI: button: Power Button [PWRF]
Sep 4 17:52:14.268291 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
Sep 4 17:52:14.285346 kernel: mousedev: PS/2 mouse device common for all mice
Sep 4 17:52:14.285400 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 4 17:52:14.299134 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Sep 4 17:52:14.299189 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Sep 4 17:52:14.306375 kernel: Console: switching to colour dummy device 80x25
Sep 4 17:52:14.306454 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Sep 4 17:52:14.306475 kernel: [drm] features: -context_init
Sep 4 17:52:14.307564 kernel: [drm] number of scanouts: 1
Sep 4 17:52:14.308320 kernel: [drm] number of cap sets: 0
Sep 4 17:52:14.312275 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:02.0 on minor 0
Sep 4 17:52:14.317285 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Sep 4 17:52:14.319282 kernel: Console: switching to colour frame buffer device 128x48
Sep 4 17:52:14.323303 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Sep 4 17:52:14.368597 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 4 17:52:14.370379 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Sep 4 17:52:14.370510 systemd[1]: Reloading finished in 454 ms.
Sep 4 17:52:14.385815 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 4 17:52:14.387789 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 4 17:52:14.388124 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 4 17:52:14.433086 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 17:52:14.444595 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Sep 4 17:52:14.450102 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 4 17:52:14.450723 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 4 17:52:14.458545 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 4 17:52:14.462149 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 4 17:52:14.468407 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 4 17:52:14.469893 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 4 17:52:14.477246 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 4 17:52:14.487139 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 4 17:52:14.490667 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 4 17:52:14.514288 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 4 17:52:14.528828 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 4 17:52:14.533940 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 4 17:52:14.536577 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 17:52:14.542871 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 4 17:52:14.543081 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 4 17:52:14.546789 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 4 17:52:14.547985 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 4 17:52:14.549603 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 4 17:52:14.550381 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 4 17:52:14.554006 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 4 17:52:14.570410 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 17:52:14.570847 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 4 17:52:14.578547 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 4 17:52:14.582799 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 4 17:52:14.593556 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 4 17:52:14.598874 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 4 17:52:14.601512 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 4 17:52:14.623114 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 4 17:52:14.624019 augenrules[1377]: No rules
Sep 4 17:52:14.626759 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 4 17:52:14.633287 systemd[1]: Finished ensure-sysext.service.
Sep 4 17:52:14.639246 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Sep 4 17:52:14.644673 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 4 17:52:14.645981 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 4 17:52:14.646168 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 4 17:52:14.650720 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 4 17:52:14.650910 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 4 17:52:14.652089 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 4 17:52:14.652241 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 4 17:52:14.653495 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 4 17:52:14.653650 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 4 17:52:14.657131 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 4 17:52:14.657300 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 4 17:52:14.668808 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 4 17:52:14.679012 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Sep 4 17:52:14.679987 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 4 17:52:14.688075 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 4 17:52:14.720497 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Sep 4 17:52:14.722222 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 4 17:52:14.722334 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 4 17:52:14.732451 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 4 17:52:14.741543 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 4 17:52:14.752574 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 4 17:52:14.754248 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 4 17:52:14.759744 lvm[1399]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Sep 4 17:52:14.794239 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 4 17:52:14.799031 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Sep 4 17:52:14.805478 systemd-resolved[1352]: Positive Trust Anchors:
Sep 4 17:52:14.805495 systemd-resolved[1352]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 4 17:52:14.805538 systemd-resolved[1352]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 4 17:52:14.808992 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 4 17:52:14.815689 systemd-resolved[1352]: Using system hostname 'ci-4054-1-0-c-33e05803e0.novalocal'.
Sep 4 17:52:14.825506 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Sep 4 17:52:14.826251 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 4 17:52:14.826968 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 4 17:52:14.843719 lvm[1406]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Sep 4 17:52:14.855385 systemd-networkd[1348]: lo: Link UP
Sep 4 17:52:14.855396 systemd-networkd[1348]: lo: Gained carrier
Sep 4 17:52:14.858637 systemd-networkd[1348]: Enumeration completed
Sep 4 17:52:14.859147 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 4 17:52:14.860981 systemd[1]: Reached target network.target - Network.
Sep 4 17:52:14.864717 systemd-networkd[1348]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 4 17:52:14.864730 systemd-networkd[1348]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 4 17:52:14.867421 systemd-networkd[1348]: eth0: Link UP
Sep 4 17:52:14.867432 systemd-networkd[1348]: eth0: Gained carrier
Sep 4 17:52:14.867450 systemd-networkd[1348]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 4 17:52:14.870455 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 4 17:52:14.875970 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Sep 4 17:52:14.883545 systemd-networkd[1348]: eth0: DHCPv4 address 172.24.4.122/24, gateway 172.24.4.1 acquired from 172.24.4.1
Sep 4 17:52:14.890980 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 4 17:52:14.891849 systemd[1]: Reached target time-set.target - System Time Set.
Sep 4 17:52:14.911615 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 4 17:52:14.914244 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 4 17:52:14.919082 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 4 17:52:14.923596 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 4 17:52:14.931243 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 4 17:52:14.933671 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 4 17:52:14.936924 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 4 17:52:14.937804 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 4 17:52:14.937852 systemd[1]: Reached target paths.target - Path Units.
Sep 4 17:52:14.940670 systemd[1]: Reached target timers.target - Timer Units.
Sep 4 17:52:14.943860 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 4 17:52:14.947178 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 4 17:52:14.954488 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 4 17:52:14.958025 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 4 17:52:14.961648 systemd[1]: Reached target sockets.target - Socket Units.
Sep 4 17:52:14.964964 systemd[1]: Reached target basic.target - Basic System.
Sep 4 17:52:14.968176 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 4 17:52:14.968312 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 4 17:52:14.973361 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 4 17:52:14.984876 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 4 17:52:15.004554 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 4 17:52:15.023442 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 4 17:52:15.029413 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 4 17:52:15.032155 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 4 17:52:15.039284 jq[1422]: false
Sep 4 17:52:15.043053 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 4 17:52:15.047975 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 4 17:52:15.055542 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 4 17:52:15.060446 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 4 17:52:15.076336 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 4 17:52:15.077657 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 4 17:52:15.080336 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 4 17:52:15.083388 extend-filesystems[1423]: Found loop4
Sep 4 17:52:15.083388 extend-filesystems[1423]: Found loop5
Sep 4 17:52:15.083388 extend-filesystems[1423]: Found loop6
Sep 4 17:52:15.083388 extend-filesystems[1423]: Found loop7
Sep 4 17:52:15.083388 extend-filesystems[1423]: Found vda
Sep 4 17:52:15.083388 extend-filesystems[1423]: Found vda1
Sep 4 17:52:15.083388 extend-filesystems[1423]: Found vda2
Sep 4 17:52:15.083388 extend-filesystems[1423]: Found vda3
Sep 4 17:52:15.083388 extend-filesystems[1423]: Found usr
Sep 4 17:52:15.083388 extend-filesystems[1423]: Found vda4
Sep 4 17:52:15.083388 extend-filesystems[1423]: Found vda6
Sep 4 17:52:15.083388 extend-filesystems[1423]: Found vda7
Sep 4 17:52:15.083388 extend-filesystems[1423]: Found vda9
Sep 4 17:52:15.083388 extend-filesystems[1423]: Checking size of /dev/vda9
Sep 4 17:52:15.091138 dbus-daemon[1421]: [system] SELinux support is enabled
Sep 4 17:52:15.083473 systemd[1]: Starting update-engine.service - Update Engine...
Sep 4 17:52:15.103090 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 4 17:52:15.115854 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 4 17:52:16.327360 jq[1432]: true
Sep 4 17:52:15.140941 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 4 17:52:16.310078 systemd-timesyncd[1400]: Contacted time server 129.250.35.250:123 (0.flatcar.pool.ntp.org).
Sep 4 17:52:16.310147 systemd-timesyncd[1400]: Initial clock synchronization to Wed 2024-09-04 17:52:16.309834 UTC.
Sep 4 17:52:16.310271 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 4 17:52:16.310399 systemd-resolved[1352]: Clock change detected. Flushing caches.
Sep 4 17:52:16.329473 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 4 17:52:16.329551 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 4 17:52:16.340143 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 4 17:52:16.340189 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 4 17:52:16.343951 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 4 17:52:16.345342 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 4 17:52:16.369848 systemd[1]: motdgen.service: Deactivated successfully.
Sep 4 17:52:16.371873 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 4 17:52:16.375421 jq[1447]: true
Sep 4 17:52:16.380681 extend-filesystems[1423]: Resized partition /dev/vda9
Sep 4 17:52:16.383247 update_engine[1430]: I0904 17:52:16.381275  1430 main.cc:92] Flatcar Update Engine starting
Sep 4 17:52:16.385901 (ntainerd)[1449]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 4 17:52:16.396017 systemd[1]: Started update-engine.service - Update Engine.
Sep 4 17:52:16.398144 extend-filesystems[1459]: resize2fs 1.47.1 (20-May-2024)
Sep 4 17:52:16.411006 update_engine[1430]: I0904 17:52:16.396402  1430 update_check_scheduler.cc:74] Next update check in 5m10s
Sep 4 17:52:16.403359 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 4 17:52:16.411182 tar[1440]: linux-amd64/helm
Sep 4 17:52:16.424172 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 4 17:52:16.434338 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 4635643 blocks
Sep 4 17:52:16.475515 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 37 scanned by (udev-worker) (1270)
Sep 4 17:52:16.514277 systemd-logind[1429]: New seat seat0.
Sep 4 17:52:16.526759 systemd-logind[1429]: Watching system buttons on /dev/input/event1 (Power Button)
Sep 4 17:52:16.526840 systemd-logind[1429]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Sep 4 17:52:16.527204 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 4 17:52:16.639505 kernel: EXT4-fs (vda9): resized filesystem to 4635643
Sep 4 17:52:16.677170 locksmithd[1461]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 4 17:52:16.763752 bash[1476]: Updated "/home/core/.ssh/authorized_keys"
Sep 4 17:52:16.764646 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 4 17:52:16.767223 extend-filesystems[1459]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Sep 4 17:52:16.767223 extend-filesystems[1459]: old_desc_blocks = 1, new_desc_blocks = 3
Sep 4 17:52:16.767223 extend-filesystems[1459]: The filesystem on /dev/vda9 is now 4635643 (4k) blocks long.
Sep 4 17:52:16.777582 extend-filesystems[1423]: Resized filesystem in /dev/vda9
Sep 4 17:52:16.769748 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 4 17:52:16.772383 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 4 17:52:16.791415 sshd_keygen[1448]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 4 17:52:16.795319 systemd[1]: Starting sshkeys.service...
Sep 4 17:52:16.826898 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Sep 4 17:52:16.836653 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Sep 4 17:52:16.845939 containerd[1449]: time="2024-09-04T17:52:16.845757445Z" level=info msg="starting containerd" revision=8ccfc03e4e2b73c22899202ae09d0caf906d3863 version=v1.7.20
Sep 4 17:52:16.872912 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 4 17:52:16.890187 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 4 17:52:16.895244 systemd[1]: Started sshd@0-172.24.4.122:22-172.24.4.1:49396.service - OpenSSH per-connection server daemon (172.24.4.1:49396).
Sep 4 17:52:16.915909 systemd[1]: issuegen.service: Deactivated successfully.
Sep 4 17:52:16.916104 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 4 17:52:16.935226 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 4 17:52:16.942963 containerd[1449]: time="2024-09-04T17:52:16.942860582Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Sep 4 17:52:16.948907 containerd[1449]: time="2024-09-04T17:52:16.948665369Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.48-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Sep 4 17:52:16.948907 containerd[1449]: time="2024-09-04T17:52:16.948714892Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Sep 4 17:52:16.948907 containerd[1449]: time="2024-09-04T17:52:16.948737945Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Sep 4 17:52:16.949046 containerd[1449]: time="2024-09-04T17:52:16.948958929Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Sep 4 17:52:16.949046 containerd[1449]: time="2024-09-04T17:52:16.948985860Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Sep 4 17:52:16.949108 containerd[1449]: time="2024-09-04T17:52:16.949076670Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Sep 4 17:52:16.949108 containerd[1449]: time="2024-09-04T17:52:16.949094073Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Sep 4 17:52:16.952513 containerd[1449]: time="2024-09-04T17:52:16.950012335Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Sep 4 17:52:16.952513 containerd[1449]: time="2024-09-04T17:52:16.950040368Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Sep 4 17:52:16.952513 containerd[1449]: time="2024-09-04T17:52:16.950058201Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Sep 4 17:52:16.952513 containerd[1449]: time="2024-09-04T17:52:16.950071105Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Sep 4 17:52:16.952513 containerd[1449]: time="2024-09-04T17:52:16.950163699Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Sep 4 17:52:16.952513 containerd[1449]: time="2024-09-04T17:52:16.950434407Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Sep 4 17:52:16.952513 containerd[1449]: time="2024-09-04T17:52:16.950899068Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Sep 4 17:52:16.952513 containerd[1449]: time="2024-09-04T17:52:16.950918004Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Sep 4 17:52:16.952513 containerd[1449]: time="2024-09-04T17:52:16.951013192Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Sep 4 17:52:16.952513 containerd[1449]: time="2024-09-04T17:52:16.951075970Z" level=info msg="metadata content store policy set" policy=shared
Sep 4 17:52:16.962103 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 4 17:52:16.973683 containerd[1449]: time="2024-09-04T17:52:16.973610974Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Sep 4 17:52:16.974974 containerd[1449]: time="2024-09-04T17:52:16.974909510Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Sep 4 17:52:16.975014 containerd[1449]: time="2024-09-04T17:52:16.974998537Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Sep 4 17:52:16.975121 containerd[1449]: time="2024-09-04T17:52:16.975073868Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Sep 4 17:52:16.975195 containerd[1449]: time="2024-09-04T17:52:16.975166051Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Sep 4 17:52:16.976132 containerd[1449]: time="2024-09-04T17:52:16.976088641Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Sep 4 17:52:16.977237 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 4 17:52:16.984256 containerd[1449]: time="2024-09-04T17:52:16.984187591Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Sep 4 17:52:16.986082 containerd[1449]: time="2024-09-04T17:52:16.986012974Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Sep 4 17:52:16.986139 containerd[1449]: time="2024-09-04T17:52:16.986086552Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Sep 4 17:52:16.986165 containerd[1449]: time="2024-09-04T17:52:16.986118202Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Sep 4 17:52:16.986222 containerd[1449]: time="2024-09-04T17:52:16.986189816Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Sep 4 17:52:16.986286 containerd[1449]: time="2024-09-04T17:52:16.986256230Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Sep 4 17:52:16.986342 containerd[1449]: time="2024-09-04T17:52:16.986293961Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Sep 4 17:52:16.986397 containerd[1449]: time="2024-09-04T17:52:16.986367399Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Sep 4 17:52:16.986460 containerd[1449]: time="2024-09-04T17:52:16.986429856Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Sep 4 17:52:16.986487 containerd[1449]: time="2024-09-04T17:52:16.986469651Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Sep 4 17:52:16.986511 containerd[1449]: time="2024-09-04T17:52:16.986496862Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Sep 4 17:52:16.986540 containerd[1449]: time="2024-09-04T17:52:16.986525596Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Sep 4 17:52:16.986601 containerd[1449]: time="2024-09-04T17:52:16.986570750Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.986630 containerd[1449]: time="2024-09-04T17:52:16.986613330Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.986658 containerd[1449]: time="2024-09-04T17:52:16.986640531Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.987816 containerd[1449]: time="2024-09-04T17:52:16.986672211Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.987816 containerd[1449]: time="2024-09-04T17:52:16.986719980Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.987816 containerd[1449]: time="2024-09-04T17:52:16.986748985Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.987911 containerd[1449]: time="2024-09-04T17:52:16.986776416Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.988059 containerd[1449]: time="2024-09-04T17:52:16.987904051Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.988059 containerd[1449]: time="2024-09-04T17:52:16.987995522Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.991748 containerd[1449]: time="2024-09-04T17:52:16.988073549Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.991748 containerd[1449]: time="2024-09-04T17:52:16.988105749Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.991748 containerd[1449]: time="2024-09-04T17:52:16.988168186Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.991748 containerd[1449]: time="2024-09-04T17:52:16.988199365Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.991748 containerd[1449]: time="2024-09-04T17:52:16.988269556Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Sep 4 17:52:16.991748 containerd[1449]: time="2024-09-04T17:52:16.988355227Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.991748 containerd[1449]: time="2024-09-04T17:52:16.988387227Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.991748 containerd[1449]: time="2024-09-04T17:52:16.988438583Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Sep 4 17:52:16.991748 containerd[1449]: time="2024-09-04T17:52:16.988568888Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Sep 4 17:52:16.991748 containerd[1449]: time="2024-09-04T17:52:16.988950634Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Sep 4 17:52:16.991748 containerd[1449]: time="2024-09-04T17:52:16.988983866Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Sep 4 17:52:16.991748 containerd[1449]: time="2024-09-04T17:52:16.989150489Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Sep 4 17:52:16.990597 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Sep 4 17:52:16.993712 containerd[1449]: time="2024-09-04T17:52:16.989173482Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.993712 containerd[1449]: time="2024-09-04T17:52:16.992851430Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Sep 4 17:52:16.993712 containerd[1449]: time="2024-09-04T17:52:16.992920760Z" level=info msg="NRI interface is disabled by configuration."
Sep 4 17:52:16.993522 systemd[1]: Reached target getty.target - Login Prompts.
Sep 4 17:52:16.996000 containerd[1449]: time="2024-09-04T17:52:16.992944615Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Sep 4 17:52:16.996769 containerd[1449]: time="2024-09-04T17:52:16.996643802Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Sep 4 17:52:17.001255 containerd[1449]: time="2024-09-04T17:52:17.000854359Z" level=info msg="Connect containerd service"
Sep 4 17:52:17.001777 containerd[1449]: time="2024-09-04T17:52:17.001737245Z" level=info msg="using legacy CRI server"
Sep 4 17:52:17.001832 containerd[1449]: time="2024-09-04T17:52:17.001772682Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 4 17:52:17.002094 containerd[1449]: time="2024-09-04T17:52:17.002057746Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Sep 4 17:52:17.003269 containerd[1449]: time="2024-09-04T17:52:17.003217912Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 4 17:52:17.003523 containerd[1449]: time="2024-09-04T17:52:17.003407488Z" level=info msg="Start subscribing containerd event"
Sep 4 17:52:17.003523 containerd[1449]: time="2024-09-04T17:52:17.003482448Z" level=info msg="Start recovering state"
Sep 4 17:52:17.003909 containerd[1449]: time="2024-09-04T17:52:17.003597254Z" level=info msg="Start event monitor"
Sep 4 17:52:17.003909 containerd[1449]: time="2024-09-04T17:52:17.003630857Z" level=info msg="Start snapshots syncer"
Sep 4 17:52:17.003909 containerd[1449]: time="2024-09-04T17:52:17.003647838Z" level=info msg="Start cni network conf syncer for default"
Sep 4 17:52:17.003909 containerd[1449]: time="2024-09-04T17:52:17.003663257Z" level=info msg="Start streaming server"
Sep 4 17:52:17.004865 containerd[1449]: time="2024-09-04T17:52:17.004828743Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 4 17:52:17.005028 containerd[1449]: time="2024-09-04T17:52:17.004997780Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 4 17:52:17.005253 systemd[1]: Started containerd.service - containerd container runtime.
Sep 4 17:52:17.007080 containerd[1449]: time="2024-09-04T17:52:17.006624942Z" level=info msg="containerd successfully booted in 0.167882s"
Sep 4 17:52:17.311315 tar[1440]: linux-amd64/LICENSE
Sep 4 17:52:17.311532 tar[1440]: linux-amd64/README.md
Sep 4 17:52:17.325468 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 4 17:52:17.641175 systemd-networkd[1348]: eth0: Gained IPv6LL
Sep 4 17:52:17.647064 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 4 17:52:17.652371 systemd[1]: Reached target network-online.target - Network is Online. Sep 4 17:52:17.689355 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 17:52:17.706997 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 4 17:52:17.764550 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 4 17:52:17.892054 sshd[1505]: Accepted publickey for core from 172.24.4.1 port 49396 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:52:17.898650 sshd[1505]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:52:17.922580 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 4 17:52:17.933148 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 4 17:52:17.941062 systemd-logind[1429]: New session 1 of user core. Sep 4 17:52:17.961097 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 4 17:52:17.976178 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 4 17:52:17.983586 (systemd)[1534]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 4 17:52:18.111248 systemd[1534]: Queued start job for default target default.target. Sep 4 17:52:18.114824 systemd[1534]: Created slice app.slice - User Application Slice. Sep 4 17:52:18.114852 systemd[1534]: Reached target paths.target - Paths. Sep 4 17:52:18.114867 systemd[1534]: Reached target timers.target - Timers. Sep 4 17:52:18.117062 systemd[1534]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 4 17:52:18.161191 systemd[1534]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 4 17:52:18.161331 systemd[1534]: Reached target sockets.target - Sockets. Sep 4 17:52:18.161360 systemd[1534]: Reached target basic.target - Basic System. Sep 4 17:52:18.161402 systemd[1534]: Reached target default.target - Main User Target. 
Sep 4 17:52:18.161438 systemd[1534]: Startup finished in 170ms. Sep 4 17:52:18.161513 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 4 17:52:18.170080 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 4 17:52:18.696046 systemd[1]: Started sshd@1-172.24.4.122:22-172.24.4.1:49400.service - OpenSSH per-connection server daemon (172.24.4.1:49400). Sep 4 17:52:19.403150 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 17:52:19.418467 (kubelet)[1551]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 17:52:20.476269 sshd[1545]: Accepted publickey for core from 172.24.4.1 port 49400 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:52:20.478996 sshd[1545]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:52:20.491359 systemd-logind[1429]: New session 2 of user core. Sep 4 17:52:20.499293 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 4 17:52:20.769712 kubelet[1551]: E0904 17:52:20.769461 1551 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 17:52:20.772171 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 17:52:20.772493 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 17:52:20.773307 systemd[1]: kubelet.service: Consumed 1.812s CPU time. Sep 4 17:52:21.148449 sshd[1545]: pam_unix(sshd:session): session closed for user core Sep 4 17:52:21.158270 systemd[1]: sshd@1-172.24.4.122:22-172.24.4.1:49400.service: Deactivated successfully. Sep 4 17:52:21.160560 systemd[1]: session-2.scope: Deactivated successfully. 
Sep 4 17:52:21.162561 systemd-logind[1429]: Session 2 logged out. Waiting for processes to exit. Sep 4 17:52:21.169293 systemd[1]: Started sshd@2-172.24.4.122:22-172.24.4.1:49406.service - OpenSSH per-connection server daemon (172.24.4.1:49406). Sep 4 17:52:21.173030 systemd-logind[1429]: Removed session 2. Sep 4 17:52:22.054716 login[1514]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 4 17:52:22.065466 login[1515]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 4 17:52:22.070120 systemd-logind[1429]: New session 3 of user core. Sep 4 17:52:22.078407 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 4 17:52:22.085681 systemd-logind[1429]: New session 4 of user core. Sep 4 17:52:22.101606 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 4 17:52:22.360325 sshd[1567]: Accepted publickey for core from 172.24.4.1 port 49406 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:52:22.364015 sshd[1567]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:52:22.373972 systemd-logind[1429]: New session 5 of user core. Sep 4 17:52:22.386510 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 4 17:52:23.010052 sshd[1567]: pam_unix(sshd:session): session closed for user core Sep 4 17:52:23.016004 systemd[1]: sshd@2-172.24.4.122:22-172.24.4.1:49406.service: Deactivated successfully. Sep 4 17:52:23.020153 systemd[1]: session-5.scope: Deactivated successfully. Sep 4 17:52:23.023264 systemd-logind[1429]: Session 5 logged out. Waiting for processes to exit. Sep 4 17:52:23.027016 systemd-logind[1429]: Removed session 5. 
Sep 4 17:52:23.241497 coreos-metadata[1418]: Sep 04 17:52:23.241 WARN failed to locate config-drive, using the metadata service API instead Sep 4 17:52:23.316425 coreos-metadata[1418]: Sep 04 17:52:23.316 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Sep 4 17:52:23.527137 coreos-metadata[1418]: Sep 04 17:52:23.527 INFO Fetch successful Sep 4 17:52:23.527137 coreos-metadata[1418]: Sep 04 17:52:23.527 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Sep 4 17:52:23.542295 coreos-metadata[1418]: Sep 04 17:52:23.542 INFO Fetch successful Sep 4 17:52:23.542295 coreos-metadata[1418]: Sep 04 17:52:23.542 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Sep 4 17:52:23.557172 coreos-metadata[1418]: Sep 04 17:52:23.557 INFO Fetch successful Sep 4 17:52:23.557172 coreos-metadata[1418]: Sep 04 17:52:23.557 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Sep 4 17:52:23.573146 coreos-metadata[1418]: Sep 04 17:52:23.572 INFO Fetch successful Sep 4 17:52:23.573355 coreos-metadata[1418]: Sep 04 17:52:23.573 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Sep 4 17:52:23.590031 coreos-metadata[1418]: Sep 04 17:52:23.589 INFO Fetch successful Sep 4 17:52:23.590031 coreos-metadata[1418]: Sep 04 17:52:23.589 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Sep 4 17:52:23.605949 coreos-metadata[1418]: Sep 04 17:52:23.605 INFO Fetch successful Sep 4 17:52:23.649105 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 4 17:52:23.650550 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
Sep 4 17:52:23.971614 coreos-metadata[1498]: Sep 04 17:52:23.971 WARN failed to locate config-drive, using the metadata service API instead Sep 4 17:52:24.018564 coreos-metadata[1498]: Sep 04 17:52:24.018 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Sep 4 17:52:24.036665 coreos-metadata[1498]: Sep 04 17:52:24.036 INFO Fetch successful Sep 4 17:52:24.036665 coreos-metadata[1498]: Sep 04 17:52:24.036 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Sep 4 17:52:24.052594 coreos-metadata[1498]: Sep 04 17:52:24.052 INFO Fetch successful Sep 4 17:52:24.057097 unknown[1498]: wrote ssh authorized keys file for user: core Sep 4 17:52:24.094729 update-ssh-keys[1609]: Updated "/home/core/.ssh/authorized_keys" Sep 4 17:52:24.097632 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 4 17:52:24.100901 systemd[1]: Finished sshkeys.service. Sep 4 17:52:24.105248 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 4 17:52:24.105382 systemd[1]: Startup finished in 1.063s (kernel) + 14.969s (initrd) + 11.232s (userspace) = 27.265s. Sep 4 17:52:31.023435 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 4 17:52:31.037223 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 17:52:31.378037 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 4 17:52:31.379234 (kubelet)[1621]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 17:52:31.657370 kubelet[1621]: E0904 17:52:31.657146 1621 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 17:52:31.665862 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 17:52:31.666192 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 17:52:33.039991 systemd[1]: Started sshd@3-172.24.4.122:22-172.24.4.1:54280.service - OpenSSH per-connection server daemon (172.24.4.1:54280). Sep 4 17:52:34.583650 sshd[1630]: Accepted publickey for core from 172.24.4.1 port 54280 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:52:34.586711 sshd[1630]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:52:34.599296 systemd-logind[1429]: New session 6 of user core. Sep 4 17:52:34.607306 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 4 17:52:35.233254 sshd[1630]: pam_unix(sshd:session): session closed for user core Sep 4 17:52:35.247629 systemd[1]: sshd@3-172.24.4.122:22-172.24.4.1:54280.service: Deactivated successfully. Sep 4 17:52:35.251194 systemd[1]: session-6.scope: Deactivated successfully. Sep 4 17:52:35.255179 systemd-logind[1429]: Session 6 logged out. Waiting for processes to exit. Sep 4 17:52:35.261341 systemd[1]: Started sshd@4-172.24.4.122:22-172.24.4.1:53894.service - OpenSSH per-connection server daemon (172.24.4.1:53894). Sep 4 17:52:35.264268 systemd-logind[1429]: Removed session 6. 
Sep 4 17:52:36.910367 sshd[1637]: Accepted publickey for core from 172.24.4.1 port 53894 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:52:36.914009 sshd[1637]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:52:36.925424 systemd-logind[1429]: New session 7 of user core. Sep 4 17:52:36.934085 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 4 17:52:37.562439 sshd[1637]: pam_unix(sshd:session): session closed for user core Sep 4 17:52:37.573779 systemd[1]: sshd@4-172.24.4.122:22-172.24.4.1:53894.service: Deactivated successfully. Sep 4 17:52:37.578393 systemd[1]: session-7.scope: Deactivated successfully. Sep 4 17:52:37.580454 systemd-logind[1429]: Session 7 logged out. Waiting for processes to exit. Sep 4 17:52:37.588389 systemd[1]: Started sshd@5-172.24.4.122:22-172.24.4.1:53910.service - OpenSSH per-connection server daemon (172.24.4.1:53910). Sep 4 17:52:37.592317 systemd-logind[1429]: Removed session 7. Sep 4 17:52:39.108562 sshd[1644]: Accepted publickey for core from 172.24.4.1 port 53910 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:52:39.111627 sshd[1644]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:52:39.121595 systemd-logind[1429]: New session 8 of user core. Sep 4 17:52:39.134114 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 4 17:52:40.070348 sshd[1644]: pam_unix(sshd:session): session closed for user core Sep 4 17:52:40.085189 systemd[1]: sshd@5-172.24.4.122:22-172.24.4.1:53910.service: Deactivated successfully. Sep 4 17:52:40.089135 systemd[1]: session-8.scope: Deactivated successfully. Sep 4 17:52:40.093139 systemd-logind[1429]: Session 8 logged out. Waiting for processes to exit. Sep 4 17:52:40.100382 systemd[1]: Started sshd@6-172.24.4.122:22-172.24.4.1:53912.service - OpenSSH per-connection server daemon (172.24.4.1:53912). Sep 4 17:52:40.103553 systemd-logind[1429]: Removed session 8. 
Sep 4 17:52:41.470044 sshd[1651]: Accepted publickey for core from 172.24.4.1 port 53912 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:52:41.473379 sshd[1651]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:52:41.483516 systemd-logind[1429]: New session 9 of user core. Sep 4 17:52:41.495122 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 4 17:52:41.680665 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 4 17:52:41.689317 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 17:52:42.181466 sudo[1657]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 4 17:52:42.182305 sudo[1657]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 17:52:42.195162 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 17:52:42.208642 (kubelet)[1664]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 17:52:42.211876 sudo[1657]: pam_unix(sudo:session): session closed for user root Sep 4 17:52:42.305014 kubelet[1664]: E0904 17:52:42.304905 1664 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 17:52:42.310064 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 17:52:42.310243 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 17:52:42.429089 sshd[1651]: pam_unix(sshd:session): session closed for user core Sep 4 17:52:42.440030 systemd[1]: sshd@6-172.24.4.122:22-172.24.4.1:53912.service: Deactivated successfully. 
Sep 4 17:52:42.443369 systemd[1]: session-9.scope: Deactivated successfully. Sep 4 17:52:42.446876 systemd-logind[1429]: Session 9 logged out. Waiting for processes to exit. Sep 4 17:52:42.453544 systemd[1]: Started sshd@7-172.24.4.122:22-172.24.4.1:53918.service - OpenSSH per-connection server daemon (172.24.4.1:53918). Sep 4 17:52:42.456985 systemd-logind[1429]: Removed session 9. Sep 4 17:52:43.815418 sshd[1675]: Accepted publickey for core from 172.24.4.1 port 53918 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:52:43.818686 sshd[1675]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:52:43.831472 systemd-logind[1429]: New session 10 of user core. Sep 4 17:52:43.837148 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 4 17:52:44.315020 sudo[1679]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 4 17:52:44.315736 sudo[1679]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 17:52:44.324420 sudo[1679]: pam_unix(sudo:session): session closed for user root Sep 4 17:52:44.336189 sudo[1678]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Sep 4 17:52:44.337026 sudo[1678]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 17:52:44.365489 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Sep 4 17:52:44.371649 auditctl[1682]: No rules Sep 4 17:52:44.372379 systemd[1]: audit-rules.service: Deactivated successfully. Sep 4 17:52:44.372858 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Sep 4 17:52:44.384592 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Sep 4 17:52:44.446221 augenrules[1700]: No rules Sep 4 17:52:44.448256 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. 
Sep 4 17:52:44.450273 sudo[1678]: pam_unix(sudo:session): session closed for user root Sep 4 17:52:44.641258 sshd[1675]: pam_unix(sshd:session): session closed for user core Sep 4 17:52:44.653865 systemd[1]: sshd@7-172.24.4.122:22-172.24.4.1:53918.service: Deactivated successfully. Sep 4 17:52:44.657348 systemd[1]: session-10.scope: Deactivated successfully. Sep 4 17:52:44.661181 systemd-logind[1429]: Session 10 logged out. Waiting for processes to exit. Sep 4 17:52:44.673488 systemd[1]: Started sshd@8-172.24.4.122:22-172.24.4.1:56542.service - OpenSSH per-connection server daemon (172.24.4.1:56542). Sep 4 17:52:44.677029 systemd-logind[1429]: Removed session 10. Sep 4 17:52:45.732890 sshd[1708]: Accepted publickey for core from 172.24.4.1 port 56542 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:52:45.736259 sshd[1708]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:52:45.749417 systemd-logind[1429]: New session 11 of user core. Sep 4 17:52:45.765609 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 4 17:52:46.232413 sudo[1711]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 4 17:52:46.234031 sudo[1711]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 17:52:46.593569 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 4 17:52:46.612476 (dockerd)[1720]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 4 17:52:47.343240 dockerd[1720]: time="2024-09-04T17:52:47.342893776Z" level=info msg="Starting up" Sep 4 17:52:47.613087 dockerd[1720]: time="2024-09-04T17:52:47.612978370Z" level=info msg="Loading containers: start." 
Sep 4 17:52:47.853883 kernel: Initializing XFRM netlink socket Sep 4 17:52:48.015245 systemd-networkd[1348]: docker0: Link UP Sep 4 17:52:48.039900 dockerd[1720]: time="2024-09-04T17:52:48.039656425Z" level=info msg="Loading containers: done." Sep 4 17:52:48.085327 dockerd[1720]: time="2024-09-04T17:52:48.085096868Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 4 17:52:48.085792 dockerd[1720]: time="2024-09-04T17:52:48.085435764Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Sep 4 17:52:48.085792 dockerd[1720]: time="2024-09-04T17:52:48.085681455Z" level=info msg="Daemon has completed initialization" Sep 4 17:52:48.183310 dockerd[1720]: time="2024-09-04T17:52:48.182563467Z" level=info msg="API listen on /run/docker.sock" Sep 4 17:52:48.184023 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 4 17:52:49.961493 containerd[1449]: time="2024-09-04T17:52:49.961151926Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.28.13\"" Sep 4 17:52:50.726311 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1233897868.mount: Deactivated successfully. Sep 4 17:52:52.430357 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 4 17:52:52.437071 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 17:52:52.576993 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 4 17:52:52.581493 (kubelet)[1928]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 17:52:52.637726 kubelet[1928]: E0904 17:52:52.637206 1928 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 17:52:52.639937 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 17:52:52.640102 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 17:52:53.131077 containerd[1449]: time="2024-09-04T17:52:53.131007452Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.28.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:52:53.133003 containerd[1449]: time="2024-09-04T17:52:53.132618485Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.28.13: active requests=0, bytes read=34530743" Sep 4 17:52:53.133899 containerd[1449]: time="2024-09-04T17:52:53.133855087Z" level=info msg="ImageCreate event name:\"sha256:5447bb21fa283749e558782cbef636f1991732f1b8f345296a5204ccf0b5f7b7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:52:53.137839 containerd[1449]: time="2024-09-04T17:52:53.137758787Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:7d2c9256ad576a0b3745b749efe7f4fa8b276ec7ef448fc0f45794ca78eb8625\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:52:53.139222 containerd[1449]: time="2024-09-04T17:52:53.139191104Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.28.13\" with image id \"sha256:5447bb21fa283749e558782cbef636f1991732f1b8f345296a5204ccf0b5f7b7\", repo tag \"registry.k8s.io/kube-apiserver:v1.28.13\", repo digest 
\"registry.k8s.io/kube-apiserver@sha256:7d2c9256ad576a0b3745b749efe7f4fa8b276ec7ef448fc0f45794ca78eb8625\", size \"34527535\" in 3.177957646s" Sep 4 17:52:53.139343 containerd[1449]: time="2024-09-04T17:52:53.139321919Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.28.13\" returns image reference \"sha256:5447bb21fa283749e558782cbef636f1991732f1b8f345296a5204ccf0b5f7b7\"" Sep 4 17:52:53.169667 containerd[1449]: time="2024-09-04T17:52:53.169374213Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.28.13\"" Sep 4 17:52:57.191974 containerd[1449]: time="2024-09-04T17:52:57.191607534Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.28.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:52:57.197060 containerd[1449]: time="2024-09-04T17:52:57.196903674Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.28.13: active requests=0, bytes read=31849717" Sep 4 17:52:57.200178 containerd[1449]: time="2024-09-04T17:52:57.200057556Z" level=info msg="ImageCreate event name:\"sha256:f1a0a396058d414b391ade9dba6e95d7a71ee665b09fc0fc420126ac21c155a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:52:57.212062 containerd[1449]: time="2024-09-04T17:52:57.211941582Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:e7b44c1741fe1802d159ffdbd0d1f78d48a4185d7fb1cdf8a112fbb50696f7e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:52:57.214081 containerd[1449]: time="2024-09-04T17:52:57.214039348Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.28.13\" with image id \"sha256:f1a0a396058d414b391ade9dba6e95d7a71ee665b09fc0fc420126ac21c155a5\", repo tag \"registry.k8s.io/kube-controller-manager:v1.28.13\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:e7b44c1741fe1802d159ffdbd0d1f78d48a4185d7fb1cdf8a112fbb50696f7e1\", size \"33399655\" in 
4.044609901s" Sep 4 17:52:57.214362 containerd[1449]: time="2024-09-04T17:52:57.214205328Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.28.13\" returns image reference \"sha256:f1a0a396058d414b391ade9dba6e95d7a71ee665b09fc0fc420126ac21c155a5\"" Sep 4 17:52:57.260644 containerd[1449]: time="2024-09-04T17:52:57.260558472Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.28.13\"" Sep 4 17:52:59.241896 containerd[1449]: time="2024-09-04T17:52:59.240184898Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.28.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:52:59.243178 containerd[1449]: time="2024-09-04T17:52:59.242948491Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.28.13: active requests=0, bytes read=17097785" Sep 4 17:52:59.244891 containerd[1449]: time="2024-09-04T17:52:59.244772635Z" level=info msg="ImageCreate event name:\"sha256:a60f64c0f37d085a5fcafef1b2a7adc9be95184dae7d8a5d1dbf6ca4681d328a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:52:59.262895 containerd[1449]: time="2024-09-04T17:52:59.261917405Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:efeb791718f4b9c62bd683f5b403da520f3651cb36ad9f800e0f98b595beafa4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:52:59.266644 containerd[1449]: time="2024-09-04T17:52:59.266540728Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.28.13\" with image id \"sha256:a60f64c0f37d085a5fcafef1b2a7adc9be95184dae7d8a5d1dbf6ca4681d328a\", repo tag \"registry.k8s.io/kube-scheduler:v1.28.13\", repo digest \"registry.k8s.io/kube-scheduler@sha256:efeb791718f4b9c62bd683f5b403da520f3651cb36ad9f800e0f98b595beafa4\", size \"18647741\" in 2.005880707s" Sep 4 17:52:59.267075 containerd[1449]: time="2024-09-04T17:52:59.267009646Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.28.13\" returns image reference 
\"sha256:a60f64c0f37d085a5fcafef1b2a7adc9be95184dae7d8a5d1dbf6ca4681d328a\"" Sep 4 17:52:59.320479 containerd[1449]: time="2024-09-04T17:52:59.320324335Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.28.13\"" Sep 4 17:53:00.888946 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount996315045.mount: Deactivated successfully. Sep 4 17:53:01.750707 containerd[1449]: time="2024-09-04T17:53:01.748536280Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.28.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:53:01.750707 containerd[1449]: time="2024-09-04T17:53:01.750396201Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.28.13: active requests=0, bytes read=28303457" Sep 4 17:53:01.752484 containerd[1449]: time="2024-09-04T17:53:01.752403520Z" level=info msg="ImageCreate event name:\"sha256:31fde28e72a31599555ab5aba850caa90b9254b760b1007bfb662d086bb672fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:53:01.758104 containerd[1449]: time="2024-09-04T17:53:01.758049009Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:537633f399f87ce85d44fc8471ece97a83632198f99b3f7e08770beca95e9fa1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:53:01.759846 containerd[1449]: time="2024-09-04T17:53:01.759728403Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.28.13\" with image id \"sha256:31fde28e72a31599555ab5aba850caa90b9254b760b1007bfb662d086bb672fc\", repo tag \"registry.k8s.io/kube-proxy:v1.28.13\", repo digest \"registry.k8s.io/kube-proxy@sha256:537633f399f87ce85d44fc8471ece97a83632198f99b3f7e08770beca95e9fa1\", size \"28302468\" in 2.439325721s" Sep 4 17:53:01.759966 containerd[1449]: time="2024-09-04T17:53:01.759850552Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.28.13\" returns image reference \"sha256:31fde28e72a31599555ab5aba850caa90b9254b760b1007bfb662d086bb672fc\"" Sep 4 17:53:01.817112 
containerd[1449]: time="2024-09-04T17:53:01.817044988Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Sep 4 17:53:01.845237 update_engine[1430]: I0904 17:53:01.845151 1430 update_attempter.cc:509] Updating boot flags... Sep 4 17:53:01.893851 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 37 scanned by (udev-worker) (1978) Sep 4 17:53:01.954197 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 37 scanned by (udev-worker) (1980) Sep 4 17:53:02.438661 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2354028587.mount: Deactivated successfully. Sep 4 17:53:02.458235 containerd[1449]: time="2024-09-04T17:53:02.458109105Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:53:02.460141 containerd[1449]: time="2024-09-04T17:53:02.459977022Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322298" Sep 4 17:53:02.462468 containerd[1449]: time="2024-09-04T17:53:02.462385482Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:53:02.468919 containerd[1449]: time="2024-09-04T17:53:02.468730442Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:53:02.471748 containerd[1449]: time="2024-09-04T17:53:02.470883554Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 653.457192ms" Sep 4 17:53:02.471748 containerd[1449]: 
time="2024-09-04T17:53:02.470961770Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" Sep 4 17:53:02.518344 containerd[1449]: time="2024-09-04T17:53:02.518278474Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.10-0\"" Sep 4 17:53:02.679889 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Sep 4 17:53:02.693512 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 17:53:02.923773 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 17:53:02.928955 (kubelet)[2001]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 17:53:02.997304 kubelet[2001]: E0904 17:53:02.997171 2001 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 17:53:03.001101 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 17:53:03.001456 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 17:53:03.790831 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1944512423.mount: Deactivated successfully. 
Sep 4 17:53:06.920774 containerd[1449]: time="2024-09-04T17:53:06.920627806Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.10-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:53:06.922081 containerd[1449]: time="2024-09-04T17:53:06.922036794Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.10-0: active requests=0, bytes read=56651633" Sep 4 17:53:06.923078 containerd[1449]: time="2024-09-04T17:53:06.923006210Z" level=info msg="ImageCreate event name:\"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:53:06.926684 containerd[1449]: time="2024-09-04T17:53:06.926613767Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:22f892d7672adc0b9c86df67792afdb8b2dc08880f49f669eaaa59c47d7908c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:53:06.928162 containerd[1449]: time="2024-09-04T17:53:06.927998351Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.10-0\" with image id \"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\", repo tag \"registry.k8s.io/etcd:3.5.10-0\", repo digest \"registry.k8s.io/etcd@sha256:22f892d7672adc0b9c86df67792afdb8b2dc08880f49f669eaaa59c47d7908c2\", size \"56649232\" in 4.409663222s" Sep 4 17:53:06.928162 containerd[1449]: time="2024-09-04T17:53:06.928037705Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.10-0\" returns image reference \"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\"" Sep 4 17:53:06.957730 containerd[1449]: time="2024-09-04T17:53:06.957473971Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.10.1\"" Sep 4 17:53:07.688397 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3614537476.mount: Deactivated successfully. 
Sep 4 17:53:08.813677 containerd[1449]: time="2024-09-04T17:53:08.813440411Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 17:53:08.815660 containerd[1449]: time="2024-09-04T17:53:08.815538653Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.10.1: active requests=0, bytes read=16191757"
Sep 4 17:53:08.817274 containerd[1449]: time="2024-09-04T17:53:08.817169177Z" level=info msg="ImageCreate event name:\"sha256:ead0a4a53df89fd173874b46093b6e62d8c72967bbf606d672c9e8c9b601a4fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 17:53:08.822414 containerd[1449]: time="2024-09-04T17:53:08.822318152Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:a0ead06651cf580044aeb0a0feba63591858fb2e43ade8c9dea45a6a89ae7e5e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 17:53:08.824765 containerd[1449]: time="2024-09-04T17:53:08.824472459Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.10.1\" with image id \"sha256:ead0a4a53df89fd173874b46093b6e62d8c72967bbf606d672c9e8c9b601a4fc\", repo tag \"registry.k8s.io/coredns/coredns:v1.10.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:a0ead06651cf580044aeb0a0feba63591858fb2e43ade8c9dea45a6a89ae7e5e\", size \"16190758\" in 1.866940289s"
Sep 4 17:53:08.824765 containerd[1449]: time="2024-09-04T17:53:08.824563470Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.10.1\" returns image reference \"sha256:ead0a4a53df89fd173874b46093b6e62d8c72967bbf606d672c9e8c9b601a4fc\""
Sep 4 17:53:12.844614 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 17:53:12.862427 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 17:53:12.906941 systemd[1]: Reloading requested from client PID 2133 ('systemctl') (unit session-11.scope)...
Sep 4 17:53:12.906981 systemd[1]: Reloading...
Sep 4 17:53:13.016841 zram_generator::config[2168]: No configuration found.
Sep 4 17:53:13.168565 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 4 17:53:13.255478 systemd[1]: Reloading finished in 347 ms.
Sep 4 17:53:13.314981 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 4 17:53:13.315092 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 4 17:53:13.315492 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 17:53:13.320077 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 17:53:13.444098 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 17:53:13.444554 (kubelet)[2239]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 4 17:53:13.518034 kubelet[2239]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 4 17:53:13.518034 kubelet[2239]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 4 17:53:13.518034 kubelet[2239]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 4 17:53:13.518449 kubelet[2239]: I0904 17:53:13.518169 2239 server.go:203] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 4 17:53:14.579868 kubelet[2239]: I0904 17:53:14.579324 2239 server.go:467] "Kubelet version" kubeletVersion="v1.28.7"
Sep 4 17:53:14.579868 kubelet[2239]: I0904 17:53:14.579412 2239 server.go:469] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 4 17:53:14.581850 kubelet[2239]: I0904 17:53:14.581296 2239 server.go:895] "Client rotation is on, will bootstrap in background"
Sep 4 17:53:14.615422 kubelet[2239]: I0904 17:53:14.615098 2239 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 4 17:53:14.615422 kubelet[2239]: E0904 17:53:14.615377 2239 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://172.24.4.122:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:14.634684 kubelet[2239]: I0904 17:53:14.634355 2239 server.go:725] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 4 17:53:14.635984 kubelet[2239]: I0904 17:53:14.635591 2239 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 4 17:53:14.635984 kubelet[2239]: I0904 17:53:14.635946 2239 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Sep 4 17:53:14.636891 kubelet[2239]: I0904 17:53:14.636876 2239 topology_manager.go:138] "Creating topology manager with none policy"
Sep 4 17:53:14.637200 kubelet[2239]: I0904 17:53:14.637185 2239 container_manager_linux.go:301] "Creating device plugin manager"
Sep 4 17:53:14.638486 kubelet[2239]: I0904 17:53:14.638469 2239 state_mem.go:36] "Initialized new in-memory state store"
Sep 4 17:53:14.640409 kubelet[2239]: I0904 17:53:14.640395 2239 kubelet.go:393] "Attempting to sync node with API server"
Sep 4 17:53:14.640947 kubelet[2239]: I0904 17:53:14.640935 2239 kubelet.go:298] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 4 17:53:14.641045 kubelet[2239]: I0904 17:53:14.641035 2239 kubelet.go:309] "Adding apiserver pod source"
Sep 4 17:53:14.641110 kubelet[2239]: I0904 17:53:14.641102 2239 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 4 17:53:14.641355 kubelet[2239]: W0904 17:53:14.641234 2239 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: Get "https://172.24.4.122:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4054-1-0-c-33e05803e0.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:14.641403 kubelet[2239]: E0904 17:53:14.641382 2239 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.122:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4054-1-0-c-33e05803e0.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:14.642666 kubelet[2239]: W0904 17:53:14.642608 2239 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Service: Get "https://172.24.4.122:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:14.643462 kubelet[2239]: E0904 17:53:14.643051 2239 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.24.4.122:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:14.643462 kubelet[2239]: I0904 17:53:14.643160 2239 kuberuntime_manager.go:257] "Container runtime initialized" containerRuntime="containerd" version="v1.7.20" apiVersion="v1"
Sep 4 17:53:14.647837 kubelet[2239]: W0904 17:53:14.646717 2239 probe.go:268] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 4 17:53:14.647837 kubelet[2239]: I0904 17:53:14.647309 2239 server.go:1232] "Started kubelet"
Sep 4 17:53:14.649783 kubelet[2239]: I0904 17:53:14.649768 2239 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 4 17:53:14.655855 kubelet[2239]: I0904 17:53:14.655394 2239 server.go:162] "Starting to listen" address="0.0.0.0" port=10250
Sep 4 17:53:14.655855 kubelet[2239]: I0904 17:53:14.655756 2239 ratelimit.go:65] "Setting rate limiting for podresources endpoint" qps=100 burstTokens=10
Sep 4 17:53:14.656956 kubelet[2239]: I0904 17:53:14.656938 2239 server.go:462] "Adding debug handlers to kubelet server"
Sep 4 17:53:14.658789 kubelet[2239]: I0904 17:53:14.657365 2239 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 4 17:53:14.658789 kubelet[2239]: I0904 17:53:14.658171 2239 volume_manager.go:291] "Starting Kubelet Volume Manager"
Sep 4 17:53:14.660888 kubelet[2239]: I0904 17:53:14.660874 2239 desired_state_of_world_populator.go:151] "Desired state populator starts to run"
Sep 4 17:53:14.661060 kubelet[2239]: I0904 17:53:14.661024 2239 reconciler_new.go:29] "Reconciler: start to sync state"
Sep 4 17:53:14.668034 kubelet[2239]: E0904 17:53:14.668012 2239 controller.go:146] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.122:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4054-1-0-c-33e05803e0.novalocal?timeout=10s\": dial tcp 172.24.4.122:6443: connect: connection refused" interval="200ms"
Sep 4 17:53:14.668260 kubelet[2239]: E0904 17:53:14.668249 2239 cri_stats_provider.go:448] "Failed to get the info of the filesystem with mountpoint" err="unable to find data in memory cache" mountpoint="/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs"
Sep 4 17:53:14.668331 kubelet[2239]: E0904 17:53:14.668323 2239 kubelet.go:1431] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 4 17:53:14.669743 kubelet[2239]: E0904 17:53:14.669650 2239 event.go:289] Unable to write event: '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-4054-1-0-c-33e05803e0.novalocal.17f21c0040b1b070", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-4054-1-0-c-33e05803e0.novalocal", UID:"ci-4054-1-0-c-33e05803e0.novalocal", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"Starting", Message:"Starting kubelet.", Source:v1.EventSource{Component:"kubelet", Host:"ci-4054-1-0-c-33e05803e0.novalocal"}, FirstTimestamp:time.Date(2024, time.September, 4, 17, 53, 14, 647285872, time.Local), LastTimestamp:time.Date(2024, time.September, 4, 17, 53, 14, 647285872, time.Local), Count:1, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"kubelet", ReportingInstance:"ci-4054-1-0-c-33e05803e0.novalocal"}': 'Post "https://172.24.4.122:6443/api/v1/namespaces/default/events": dial tcp 172.24.4.122:6443: connect: connection refused'(may retry after sleeping)
Sep 4 17:53:14.671305 kubelet[2239]: W0904 17:53:14.671268 2239 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: Get "https://172.24.4.122:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:14.671413 kubelet[2239]: E0904 17:53:14.671395 2239 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.122:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:14.691073 kubelet[2239]: I0904 17:53:14.691024 2239 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 4 17:53:14.692676 kubelet[2239]: I0904 17:53:14.692652 2239 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 4 17:53:14.692745 kubelet[2239]: I0904 17:53:14.692694 2239 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 4 17:53:14.692745 kubelet[2239]: I0904 17:53:14.692724 2239 kubelet.go:2303] "Starting kubelet main sync loop"
Sep 4 17:53:14.692922 kubelet[2239]: E0904 17:53:14.692881 2239 kubelet.go:2327] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 4 17:53:14.695391 kubelet[2239]: W0904 17:53:14.695329 2239 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.RuntimeClass: Get "https://172.24.4.122:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:14.695466 kubelet[2239]: E0904 17:53:14.695388 2239 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.122:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:14.720301 kubelet[2239]: I0904 17:53:14.720279 2239 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 4 17:53:14.720463 kubelet[2239]: I0904 17:53:14.720454 2239 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 4 17:53:14.720527 kubelet[2239]: I0904 17:53:14.720520 2239 state_mem.go:36] "Initialized new in-memory state store"
Sep 4 17:53:14.727205 kubelet[2239]: I0904 17:53:14.726914 2239 policy_none.go:49] "None policy: Start"
Sep 4 17:53:14.727757 kubelet[2239]: I0904 17:53:14.727741 2239 memory_manager.go:169] "Starting memorymanager" policy="None"
Sep 4 17:53:14.728220 kubelet[2239]: I0904 17:53:14.727941 2239 state_mem.go:35] "Initializing new in-memory state store"
Sep 4 17:53:14.734907 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 4 17:53:14.753518 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 4 17:53:14.757004 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 4 17:53:14.763839 kubelet[2239]: I0904 17:53:14.763516 2239 kubelet_node_status.go:70] "Attempting to register node" node="ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:14.764134 kubelet[2239]: E0904 17:53:14.764113 2239 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://172.24.4.122:6443/api/v1/nodes\": dial tcp 172.24.4.122:6443: connect: connection refused" node="ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:14.764413 kubelet[2239]: I0904 17:53:14.764400 2239 manager.go:471] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 4 17:53:14.765717 kubelet[2239]: I0904 17:53:14.765641 2239 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 4 17:53:14.768463 kubelet[2239]: E0904 17:53:14.768445 2239 eviction_manager.go:258] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4054-1-0-c-33e05803e0.novalocal\" not found"
Sep 4 17:53:14.794046 kubelet[2239]: I0904 17:53:14.794011 2239 topology_manager.go:215] "Topology Admit Handler" podUID="67c547d24724cb3e44fa2e524235c7d6" podNamespace="kube-system" podName="kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:14.796316 kubelet[2239]: I0904 17:53:14.796201 2239 topology_manager.go:215] "Topology Admit Handler" podUID="795be409e734dd0f941de1e602667f21" podNamespace="kube-system" podName="kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:14.798379 kubelet[2239]: I0904 17:53:14.798200 2239 topology_manager.go:215] "Topology Admit Handler" podUID="8acb1eeacfc605f9c841a34a10a34551" podNamespace="kube-system" podName="kube-scheduler-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:14.809779 systemd[1]: Created slice kubepods-burstable-pod67c547d24724cb3e44fa2e524235c7d6.slice - libcontainer container kubepods-burstable-pod67c547d24724cb3e44fa2e524235c7d6.slice.
Sep 4 17:53:14.830743 systemd[1]: Created slice kubepods-burstable-pod795be409e734dd0f941de1e602667f21.slice - libcontainer container kubepods-burstable-pod795be409e734dd0f941de1e602667f21.slice.
Sep 4 17:53:14.840726 systemd[1]: Created slice kubepods-burstable-pod8acb1eeacfc605f9c841a34a10a34551.slice - libcontainer container kubepods-burstable-pod8acb1eeacfc605f9c841a34a10a34551.slice.
Sep 4 17:53:14.862449 kubelet[2239]: I0904 17:53:14.862381 2239 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/67c547d24724cb3e44fa2e524235c7d6-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"67c547d24724cb3e44fa2e524235c7d6\") " pod="kube-system/kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:14.862573 kubelet[2239]: I0904 17:53:14.862478 2239 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/795be409e734dd0f941de1e602667f21-k8s-certs\") pod \"kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"795be409e734dd0f941de1e602667f21\") " pod="kube-system/kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:14.862573 kubelet[2239]: I0904 17:53:14.862545 2239 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/795be409e734dd0f941de1e602667f21-kubeconfig\") pod \"kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"795be409e734dd0f941de1e602667f21\") " pod="kube-system/kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:14.862651 kubelet[2239]: I0904 17:53:14.862609 2239 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/795be409e734dd0f941de1e602667f21-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"795be409e734dd0f941de1e602667f21\") " pod="kube-system/kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:14.863021 kubelet[2239]: I0904 17:53:14.862694 2239 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/67c547d24724cb3e44fa2e524235c7d6-ca-certs\") pod \"kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"67c547d24724cb3e44fa2e524235c7d6\") " pod="kube-system/kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:14.863021 kubelet[2239]: I0904 17:53:14.862857 2239 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/67c547d24724cb3e44fa2e524235c7d6-k8s-certs\") pod \"kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"67c547d24724cb3e44fa2e524235c7d6\") " pod="kube-system/kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:14.863021 kubelet[2239]: I0904 17:53:14.862904 2239 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/795be409e734dd0f941de1e602667f21-ca-certs\") pod \"kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"795be409e734dd0f941de1e602667f21\") " pod="kube-system/kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:14.863021 kubelet[2239]: I0904 17:53:14.862936 2239 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/795be409e734dd0f941de1e602667f21-flexvolume-dir\") pod \"kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"795be409e734dd0f941de1e602667f21\") " pod="kube-system/kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:14.863021 kubelet[2239]: I0904 17:53:14.862964 2239 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8acb1eeacfc605f9c841a34a10a34551-kubeconfig\") pod \"kube-scheduler-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"8acb1eeacfc605f9c841a34a10a34551\") " pod="kube-system/kube-scheduler-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:14.869770 kubelet[2239]: E0904 17:53:14.869725 2239 controller.go:146] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.122:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4054-1-0-c-33e05803e0.novalocal?timeout=10s\": dial tcp 172.24.4.122:6443: connect: connection refused" interval="400ms"
Sep 4 17:53:14.967787 kubelet[2239]: I0904 17:53:14.967607 2239 kubelet_node_status.go:70] "Attempting to register node" node="ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:14.968520 kubelet[2239]: E0904 17:53:14.968442 2239 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://172.24.4.122:6443/api/v1/nodes\": dial tcp 172.24.4.122:6443: connect: connection refused" node="ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:15.130455 containerd[1449]: time="2024-09-04T17:53:15.128834018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal,Uid:67c547d24724cb3e44fa2e524235c7d6,Namespace:kube-system,Attempt:0,}"
Sep 4 17:53:15.152495 containerd[1449]: time="2024-09-04T17:53:15.152419771Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal,Uid:795be409e734dd0f941de1e602667f21,Namespace:kube-system,Attempt:0,}"
Sep 4 17:53:15.157518 containerd[1449]: time="2024-09-04T17:53:15.157416054Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4054-1-0-c-33e05803e0.novalocal,Uid:8acb1eeacfc605f9c841a34a10a34551,Namespace:kube-system,Attempt:0,}"
Sep 4 17:53:15.271058 kubelet[2239]: E0904 17:53:15.270916 2239 controller.go:146] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.122:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4054-1-0-c-33e05803e0.novalocal?timeout=10s\": dial tcp 172.24.4.122:6443: connect: connection refused" interval="800ms"
Sep 4 17:53:15.373309 kubelet[2239]: I0904 17:53:15.373194 2239 kubelet_node_status.go:70] "Attempting to register node" node="ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:15.375693 kubelet[2239]: E0904 17:53:15.375376 2239 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://172.24.4.122:6443/api/v1/nodes\": dial tcp 172.24.4.122:6443: connect: connection refused" node="ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:15.445895 kubelet[2239]: W0904 17:53:15.445641 2239 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: Get "https://172.24.4.122:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4054-1-0-c-33e05803e0.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:15.445895 kubelet[2239]: E0904 17:53:15.445772 2239 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.122:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4054-1-0-c-33e05803e0.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:15.792383 kubelet[2239]: W0904 17:53:15.790730 2239 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Service: Get "https://172.24.4.122:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:15.792383 kubelet[2239]: E0904 17:53:15.790904 2239 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.24.4.122:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:15.819697 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2689592552.mount: Deactivated successfully.
Sep 4 17:53:15.828909 containerd[1449]: time="2024-09-04T17:53:15.828781046Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 4 17:53:15.831869 containerd[1449]: time="2024-09-04T17:53:15.831741334Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 4 17:53:15.833399 containerd[1449]: time="2024-09-04T17:53:15.833036140Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Sep 4 17:53:15.834083 kubelet[2239]: W0904 17:53:15.833985 2239 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.RuntimeClass: Get "https://172.24.4.122:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:15.834408 containerd[1449]: time="2024-09-04T17:53:15.834261828Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 4 17:53:15.834864 kubelet[2239]: E0904 17:53:15.834365 2239 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.122:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:15.836606 containerd[1449]: time="2024-09-04T17:53:15.836531051Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064"
Sep 4 17:53:15.838852 containerd[1449]: time="2024-09-04T17:53:15.837576580Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 4 17:53:15.838852 containerd[1449]: time="2024-09-04T17:53:15.837749314Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Sep 4 17:53:15.846666 containerd[1449]: time="2024-09-04T17:53:15.846559726Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 4 17:53:15.851320 containerd[1449]: time="2024-09-04T17:53:15.851229628Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 698.36886ms"
Sep 4 17:53:15.856959 containerd[1449]: time="2024-09-04T17:53:15.856888914Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 699.214886ms"
Sep 4 17:53:15.880481 containerd[1449]: time="2024-09-04T17:53:15.880374820Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 750.200511ms"
Sep 4 17:53:15.952398 kubelet[2239]: W0904 17:53:15.952329 2239 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: Get "https://172.24.4.122:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:15.952633 kubelet[2239]: E0904 17:53:15.952609 2239 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.122:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:16.073440 kubelet[2239]: E0904 17:53:16.072352 2239 controller.go:146] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.122:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4054-1-0-c-33e05803e0.novalocal?timeout=10s\": dial tcp 172.24.4.122:6443: connect: connection refused" interval="1.6s"
Sep 4 17:53:16.181860 kubelet[2239]: I0904 17:53:16.181227 2239 kubelet_node_status.go:70] "Attempting to register node" node="ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:16.182445 kubelet[2239]: E0904 17:53:16.182417 2239 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://172.24.4.122:6443/api/v1/nodes\": dial tcp 172.24.4.122:6443: connect: connection refused" node="ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:16.228855 containerd[1449]: time="2024-09-04T17:53:16.227520940Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 4 17:53:16.228855 containerd[1449]: time="2024-09-04T17:53:16.227652276Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 4 17:53:16.228855 containerd[1449]: time="2024-09-04T17:53:16.227725804Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 4 17:53:16.230421 containerd[1449]: time="2024-09-04T17:53:16.229013777Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 4 17:53:16.234900 containerd[1449]: time="2024-09-04T17:53:16.234498878Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 4 17:53:16.234900 containerd[1449]: time="2024-09-04T17:53:16.234567036Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 4 17:53:16.234900 containerd[1449]: time="2024-09-04T17:53:16.234585861Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 4 17:53:16.234900 containerd[1449]: time="2024-09-04T17:53:16.234708170Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 4 17:53:16.252124 containerd[1449]: time="2024-09-04T17:53:16.251428408Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 4 17:53:16.252124 containerd[1449]: time="2024-09-04T17:53:16.251544304Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 4 17:53:16.252124 containerd[1449]: time="2024-09-04T17:53:16.251578859Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 4 17:53:16.252124 containerd[1449]: time="2024-09-04T17:53:16.251743799Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 4 17:53:16.274038 systemd[1]: Started cri-containerd-14570492204ef25437caa39eb5914e2dd56a0217da1e8e7130c9a705cd90c916.scope - libcontainer container 14570492204ef25437caa39eb5914e2dd56a0217da1e8e7130c9a705cd90c916.
Sep 4 17:53:16.275228 systemd[1]: Started cri-containerd-83422e20efa0c3e62a7424fcd86468bda315d19151b65b694aa8f5eec219fe8b.scope - libcontainer container 83422e20efa0c3e62a7424fcd86468bda315d19151b65b694aa8f5eec219fe8b.
Sep 4 17:53:16.290991 systemd[1]: Started cri-containerd-15de3ee1d2a408e447287e787a188a896816fb440a92d76862b7033e62a49e1e.scope - libcontainer container 15de3ee1d2a408e447287e787a188a896816fb440a92d76862b7033e62a49e1e.
Sep 4 17:53:16.348915 containerd[1449]: time="2024-09-04T17:53:16.348598115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4054-1-0-c-33e05803e0.novalocal,Uid:8acb1eeacfc605f9c841a34a10a34551,Namespace:kube-system,Attempt:0,} returns sandbox id \"14570492204ef25437caa39eb5914e2dd56a0217da1e8e7130c9a705cd90c916\""
Sep 4 17:53:16.360072 containerd[1449]: time="2024-09-04T17:53:16.359775234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal,Uid:795be409e734dd0f941de1e602667f21,Namespace:kube-system,Attempt:0,} returns sandbox id \"83422e20efa0c3e62a7424fcd86468bda315d19151b65b694aa8f5eec219fe8b\""
Sep 4 17:53:16.374053 containerd[1449]: time="2024-09-04T17:53:16.373990487Z" level=info msg="CreateContainer within sandbox \"83422e20efa0c3e62a7424fcd86468bda315d19151b65b694aa8f5eec219fe8b\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Sep 4 17:53:16.374723 containerd[1449]: time="2024-09-04T17:53:16.374513076Z" level=info msg="CreateContainer within sandbox \"14570492204ef25437caa39eb5914e2dd56a0217da1e8e7130c9a705cd90c916\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Sep 4 17:53:16.393205 containerd[1449]: time="2024-09-04T17:53:16.393166436Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal,Uid:67c547d24724cb3e44fa2e524235c7d6,Namespace:kube-system,Attempt:0,} returns sandbox id \"15de3ee1d2a408e447287e787a188a896816fb440a92d76862b7033e62a49e1e\""
Sep 4 17:53:16.397660 containerd[1449]: time="2024-09-04T17:53:16.397617168Z" level=info msg="CreateContainer within sandbox \"15de3ee1d2a408e447287e787a188a896816fb440a92d76862b7033e62a49e1e\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Sep 4 17:53:16.414661 containerd[1449]: time="2024-09-04T17:53:16.414585450Z" level=info msg="CreateContainer within sandbox \"83422e20efa0c3e62a7424fcd86468bda315d19151b65b694aa8f5eec219fe8b\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"d1cd504ca014599f32860a357894805ee69f4dec42e78f2004f713a59796cfd1\""
Sep 4 17:53:16.415886 containerd[1449]: time="2024-09-04T17:53:16.415783816Z" level=info msg="StartContainer for \"d1cd504ca014599f32860a357894805ee69f4dec42e78f2004f713a59796cfd1\""
Sep 4 17:53:16.421115 containerd[1449]: time="2024-09-04T17:53:16.421062781Z" level=info msg="CreateContainer within sandbox \"14570492204ef25437caa39eb5914e2dd56a0217da1e8e7130c9a705cd90c916\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"fdee20eb8ab98250dd757842d31a1ec88edc42f2e23e9c780b0365686a783936\""
Sep 4 17:53:16.421987 containerd[1449]: time="2024-09-04T17:53:16.421879942Z" level=info msg="StartContainer for \"fdee20eb8ab98250dd757842d31a1ec88edc42f2e23e9c780b0365686a783936\""
Sep 4 17:53:16.445622 containerd[1449]: time="2024-09-04T17:53:16.445564011Z" level=info msg="CreateContainer within sandbox \"15de3ee1d2a408e447287e787a188a896816fb440a92d76862b7033e62a49e1e\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"d0aceab04cd79b6dc58fb160208b95319e9ee46408474935b79a4dd5fe233006\""
Sep 4 17:53:16.447372 containerd[1449]: time="2024-09-04T17:53:16.447266883Z" level=info msg="StartContainer for \"d0aceab04cd79b6dc58fb160208b95319e9ee46408474935b79a4dd5fe233006\""
Sep 4 17:53:16.461121 systemd[1]: Started cri-containerd-fdee20eb8ab98250dd757842d31a1ec88edc42f2e23e9c780b0365686a783936.scope - libcontainer container fdee20eb8ab98250dd757842d31a1ec88edc42f2e23e9c780b0365686a783936.
Sep 4 17:53:16.467984 systemd[1]: Started cri-containerd-d1cd504ca014599f32860a357894805ee69f4dec42e78f2004f713a59796cfd1.scope - libcontainer container d1cd504ca014599f32860a357894805ee69f4dec42e78f2004f713a59796cfd1.
Sep 4 17:53:16.498188 systemd[1]: Started cri-containerd-d0aceab04cd79b6dc58fb160208b95319e9ee46408474935b79a4dd5fe233006.scope - libcontainer container d0aceab04cd79b6dc58fb160208b95319e9ee46408474935b79a4dd5fe233006.
Sep 4 17:53:16.545307 containerd[1449]: time="2024-09-04T17:53:16.544594577Z" level=info msg="StartContainer for \"fdee20eb8ab98250dd757842d31a1ec88edc42f2e23e9c780b0365686a783936\" returns successfully"
Sep 4 17:53:16.560852 containerd[1449]: time="2024-09-04T17:53:16.560756778Z" level=info msg="StartContainer for \"d1cd504ca014599f32860a357894805ee69f4dec42e78f2004f713a59796cfd1\" returns successfully"
Sep 4 17:53:16.579166 containerd[1449]: time="2024-09-04T17:53:16.578683637Z" level=info msg="StartContainer for \"d0aceab04cd79b6dc58fb160208b95319e9ee46408474935b79a4dd5fe233006\" returns successfully"
Sep 4 17:53:16.650668 kubelet[2239]: E0904 17:53:16.650026 2239 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://172.24.4.122:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 172.24.4.122:6443: connect: connection refused
Sep 4 17:53:17.784833 kubelet[2239]: I0904 17:53:17.784451 2239 kubelet_node_status.go:70] "Attempting to register node" node="ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:18.850222 kubelet[2239]: I0904 17:53:18.850137 2239 kubelet_node_status.go:73] "Successfully registered node" node="ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:19.036606 kubelet[2239]: E0904 17:53:19.036543 2239 kubelet.go:1890] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-4054-1-0-c-33e05803e0.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:19.644460 kubelet[2239]: I0904 17:53:19.644372 2239 apiserver.go:52] "Watching apiserver"
Sep 4 17:53:19.661875 kubelet[2239]: I0904 17:53:19.661706 2239 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world"
Sep 4 17:53:21.982689 kubelet[2239]: W0904 17:53:21.982478 2239 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 4 17:53:22.161549 systemd[1]: Reloading requested from client PID 2522 ('systemctl') (unit session-11.scope)...
Sep 4 17:53:22.161888 systemd[1]: Reloading...
Sep 4 17:53:22.291837 zram_generator::config[2556]: No configuration found.
Sep 4 17:53:22.524108 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 4 17:53:22.633867 systemd[1]: Reloading finished in 471 ms.
Sep 4 17:53:22.679885 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 17:53:22.680275 kubelet[2239]: I0904 17:53:22.680217 2239 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 4 17:53:22.697282 systemd[1]: kubelet.service: Deactivated successfully.
Sep 4 17:53:22.697531 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 17:53:22.697589 systemd[1]: kubelet.service: Consumed 1.710s CPU time, 108.6M memory peak, 0B memory swap peak.
Sep 4 17:53:22.705586 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 4 17:53:23.221139 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 4 17:53:23.229461 (kubelet)[2623]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 4 17:53:23.391759 kubelet[2623]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 4 17:53:23.393261 kubelet[2623]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 4 17:53:23.393261 kubelet[2623]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 4 17:53:23.393261 kubelet[2623]: I0904 17:53:23.392247 2623 server.go:203] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 4 17:53:23.399259 kubelet[2623]: I0904 17:53:23.399207 2623 server.go:467] "Kubelet version" kubeletVersion="v1.28.7"
Sep 4 17:53:23.399259 kubelet[2623]: I0904 17:53:23.399255 2623 server.go:469] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 4 17:53:23.400200 kubelet[2623]: I0904 17:53:23.399531 2623 server.go:895] "Client rotation is on, will bootstrap in background"
Sep 4 17:53:23.401809 kubelet[2623]: I0904 17:53:23.401219 2623 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Sep 4 17:53:23.408863 kubelet[2623]: I0904 17:53:23.408127 2623 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 4 17:53:23.415955 kubelet[2623]: I0904 17:53:23.415909 2623 server.go:725] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 4 17:53:23.416208 kubelet[2623]: I0904 17:53:23.416194 2623 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 4 17:53:23.416396 kubelet[2623]: I0904 17:53:23.416374 2623 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Sep 4 17:53:23.416496 kubelet[2623]: I0904 17:53:23.416408 2623 topology_manager.go:138] "Creating topology manager with none policy"
Sep 4 17:53:23.416496 kubelet[2623]: I0904 17:53:23.416420 2623 container_manager_linux.go:301] "Creating device plugin manager"
Sep 4 17:53:23.416496 kubelet[2623]: I0904 17:53:23.416457 2623 state_mem.go:36] "Initialized new in-memory state store"
Sep 4 17:53:23.416579 kubelet[2623]: I0904 17:53:23.416563 2623 kubelet.go:393] "Attempting to sync node with API server"
Sep 4 17:53:23.416620 kubelet[2623]: I0904 17:53:23.416588 2623 kubelet.go:298] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 4 17:53:23.416651 kubelet[2623]: I0904 17:53:23.416625 2623 kubelet.go:309] "Adding apiserver pod source"
Sep 4 17:53:23.416651 kubelet[2623]: I0904 17:53:23.416641 2623 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 4 17:53:23.426167 kubelet[2623]: I0904 17:53:23.425979 2623 kuberuntime_manager.go:257] "Container runtime initialized" containerRuntime="containerd" version="v1.7.20" apiVersion="v1"
Sep 4 17:53:23.428749 kubelet[2623]: I0904 17:53:23.428732 2623 server.go:1232] "Started kubelet"
Sep 4 17:53:23.430354 kubelet[2623]: I0904 17:53:23.430187 2623 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 4 17:53:23.442623 kubelet[2623]: I0904 17:53:23.442551 2623 server.go:162] "Starting to listen" address="0.0.0.0" port=10250
Sep 4 17:53:23.452441 kubelet[2623]: I0904 17:53:23.452144 2623 server.go:462] "Adding debug handlers to kubelet server"
Sep 4 17:53:23.454861 kubelet[2623]: E0904 17:53:23.454667 2623 cri_stats_provider.go:448] "Failed to get the info of the filesystem with mountpoint" err="unable to find data in memory cache" mountpoint="/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs"
Sep 4 17:53:23.454861 kubelet[2623]: E0904 17:53:23.454700 2623 kubelet.go:1431] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 4 17:53:23.456685 kubelet[2623]: I0904 17:53:23.443644 2623 ratelimit.go:65] "Setting rate limiting for podresources endpoint" qps=100 burstTokens=10
Sep 4 17:53:23.457546 kubelet[2623]: I0904 17:53:23.457534 2623 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 4 17:53:23.457759 kubelet[2623]: I0904 17:53:23.445325 2623 volume_manager.go:291] "Starting Kubelet Volume Manager"
Sep 4 17:53:23.457988 kubelet[2623]: I0904 17:53:23.445337 2623 desired_state_of_world_populator.go:151] "Desired state populator starts to run"
Sep 4 17:53:23.458507 kubelet[2623]: I0904 17:53:23.458426 2623 reconciler_new.go:29] "Reconciler: start to sync state"
Sep 4 17:53:23.476894 kubelet[2623]: I0904 17:53:23.476669 2623 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 4 17:53:23.477857 kubelet[2623]: I0904 17:53:23.477524 2623 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 4 17:53:23.477857 kubelet[2623]: I0904 17:53:23.477543 2623 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 4 17:53:23.477857 kubelet[2623]: I0904 17:53:23.477561 2623 kubelet.go:2303] "Starting kubelet main sync loop"
Sep 4 17:53:23.477857 kubelet[2623]: E0904 17:53:23.477611 2623 kubelet.go:2327] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 4 17:53:23.544002 kubelet[2623]: I0904 17:53:23.543946 2623 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 4 17:53:23.544002 kubelet[2623]: I0904 17:53:23.543974 2623 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 4 17:53:23.544002 kubelet[2623]: I0904 17:53:23.543992 2623 state_mem.go:36] "Initialized new in-memory state store"
Sep 4 17:53:23.544261 kubelet[2623]: I0904 17:53:23.544160 2623 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 4 17:53:23.544261 kubelet[2623]: I0904 17:53:23.544183 2623 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 4 17:53:23.544261 kubelet[2623]: I0904 17:53:23.544192 2623 policy_none.go:49] "None policy: Start"
Sep 4 17:53:23.547053 kubelet[2623]: I0904 17:53:23.545672 2623 memory_manager.go:169] "Starting memorymanager" policy="None"
Sep 4 17:53:23.547053 kubelet[2623]: I0904 17:53:23.545699 2623 state_mem.go:35] "Initializing new in-memory state store"
Sep 4 17:53:23.547053 kubelet[2623]: I0904 17:53:23.545920 2623 state_mem.go:75] "Updated machine memory state"
Sep 4 17:53:23.552264 kubelet[2623]: I0904 17:53:23.551785 2623 manager.go:471] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 4 17:53:23.552264 kubelet[2623]: I0904 17:53:23.552062 2623 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 4 17:53:23.558414 kubelet[2623]: I0904 17:53:23.557869 2623 kubelet_node_status.go:70] "Attempting to register node" node="ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:23.578348 kubelet[2623]: I0904 17:53:23.578312 2623 topology_manager.go:215] "Topology Admit Handler" podUID="67c547d24724cb3e44fa2e524235c7d6" podNamespace="kube-system" podName="kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:23.578665 kubelet[2623]: I0904 17:53:23.578651 2623 topology_manager.go:215] "Topology Admit Handler" podUID="795be409e734dd0f941de1e602667f21" podNamespace="kube-system" podName="kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:23.578832 kubelet[2623]: I0904 17:53:23.578821 2623 topology_manager.go:215] "Topology Admit Handler" podUID="8acb1eeacfc605f9c841a34a10a34551" podNamespace="kube-system" podName="kube-scheduler-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:23.583139 kubelet[2623]: I0904 17:53:23.583093 2623 kubelet_node_status.go:108] "Node was previously registered" node="ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:23.583255 kubelet[2623]: I0904 17:53:23.583191 2623 kubelet_node_status.go:73] "Successfully registered node" node="ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:23.592236 kubelet[2623]: W0904 17:53:23.592055 2623 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 4 17:53:23.592236 kubelet[2623]: E0904 17:53:23.592133 2623 kubelet.go:1890] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal\" already exists" pod="kube-system/kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:23.592415 kubelet[2623]: W0904 17:53:23.592271 2623 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 4 17:53:23.593662 kubelet[2623]: W0904 17:53:23.592640 2623 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 4 17:53:23.760038 kubelet[2623]: I0904 17:53:23.759404 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/67c547d24724cb3e44fa2e524235c7d6-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"67c547d24724cb3e44fa2e524235c7d6\") " pod="kube-system/kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:23.760038 kubelet[2623]: I0904 17:53:23.759538 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/795be409e734dd0f941de1e602667f21-ca-certs\") pod \"kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"795be409e734dd0f941de1e602667f21\") " pod="kube-system/kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:23.760038 kubelet[2623]: I0904 17:53:23.759601 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/795be409e734dd0f941de1e602667f21-kubeconfig\") pod \"kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"795be409e734dd0f941de1e602667f21\") " pod="kube-system/kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:23.760038 kubelet[2623]: I0904 17:53:23.759663 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/795be409e734dd0f941de1e602667f21-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"795be409e734dd0f941de1e602667f21\") " pod="kube-system/kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:23.760475 kubelet[2623]: I0904 17:53:23.759722 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/67c547d24724cb3e44fa2e524235c7d6-ca-certs\") pod \"kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"67c547d24724cb3e44fa2e524235c7d6\") " pod="kube-system/kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:23.760475 kubelet[2623]: I0904 17:53:23.759780 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/67c547d24724cb3e44fa2e524235c7d6-k8s-certs\") pod \"kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"67c547d24724cb3e44fa2e524235c7d6\") " pod="kube-system/kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:23.760475 kubelet[2623]: I0904 17:53:23.759942 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/795be409e734dd0f941de1e602667f21-flexvolume-dir\") pod \"kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"795be409e734dd0f941de1e602667f21\") " pod="kube-system/kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:23.761837 kubelet[2623]: I0904 17:53:23.761732 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/795be409e734dd0f941de1e602667f21-k8s-certs\") pod \"kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"795be409e734dd0f941de1e602667f21\") " pod="kube-system/kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:23.761999 kubelet[2623]: I0904 17:53:23.761963 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8acb1eeacfc605f9c841a34a10a34551-kubeconfig\") pod \"kube-scheduler-ci-4054-1-0-c-33e05803e0.novalocal\" (UID: \"8acb1eeacfc605f9c841a34a10a34551\") " pod="kube-system/kube-scheduler-ci-4054-1-0-c-33e05803e0.novalocal"
Sep 4 17:53:24.419642 kubelet[2623]: I0904 17:53:24.419547 2623 apiserver.go:52] "Watching apiserver"
Sep 4 17:53:24.458363 kubelet[2623]: I0904 17:53:24.458299 2623 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world"
Sep 4 17:53:24.535521 kubelet[2623]: I0904 17:53:24.534865 2623 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4054-1-0-c-33e05803e0.novalocal" podStartSLOduration=1.5325239320000001 podCreationTimestamp="2024-09-04 17:53:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-09-04 17:53:24.532326943 +0000 UTC m=+1.294972998" watchObservedRunningTime="2024-09-04 17:53:24.532523932 +0000 UTC m=+1.295169987"
Sep 4 17:53:24.555286 kubelet[2623]: I0904 17:53:24.554904 2623 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4054-1-0-c-33e05803e0.novalocal" podStartSLOduration=3.554856781 podCreationTimestamp="2024-09-04 17:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-09-04 17:53:24.553882735 +0000 UTC m=+1.316528810" watchObservedRunningTime="2024-09-04 17:53:24.554856781 +0000 UTC m=+1.317502836"
Sep 4 17:53:24.555286 kubelet[2623]: I0904 17:53:24.555005 2623 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4054-1-0-c-33e05803e0.novalocal" podStartSLOduration=1.554983488 podCreationTimestamp="2024-09-04 17:53:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-09-04 17:53:24.543341873 +0000 UTC m=+1.305987928" watchObservedRunningTime="2024-09-04 17:53:24.554983488 +0000 UTC m=+1.317629553"
Sep 4 17:53:28.847483 sudo[1711]: pam_unix(sudo:session): session closed for user root
Sep 4 17:53:29.100287 sshd[1708]: pam_unix(sshd:session): session closed for user core
Sep 4 17:53:29.107325 systemd[1]: sshd@8-172.24.4.122:22-172.24.4.1:56542.service: Deactivated successfully.
Sep 4 17:53:29.112525 systemd[1]: session-11.scope: Deactivated successfully.
Sep 4 17:53:29.113452 systemd[1]: session-11.scope: Consumed 7.514s CPU time, 135.1M memory peak, 0B memory swap peak.
Sep 4 17:53:29.117503 systemd-logind[1429]: Session 11 logged out. Waiting for processes to exit.
Sep 4 17:53:29.120466 systemd-logind[1429]: Removed session 11.
Sep 4 17:53:37.288587 kubelet[2623]: I0904 17:53:37.288416 2623 kuberuntime_manager.go:1528] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 4 17:53:37.291452 kubelet[2623]: I0904 17:53:37.290925 2623 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 4 17:53:37.291500 containerd[1449]: time="2024-09-04T17:53:37.288986367Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 4 17:53:38.064841 kubelet[2623]: I0904 17:53:38.064756 2623 topology_manager.go:215] "Topology Admit Handler" podUID="d8b38917-d8ad-4490-b7ef-153727f73f36" podNamespace="kube-system" podName="kube-proxy-c2x7z"
Sep 4 17:53:38.093134 systemd[1]: Created slice kubepods-besteffort-podd8b38917_d8ad_4490_b7ef_153727f73f36.slice - libcontainer container kubepods-besteffort-podd8b38917_d8ad_4490_b7ef_153727f73f36.slice.
Sep 4 17:53:38.155004 kubelet[2623]: I0904 17:53:38.154959 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/d8b38917-d8ad-4490-b7ef-153727f73f36-kube-proxy\") pod \"kube-proxy-c2x7z\" (UID: \"d8b38917-d8ad-4490-b7ef-153727f73f36\") " pod="kube-system/kube-proxy-c2x7z"
Sep 4 17:53:38.155004 kubelet[2623]: I0904 17:53:38.155015 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d8b38917-d8ad-4490-b7ef-153727f73f36-lib-modules\") pod \"kube-proxy-c2x7z\" (UID: \"d8b38917-d8ad-4490-b7ef-153727f73f36\") " pod="kube-system/kube-proxy-c2x7z"
Sep 4 17:53:38.155247 kubelet[2623]: I0904 17:53:38.155042 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d8b38917-d8ad-4490-b7ef-153727f73f36-xtables-lock\") pod \"kube-proxy-c2x7z\" (UID: \"d8b38917-d8ad-4490-b7ef-153727f73f36\") " pod="kube-system/kube-proxy-c2x7z"
Sep 4 17:53:38.155247 kubelet[2623]: I0904 17:53:38.155071 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b2kx\" (UniqueName: \"kubernetes.io/projected/d8b38917-d8ad-4490-b7ef-153727f73f36-kube-api-access-7b2kx\") pod \"kube-proxy-c2x7z\" (UID: \"d8b38917-d8ad-4490-b7ef-153727f73f36\") " pod="kube-system/kube-proxy-c2x7z"
Sep 4 17:53:38.346843 kubelet[2623]: I0904 17:53:38.346449 2623 topology_manager.go:215] "Topology Admit Handler" podUID="2bced832-42bd-41e8-a5c0-447880468bff" podNamespace="tigera-operator" podName="tigera-operator-5d56685c77-z5skp"
Sep 4 17:53:38.352680 kubelet[2623]: W0904 17:53:38.352184 2623 reflector.go:535] object-"tigera-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4054-1-0-c-33e05803e0.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4054-1-0-c-33e05803e0.novalocal' and this object
Sep 4 17:53:38.352680 kubelet[2623]: E0904 17:53:38.352223 2623 reflector.go:147] object-"tigera-operator"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4054-1-0-c-33e05803e0.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4054-1-0-c-33e05803e0.novalocal' and this object
Sep 4 17:53:38.352680 kubelet[2623]: W0904 17:53:38.352548 2623 reflector.go:535] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-4054-1-0-c-33e05803e0.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4054-1-0-c-33e05803e0.novalocal' and this object
Sep 4 17:53:38.353399 kubelet[2623]: E0904 17:53:38.352851 2623 reflector.go:147] object-"tigera-operator"/"kubernetes-services-endpoint": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-4054-1-0-c-33e05803e0.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4054-1-0-c-33e05803e0.novalocal' and this object
Sep 4 17:53:38.357944 kubelet[2623]: I0904 17:53:38.357360 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/2bced832-42bd-41e8-a5c0-447880468bff-var-lib-calico\") pod \"tigera-operator-5d56685c77-z5skp\" (UID: \"2bced832-42bd-41e8-a5c0-447880468bff\") " pod="tigera-operator/tigera-operator-5d56685c77-z5skp"
Sep 4 17:53:38.357944 kubelet[2623]: I0904 17:53:38.357407 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghqw8\" (UniqueName: \"kubernetes.io/projected/2bced832-42bd-41e8-a5c0-447880468bff-kube-api-access-ghqw8\") pod \"tigera-operator-5d56685c77-z5skp\" (UID: \"2bced832-42bd-41e8-a5c0-447880468bff\") " pod="tigera-operator/tigera-operator-5d56685c77-z5skp"
Sep 4 17:53:38.361306 systemd[1]: Created slice kubepods-besteffort-pod2bced832_42bd_41e8_a5c0_447880468bff.slice - libcontainer container kubepods-besteffort-pod2bced832_42bd_41e8_a5c0_447880468bff.slice.
Sep 4 17:53:38.405816 containerd[1449]: time="2024-09-04T17:53:38.405693501Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-c2x7z,Uid:d8b38917-d8ad-4490-b7ef-153727f73f36,Namespace:kube-system,Attempt:0,}"
Sep 4 17:53:38.442200 containerd[1449]: time="2024-09-04T17:53:38.441672469Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 4 17:53:38.442200 containerd[1449]: time="2024-09-04T17:53:38.441781634Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 4 17:53:38.442200 containerd[1449]: time="2024-09-04T17:53:38.441841576Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 4 17:53:38.442200 containerd[1449]: time="2024-09-04T17:53:38.442009671Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 4 17:53:38.462206 systemd[1]: run-containerd-runc-k8s.io-7ddfa0fb7f2f93cc3437dc3bea6fc685bb37abe28551eb8c8f8f486a312a43cc-runc.AKIGXe.mount: Deactivated successfully.
Sep 4 17:53:38.471067 systemd[1]: Started cri-containerd-7ddfa0fb7f2f93cc3437dc3bea6fc685bb37abe28551eb8c8f8f486a312a43cc.scope - libcontainer container 7ddfa0fb7f2f93cc3437dc3bea6fc685bb37abe28551eb8c8f8f486a312a43cc.
Sep 4 17:53:38.500028 containerd[1449]: time="2024-09-04T17:53:38.499985627Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-c2x7z,Uid:d8b38917-d8ad-4490-b7ef-153727f73f36,Namespace:kube-system,Attempt:0,} returns sandbox id \"7ddfa0fb7f2f93cc3437dc3bea6fc685bb37abe28551eb8c8f8f486a312a43cc\""
Sep 4 17:53:38.504997 containerd[1449]: time="2024-09-04T17:53:38.504762975Z" level=info msg="CreateContainer within sandbox \"7ddfa0fb7f2f93cc3437dc3bea6fc685bb37abe28551eb8c8f8f486a312a43cc\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 4 17:53:38.551885 containerd[1449]: time="2024-09-04T17:53:38.551274574Z" level=info msg="CreateContainer within sandbox \"7ddfa0fb7f2f93cc3437dc3bea6fc685bb37abe28551eb8c8f8f486a312a43cc\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"d729dc96f53b39fa067051e20196d937c926e2d5e3e8896b10dded7f92d29b04\""
Sep 4 17:53:38.554684 containerd[1449]: time="2024-09-04T17:53:38.554307501Z" level=info msg="StartContainer for \"d729dc96f53b39fa067051e20196d937c926e2d5e3e8896b10dded7f92d29b04\""
Sep 4 17:53:38.607118 systemd[1]: Started cri-containerd-d729dc96f53b39fa067051e20196d937c926e2d5e3e8896b10dded7f92d29b04.scope - libcontainer container d729dc96f53b39fa067051e20196d937c926e2d5e3e8896b10dded7f92d29b04.
Sep 4 17:53:38.666421 containerd[1449]: time="2024-09-04T17:53:38.666369098Z" level=info msg="StartContainer for \"d729dc96f53b39fa067051e20196d937c926e2d5e3e8896b10dded7f92d29b04\" returns successfully"
Sep 4 17:53:39.268486 containerd[1449]: time="2024-09-04T17:53:39.268422038Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5d56685c77-z5skp,Uid:2bced832-42bd-41e8-a5c0-447880468bff,Namespace:tigera-operator,Attempt:0,}"
Sep 4 17:53:39.335949 containerd[1449]: time="2024-09-04T17:53:39.335310618Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 4 17:53:39.335949 containerd[1449]: time="2024-09-04T17:53:39.335412198Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 4 17:53:39.335949 containerd[1449]: time="2024-09-04T17:53:39.335446492Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 4 17:53:39.335949 containerd[1449]: time="2024-09-04T17:53:39.335606973Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 4 17:53:39.366043 systemd[1]: Started cri-containerd-559025fe75eda613d212ac0dc01ef4b3a12e01b62a8d81521a7662e5a2114d12.scope - libcontainer container 559025fe75eda613d212ac0dc01ef4b3a12e01b62a8d81521a7662e5a2114d12.
Sep 4 17:53:39.423011 containerd[1449]: time="2024-09-04T17:53:39.422952435Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5d56685c77-z5skp,Uid:2bced832-42bd-41e8-a5c0-447880468bff,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"559025fe75eda613d212ac0dc01ef4b3a12e01b62a8d81521a7662e5a2114d12\""
Sep 4 17:53:39.429222 containerd[1449]: time="2024-09-04T17:53:39.428835877Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.34.3\""
Sep 4 17:53:41.119013 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2234411454.mount: Deactivated successfully.
Sep 4 17:53:41.965878 containerd[1449]: time="2024-09-04T17:53:41.965783260Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 17:53:41.967373 containerd[1449]: time="2024-09-04T17:53:41.967273764Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.34.3: active requests=0, bytes read=22136525"
Sep 4 17:53:41.968868 containerd[1449]: time="2024-09-04T17:53:41.968597817Z" level=info msg="ImageCreate event name:\"sha256:d4e6e064c25d51e66b2470e80d7b57004f79e2a76b37e83986577f8666da9736\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 17:53:41.971461 containerd[1449]: time="2024-09-04T17:53:41.971429958Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:2cc4de6ad019ccc3abbd2615c159d0dcfb2ecdab90dc5805f08837d7c014d458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 17:53:41.972168 containerd[1449]: time="2024-09-04T17:53:41.972135281Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.34.3\" with image id \"sha256:d4e6e064c25d51e66b2470e80d7b57004f79e2a76b37e83986577f8666da9736\", repo tag \"quay.io/tigera/operator:v1.34.3\", repo digest \"quay.io/tigera/operator@sha256:2cc4de6ad019ccc3abbd2615c159d0dcfb2ecdab90dc5805f08837d7c014d458\", size \"22130728\" in 2.543264398s"
Sep 4 17:53:41.972221 containerd[1449]: time="2024-09-04T17:53:41.972167611Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.34.3\" returns image reference \"sha256:d4e6e064c25d51e66b2470e80d7b57004f79e2a76b37e83986577f8666da9736\""
Sep 4 17:53:41.974616 containerd[1449]: time="2024-09-04T17:53:41.974591136Z" level=info msg="CreateContainer within sandbox \"559025fe75eda613d212ac0dc01ef4b3a12e01b62a8d81521a7662e5a2114d12\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 4 17:53:41.991907 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1667113050.mount: Deactivated successfully.
Sep 4 17:53:41.999238 containerd[1449]: time="2024-09-04T17:53:41.999111698Z" level=info msg="CreateContainer within sandbox \"559025fe75eda613d212ac0dc01ef4b3a12e01b62a8d81521a7662e5a2114d12\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"a5b94a03475876b7972a3810d933fb37d1ec7f594dd25807c295352cbb33d085\""
Sep 4 17:53:42.000894 containerd[1449]: time="2024-09-04T17:53:41.999658363Z" level=info msg="StartContainer for \"a5b94a03475876b7972a3810d933fb37d1ec7f594dd25807c295352cbb33d085\""
Sep 4 17:53:42.035010 systemd[1]: Started cri-containerd-a5b94a03475876b7972a3810d933fb37d1ec7f594dd25807c295352cbb33d085.scope - libcontainer container a5b94a03475876b7972a3810d933fb37d1ec7f594dd25807c295352cbb33d085.
Sep 4 17:53:42.069341 containerd[1449]: time="2024-09-04T17:53:42.069307263Z" level=info msg="StartContainer for \"a5b94a03475876b7972a3810d933fb37d1ec7f594dd25807c295352cbb33d085\" returns successfully"
Sep 4 17:53:42.599369 kubelet[2623]: I0904 17:53:42.599251 2623 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-proxy-c2x7z" podStartSLOduration=4.596773209 podCreationTimestamp="2024-09-04 17:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-09-04 17:53:39.580939363 +0000 UTC m=+16.343585458" watchObservedRunningTime="2024-09-04 17:53:42.596773209 +0000 UTC m=+19.359419305"
Sep 4 17:53:42.601209 kubelet[2623]: I0904 17:53:42.600541 2623 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5d56685c77-z5skp" podStartSLOduration=2.054418548 podCreationTimestamp="2024-09-04 17:53:38 +0000 UTC" firstStartedPulling="2024-09-04 17:53:39.42710916 +0000 UTC m=+16.189755205" lastFinishedPulling="2024-09-04 17:53:41.973092616 +0000 UTC m=+18.735738671" observedRunningTime="2024-09-04 17:53:42.596466284 +0000 UTC m=+19.359112389" watchObservedRunningTime="2024-09-04 17:53:42.600402014 +0000 UTC m=+19.363048170"
Sep 4 17:53:45.605461 kubelet[2623]: I0904 17:53:45.603944 2623 topology_manager.go:215] "Topology Admit Handler" podUID="2b729b5f-ebd2-4eea-be0d-5f0bd3f4cbe8" podNamespace="calico-system" podName="calico-typha-7cf89f49dc-pqj8f"
Sep 4 17:53:45.613074 systemd[1]: Created slice kubepods-besteffort-pod2b729b5f_ebd2_4eea_be0d_5f0bd3f4cbe8.slice - libcontainer container kubepods-besteffort-pod2b729b5f_ebd2_4eea_be0d_5f0bd3f4cbe8.slice.
Sep 4 17:53:45.688593 kubelet[2623]: I0904 17:53:45.688561 2623 topology_manager.go:215] "Topology Admit Handler" podUID="e4ba0508-e8ea-451a-83dc-7955cfbe4d70" podNamespace="calico-system" podName="calico-node-88pjh"
Sep 4 17:53:45.698996 systemd[1]: Created slice kubepods-besteffort-pode4ba0508_e8ea_451a_83dc_7955cfbe4d70.slice - libcontainer container kubepods-besteffort-pode4ba0508_e8ea_451a_83dc_7955cfbe4d70.slice.
Sep 4 17:53:45.709206 kubelet[2623]: I0904 17:53:45.709163 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4ba0508-e8ea-451a-83dc-7955cfbe4d70-lib-modules\") pod \"calico-node-88pjh\" (UID: \"e4ba0508-e8ea-451a-83dc-7955cfbe4d70\") " pod="calico-system/calico-node-88pjh"
Sep 4 17:53:45.709206 kubelet[2623]: I0904 17:53:45.709208 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4ba0508-e8ea-451a-83dc-7955cfbe4d70-tigera-ca-bundle\") pod \"calico-node-88pjh\" (UID: \"e4ba0508-e8ea-451a-83dc-7955cfbe4d70\") " pod="calico-system/calico-node-88pjh"
Sep 4 17:53:45.709364 kubelet[2623]: I0904 17:53:45.709234 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e4ba0508-e8ea-451a-83dc-7955cfbe4d70-var-lib-calico\") pod \"calico-node-88pjh\" (UID: \"e4ba0508-e8ea-451a-83dc-7955cfbe4d70\") " pod="calico-system/calico-node-88pjh"
Sep 4 17:53:45.709364 kubelet[2623]: I0904 17:53:45.709262 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/e4ba0508-e8ea-451a-83dc-7955cfbe4d70-var-run-calico\") pod \"calico-node-88pjh\" (UID: \"e4ba0508-e8ea-451a-83dc-7955cfbe4d70\") " pod="calico-system/calico-node-88pjh"
Sep 4 17:53:45.709364 kubelet[2623]: I0904 17:53:45.709284 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e4ba0508-e8ea-451a-83dc-7955cfbe4d70-xtables-lock\") pod \"calico-node-88pjh\" (UID: \"e4ba0508-e8ea-451a-83dc-7955cfbe4d70\") " pod="calico-system/calico-node-88pjh"
Sep 4 17:53:45.709364 kubelet[2623]: I0904 17:53:45.709306 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/e4ba0508-e8ea-451a-83dc-7955cfbe4d70-cni-bin-dir\") pod \"calico-node-88pjh\" (UID: \"e4ba0508-e8ea-451a-83dc-7955cfbe4d70\") " pod="calico-system/calico-node-88pjh"
Sep 4 17:53:45.709364 kubelet[2623]: I0904 17:53:45.709330 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/e4ba0508-e8ea-451a-83dc-7955cfbe4d70-node-certs\") pod \"calico-node-88pjh\" (UID: \"e4ba0508-e8ea-451a-83dc-7955cfbe4d70\") " pod="calico-system/calico-node-88pjh"
Sep 4 17:53:45.709529 kubelet[2623]: I0904 17:53:45.709359 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b729b5f-ebd2-4eea-be0d-5f0bd3f4cbe8-tigera-ca-bundle\") pod \"calico-typha-7cf89f49dc-pqj8f\" (UID: \"2b729b5f-ebd2-4eea-be0d-5f0bd3f4cbe8\") " pod="calico-system/calico-typha-7cf89f49dc-pqj8f"
Sep 4 17:53:45.709529 kubelet[2623]: I0904 17:53:45.709384 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p88r\" (UniqueName: \"kubernetes.io/projected/e4ba0508-e8ea-451a-83dc-7955cfbe4d70-kube-api-access-2p88r\") pod \"calico-node-88pjh\" (UID: \"e4ba0508-e8ea-451a-83dc-7955cfbe4d70\") " pod="calico-system/calico-node-88pjh"
Sep 4 17:53:45.709529 kubelet[2623]: I0904 17:53:45.709406 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/2b729b5f-ebd2-4eea-be0d-5f0bd3f4cbe8-typha-certs\") pod \"calico-typha-7cf89f49dc-pqj8f\" (UID: \"2b729b5f-ebd2-4eea-be0d-5f0bd3f4cbe8\") " pod="calico-system/calico-typha-7cf89f49dc-pqj8f"
Sep 4 17:53:45.709529 kubelet[2623]: I0904 17:53:45.709430 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/e4ba0508-e8ea-451a-83dc-7955cfbe4d70-policysync\") pod \"calico-node-88pjh\" (UID: \"e4ba0508-e8ea-451a-83dc-7955cfbe4d70\") " pod="calico-system/calico-node-88pjh"
Sep 4 17:53:45.709529 kubelet[2623]: I0904 17:53:45.709453 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/e4ba0508-e8ea-451a-83dc-7955cfbe4d70-flexvol-driver-host\") pod \"calico-node-88pjh\" (UID: \"e4ba0508-e8ea-451a-83dc-7955cfbe4d70\") " pod="calico-system/calico-node-88pjh"
Sep 4 17:53:45.709693 kubelet[2623]: I0904 17:53:45.709477 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flqbc\" (UniqueName: \"kubernetes.io/projected/2b729b5f-ebd2-4eea-be0d-5f0bd3f4cbe8-kube-api-access-flqbc\") pod \"calico-typha-7cf89f49dc-pqj8f\" (UID: \"2b729b5f-ebd2-4eea-be0d-5f0bd3f4cbe8\") " pod="calico-system/calico-typha-7cf89f49dc-pqj8f"
Sep 4 17:53:45.709693 kubelet[2623]: I0904 17:53:45.709499 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/e4ba0508-e8ea-451a-83dc-7955cfbe4d70-cni-net-dir\") pod \"calico-node-88pjh\" (UID: \"e4ba0508-e8ea-451a-83dc-7955cfbe4d70\") " pod="calico-system/calico-node-88pjh"
Sep 4 17:53:45.709693 kubelet[2623]: I0904 17:53:45.709522 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/e4ba0508-e8ea-451a-83dc-7955cfbe4d70-cni-log-dir\") pod \"calico-node-88pjh\" (UID: \"e4ba0508-e8ea-451a-83dc-7955cfbe4d70\") " pod="calico-system/calico-node-88pjh"
Sep 4 17:53:45.809543 kubelet[2623]: I0904 17:53:45.809464 2623 topology_manager.go:215] "Topology Admit Handler" podUID="5658969e-2b8a-4734-8694-fff3696c8a14" podNamespace="calico-system" podName="csi-node-driver-w4vzr"
Sep 4 17:53:45.810768 kubelet[2623]: E0904 17:53:45.810036 2623 pod_workers.go:1300] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w4vzr" podUID="5658969e-2b8a-4734-8694-fff3696c8a14"
Sep 4 17:53:45.815409 kubelet[2623]: E0904 17:53:45.815328 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.815731 kubelet[2623]: W0904 17:53:45.815649 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.818371 kubelet[2623]: E0904 17:53:45.818091 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.822203 kubelet[2623]: E0904 17:53:45.822179 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.826170 kubelet[2623]: W0904 17:53:45.825878 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.827085 kubelet[2623]: E0904 17:53:45.827051 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.828279 kubelet[2623]: E0904 17:53:45.828204 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.830020 kubelet[2623]: W0904 17:53:45.829868 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.830116 kubelet[2623]: E0904 17:53:45.830053 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.833551 kubelet[2623]: E0904 17:53:45.833176 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.833551 kubelet[2623]: W0904 17:53:45.833201 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.835948 kubelet[2623]: E0904 17:53:45.834876 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.840274 kubelet[2623]: E0904 17:53:45.840145 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.840274 kubelet[2623]: W0904 17:53:45.840160 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.842657 kubelet[2623]: E0904 17:53:45.842600 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.847176 kubelet[2623]: E0904 17:53:45.847070 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.847532 kubelet[2623]: W0904 17:53:45.847370 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.847532 kubelet[2623]: E0904 17:53:45.847471 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.848612 kubelet[2623]: E0904 17:53:45.848051 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.848612 kubelet[2623]: W0904 17:53:45.848062 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.848929 kubelet[2623]: E0904 17:53:45.848835 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.849180 kubelet[2623]: E0904 17:53:45.849038 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.849180 kubelet[2623]: W0904 17:53:45.849049 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.849823 kubelet[2623]: E0904 17:53:45.849368 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.849823 kubelet[2623]: E0904 17:53:45.849692 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.849823 kubelet[2623]: W0904 17:53:45.849762 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.850579 kubelet[2623]: E0904 17:53:45.850266 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.850579 kubelet[2623]: E0904 17:53:45.850471 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.850579 kubelet[2623]: W0904 17:53:45.850481 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.851237 kubelet[2623]: E0904 17:53:45.850860 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.851611 kubelet[2623]: E0904 17:53:45.851600 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.851860 kubelet[2623]: W0904 17:53:45.851847 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.852218 kubelet[2623]: E0904 17:53:45.852208 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.852448 kubelet[2623]: W0904 17:53:45.852317 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.852814 kubelet[2623]: E0904 17:53:45.852780 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.852985 kubelet[2623]: W0904 17:53:45.852872 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.853469 kubelet[2623]: E0904 17:53:45.853024 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.853469 kubelet[2623]: E0904 17:53:45.853077 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.853469 kubelet[2623]: E0904 17:53:45.853099 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.853727 kubelet[2623]: E0904 17:53:45.853630 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.853727 kubelet[2623]: W0904 17:53:45.853642 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.854088 kubelet[2623]: E0904 17:53:45.854055 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.854935 kubelet[2623]: E0904 17:53:45.854565 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.854935 kubelet[2623]: W0904 17:53:45.854691 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.855146 kubelet[2623]: E0904 17:53:45.855132 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.855912 kubelet[2623]: E0904 17:53:45.855415 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.855912 kubelet[2623]: W0904 17:53:45.855426 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.855912 kubelet[2623]: E0904 17:53:45.855468 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.856218 kubelet[2623]: E0904 17:53:45.856117 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.856218 kubelet[2623]: W0904 17:53:45.856127 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.857512 kubelet[2623]: E0904 17:53:45.857487 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.857781 kubelet[2623]: E0904 17:53:45.857692 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.857781 kubelet[2623]: W0904 17:53:45.857704 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.859005 kubelet[2623]: E0904 17:53:45.858933 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.859240 kubelet[2623]: E0904 17:53:45.859186 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.859240 kubelet[2623]: W0904 17:53:45.859197 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.859315 kubelet[2623]: E0904 17:53:45.859250 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.860056 kubelet[2623]: E0904 17:53:45.859872 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.860056 kubelet[2623]: W0904 17:53:45.859889 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.860056 kubelet[2623]: E0904 17:53:45.860008 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.860401 kubelet[2623]: E0904 17:53:45.860325 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.860401 kubelet[2623]: W0904 17:53:45.860341 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.860401 kubelet[2623]: E0904 17:53:45.860392 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.861045 kubelet[2623]: E0904 17:53:45.860841 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.861045 kubelet[2623]: W0904 17:53:45.860853 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.861045 kubelet[2623]: E0904 17:53:45.860899 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.861971 kubelet[2623]: E0904 17:53:45.861863 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.861971 kubelet[2623]: W0904 17:53:45.861874 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.862338 kubelet[2623]: E0904 17:53:45.862159 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.862338 kubelet[2623]: E0904 17:53:45.862207 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.862338 kubelet[2623]: W0904 17:53:45.862258 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.862338 kubelet[2623]: E0904 17:53:45.862317 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.863775 kubelet[2623]: E0904 17:53:45.863761 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.864013 kubelet[2623]: W0904 17:53:45.863858 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.864328 kubelet[2623]: E0904 17:53:45.864157 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.864837 kubelet[2623]: E0904 17:53:45.864516 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.864837 kubelet[2623]: W0904 17:53:45.864527 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.864837 kubelet[2623]: E0904 17:53:45.864566 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.865121 kubelet[2623]: E0904 17:53:45.865069 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.865121 kubelet[2623]: W0904 17:53:45.865082 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.865121 kubelet[2623]: E0904 17:53:45.865097 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.866762 kubelet[2623]: E0904 17:53:45.866736 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.866762 kubelet[2623]: W0904 17:53:45.866753 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.866890 kubelet[2623]: E0904 17:53:45.866770 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.878163 kubelet[2623]: E0904 17:53:45.878132 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.878163 kubelet[2623]: W0904 17:53:45.878153 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.878311 kubelet[2623]: E0904 17:53:45.878175 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:45.891956 kubelet[2623]: E0904 17:53:45.891877 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:45.891956 kubelet[2623]: W0904 17:53:45.891896 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:45.891956 kubelet[2623]: E0904 17:53:45.891918 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 4 17:53:45.905322 kubelet[2623]: E0904 17:53:45.905291 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.905322 kubelet[2623]: W0904 17:53:45.905311 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.905322 kubelet[2623]: E0904 17:53:45.905332 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.905620 kubelet[2623]: E0904 17:53:45.905601 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.905620 kubelet[2623]: W0904 17:53:45.905615 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.905702 kubelet[2623]: E0904 17:53:45.905629 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.905869 kubelet[2623]: E0904 17:53:45.905847 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.905869 kubelet[2623]: W0904 17:53:45.905861 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.906071 kubelet[2623]: E0904 17:53:45.905874 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.906109 kubelet[2623]: E0904 17:53:45.906085 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.906109 kubelet[2623]: W0904 17:53:45.906094 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.906109 kubelet[2623]: E0904 17:53:45.906107 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.906313 kubelet[2623]: E0904 17:53:45.906287 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.906313 kubelet[2623]: W0904 17:53:45.906295 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.906313 kubelet[2623]: E0904 17:53:45.906307 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.906548 kubelet[2623]: E0904 17:53:45.906503 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.906548 kubelet[2623]: W0904 17:53:45.906513 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.906548 kubelet[2623]: E0904 17:53:45.906542 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.906762 kubelet[2623]: E0904 17:53:45.906748 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.906762 kubelet[2623]: W0904 17:53:45.906757 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.906846 kubelet[2623]: E0904 17:53:45.906773 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.907060 kubelet[2623]: E0904 17:53:45.907019 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.907060 kubelet[2623]: W0904 17:53:45.907034 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.907060 kubelet[2623]: E0904 17:53:45.907047 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.907343 kubelet[2623]: E0904 17:53:45.907319 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.907343 kubelet[2623]: W0904 17:53:45.907328 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.907410 kubelet[2623]: E0904 17:53:45.907365 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.907577 kubelet[2623]: E0904 17:53:45.907548 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.907577 kubelet[2623]: W0904 17:53:45.907562 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.907577 kubelet[2623]: E0904 17:53:45.907575 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.907777 kubelet[2623]: E0904 17:53:45.907759 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.907866 kubelet[2623]: W0904 17:53:45.907773 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.907866 kubelet[2623]: E0904 17:53:45.907849 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.908073 kubelet[2623]: E0904 17:53:45.908039 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.908073 kubelet[2623]: W0904 17:53:45.908053 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.908073 kubelet[2623]: E0904 17:53:45.908066 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.908902 kubelet[2623]: E0904 17:53:45.908884 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.908902 kubelet[2623]: W0904 17:53:45.908897 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.908998 kubelet[2623]: E0904 17:53:45.908911 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.909080 kubelet[2623]: E0904 17:53:45.909063 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.909080 kubelet[2623]: W0904 17:53:45.909075 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.909162 kubelet[2623]: E0904 17:53:45.909087 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.909235 kubelet[2623]: E0904 17:53:45.909218 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.909235 kubelet[2623]: W0904 17:53:45.909230 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.909325 kubelet[2623]: E0904 17:53:45.909243 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.909398 kubelet[2623]: E0904 17:53:45.909381 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.909398 kubelet[2623]: W0904 17:53:45.909394 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.909481 kubelet[2623]: E0904 17:53:45.909407 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.909568 kubelet[2623]: E0904 17:53:45.909551 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.909568 kubelet[2623]: W0904 17:53:45.909564 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.909665 kubelet[2623]: E0904 17:53:45.909575 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.909894 kubelet[2623]: E0904 17:53:45.909863 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.909894 kubelet[2623]: W0904 17:53:45.909876 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.909894 kubelet[2623]: E0904 17:53:45.909889 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.910180 kubelet[2623]: E0904 17:53:45.910149 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.910180 kubelet[2623]: W0904 17:53:45.910164 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.910180 kubelet[2623]: E0904 17:53:45.910180 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.910653 kubelet[2623]: E0904 17:53:45.910389 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.910653 kubelet[2623]: W0904 17:53:45.910399 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.910653 kubelet[2623]: E0904 17:53:45.910537 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.913674 kubelet[2623]: E0904 17:53:45.913640 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.913674 kubelet[2623]: W0904 17:53:45.913659 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.913674 kubelet[2623]: E0904 17:53:45.913679 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.913674 kubelet[2623]: I0904 17:53:45.913713 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5658969e-2b8a-4734-8694-fff3696c8a14-socket-dir\") pod \"csi-node-driver-w4vzr\" (UID: \"5658969e-2b8a-4734-8694-fff3696c8a14\") " pod="calico-system/csi-node-driver-w4vzr" Sep 4 17:53:45.914085 kubelet[2623]: E0904 17:53:45.913919 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.914085 kubelet[2623]: W0904 17:53:45.913927 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.914085 kubelet[2623]: E0904 17:53:45.913948 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.914085 kubelet[2623]: I0904 17:53:45.914000 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/5658969e-2b8a-4734-8694-fff3696c8a14-varrun\") pod \"csi-node-driver-w4vzr\" (UID: \"5658969e-2b8a-4734-8694-fff3696c8a14\") " pod="calico-system/csi-node-driver-w4vzr" Sep 4 17:53:45.914181 kubelet[2623]: E0904 17:53:45.914106 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.914181 kubelet[2623]: W0904 17:53:45.914115 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.914181 kubelet[2623]: E0904 17:53:45.914127 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.914250 kubelet[2623]: E0904 17:53:45.914240 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.914250 kubelet[2623]: W0904 17:53:45.914248 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.914299 kubelet[2623]: E0904 17:53:45.914260 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.914780 kubelet[2623]: E0904 17:53:45.914629 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.914780 kubelet[2623]: W0904 17:53:45.914642 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.914780 kubelet[2623]: E0904 17:53:45.914669 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.914780 kubelet[2623]: I0904 17:53:45.914693 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5658969e-2b8a-4734-8694-fff3696c8a14-registration-dir\") pod \"csi-node-driver-w4vzr\" (UID: \"5658969e-2b8a-4734-8694-fff3696c8a14\") " pod="calico-system/csi-node-driver-w4vzr" Sep 4 17:53:45.915178 kubelet[2623]: E0904 17:53:45.914895 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.915178 kubelet[2623]: W0904 17:53:45.914905 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.915178 kubelet[2623]: E0904 17:53:45.914939 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.915178 kubelet[2623]: I0904 17:53:45.914962 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jrnz\" (UniqueName: \"kubernetes.io/projected/5658969e-2b8a-4734-8694-fff3696c8a14-kube-api-access-4jrnz\") pod \"csi-node-driver-w4vzr\" (UID: \"5658969e-2b8a-4734-8694-fff3696c8a14\") " pod="calico-system/csi-node-driver-w4vzr" Sep 4 17:53:45.915877 kubelet[2623]: E0904 17:53:45.915863 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.915960 kubelet[2623]: W0904 17:53:45.915948 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.916060 kubelet[2623]: E0904 17:53:45.916048 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.916206 kubelet[2623]: I0904 17:53:45.916135 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5658969e-2b8a-4734-8694-fff3696c8a14-kubelet-dir\") pod \"csi-node-driver-w4vzr\" (UID: \"5658969e-2b8a-4734-8694-fff3696c8a14\") " pod="calico-system/csi-node-driver-w4vzr" Sep 4 17:53:45.916418 kubelet[2623]: E0904 17:53:45.916400 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.916418 kubelet[2623]: W0904 17:53:45.916413 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.916554 kubelet[2623]: E0904 17:53:45.916430 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.917993 kubelet[2623]: E0904 17:53:45.917974 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.917993 kubelet[2623]: W0904 17:53:45.917988 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.918167 kubelet[2623]: E0904 17:53:45.918098 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.918230 kubelet[2623]: E0904 17:53:45.918211 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.918230 kubelet[2623]: W0904 17:53:45.918224 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.918341 kubelet[2623]: E0904 17:53:45.918327 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.918517 kubelet[2623]: E0904 17:53:45.918498 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.918517 kubelet[2623]: W0904 17:53:45.918511 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.918597 kubelet[2623]: E0904 17:53:45.918535 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.918766 kubelet[2623]: E0904 17:53:45.918746 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.918766 kubelet[2623]: W0904 17:53:45.918760 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.918939 kubelet[2623]: E0904 17:53:45.918918 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.919025 kubelet[2623]: E0904 17:53:45.919008 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.919025 kubelet[2623]: W0904 17:53:45.919021 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.919089 kubelet[2623]: E0904 17:53:45.919055 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:45.919483 kubelet[2623]: E0904 17:53:45.919461 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.919483 kubelet[2623]: W0904 17:53:45.919475 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.919557 kubelet[2623]: E0904 17:53:45.919488 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.919667 kubelet[2623]: E0904 17:53:45.919647 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:45.919667 kubelet[2623]: W0904 17:53:45.919660 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:45.919728 kubelet[2623]: E0904 17:53:45.919671 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:45.921397 containerd[1449]: time="2024-09-04T17:53:45.921358349Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7cf89f49dc-pqj8f,Uid:2b729b5f-ebd2-4eea-be0d-5f0bd3f4cbe8,Namespace:calico-system,Attempt:0,}" Sep 4 17:53:45.967772 containerd[1449]: time="2024-09-04T17:53:45.965774994Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 4 17:53:45.967772 containerd[1449]: time="2024-09-04T17:53:45.965855595Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 4 17:53:45.967772 containerd[1449]: time="2024-09-04T17:53:45.965875653Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 4 17:53:45.967772 containerd[1449]: time="2024-09-04T17:53:45.965952376Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 4 17:53:45.989043 systemd[1]: Started cri-containerd-c855f8431987af81e5b88032a027dbb7e5d0cb4b5c9c716602911ce9eac6c7ba.scope - libcontainer container c855f8431987af81e5b88032a027dbb7e5d0cb4b5c9c716602911ce9eac6c7ba. Sep 4 17:53:46.004152 containerd[1449]: time="2024-09-04T17:53:46.003663607Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-88pjh,Uid:e4ba0508-e8ea-451a-83dc-7955cfbe4d70,Namespace:calico-system,Attempt:0,}" Sep 4 17:53:46.017737 kubelet[2623]: E0904 17:53:46.017702 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.018249 kubelet[2623]: W0904 17:53:46.018046 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.018249 kubelet[2623]: E0904 17:53:46.018076 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:46.021568 kubelet[2623]: E0904 17:53:46.020384 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.021568 kubelet[2623]: W0904 17:53:46.020399 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.021568 kubelet[2623]: E0904 17:53:46.020422 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:46.021568 kubelet[2623]: E0904 17:53:46.021097 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.021568 kubelet[2623]: W0904 17:53:46.021107 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.021568 kubelet[2623]: E0904 17:53:46.021121 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:46.021568 kubelet[2623]: E0904 17:53:46.021320 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.021568 kubelet[2623]: W0904 17:53:46.021331 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.021568 kubelet[2623]: E0904 17:53:46.021345 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:46.021568 kubelet[2623]: E0904 17:53:46.021534 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.021908 kubelet[2623]: W0904 17:53:46.021542 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.022317 kubelet[2623]: E0904 17:53:46.022181 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:46.022317 kubelet[2623]: E0904 17:53:46.022278 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.022317 kubelet[2623]: W0904 17:53:46.022286 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.022829 kubelet[2623]: E0904 17:53:46.022774 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.022829 kubelet[2623]: W0904 17:53:46.022786 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.023412 kubelet[2623]: E0904 17:53:46.023270 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.023412 kubelet[2623]: W0904 17:53:46.023282 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.023412 kubelet[2623]: E0904 17:53:46.023296 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:46.023741 kubelet[2623]: E0904 17:53:46.023633 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.023741 kubelet[2623]: W0904 17:53:46.023644 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.023741 kubelet[2623]: E0904 17:53:46.023673 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:46.024286 kubelet[2623]: E0904 17:53:46.024064 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.024286 kubelet[2623]: W0904 17:53:46.024075 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.024286 kubelet[2623]: E0904 17:53:46.024089 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:46.024463 kubelet[2623]: E0904 17:53:46.024321 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:46.024463 kubelet[2623]: E0904 17:53:46.024368 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:46.024663 kubelet[2623]: E0904 17:53:46.024566 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.024663 kubelet[2623]: W0904 17:53:46.024577 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.024663 kubelet[2623]: E0904 17:53:46.024605 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:46.025086 kubelet[2623]: E0904 17:53:46.024949 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.025086 kubelet[2623]: W0904 17:53:46.024968 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.025086 kubelet[2623]: E0904 17:53:46.024986 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:46.025446 kubelet[2623]: E0904 17:53:46.025322 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.025446 kubelet[2623]: W0904 17:53:46.025332 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.025644 kubelet[2623]: E0904 17:53:46.025600 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.025644 kubelet[2623]: E0904 17:53:46.025607 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:46.025644 kubelet[2623]: W0904 17:53:46.025611 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.025857 kubelet[2623]: E0904 17:53:46.025731 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:46.026274 kubelet[2623]: E0904 17:53:46.026209 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.026274 kubelet[2623]: W0904 17:53:46.026247 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.026409 kubelet[2623]: E0904 17:53:46.026315 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:46.026711 kubelet[2623]: E0904 17:53:46.026560 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.026711 kubelet[2623]: W0904 17:53:46.026570 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.026711 kubelet[2623]: E0904 17:53:46.026646 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:46.027093 kubelet[2623]: E0904 17:53:46.026854 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.027093 kubelet[2623]: W0904 17:53:46.026864 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.027093 kubelet[2623]: E0904 17:53:46.026908 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:46.027411 kubelet[2623]: E0904 17:53:46.027127 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.027411 kubelet[2623]: W0904 17:53:46.027136 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.027411 kubelet[2623]: E0904 17:53:46.027156 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:46.028287 kubelet[2623]: E0904 17:53:46.028129 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.028287 kubelet[2623]: W0904 17:53:46.028159 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.028287 kubelet[2623]: E0904 17:53:46.028186 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:46.039059 kubelet[2623]: E0904 17:53:46.038592 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.039059 kubelet[2623]: W0904 17:53:46.038617 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.039059 kubelet[2623]: E0904 17:53:46.038768 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.039059 kubelet[2623]: W0904 17:53:46.038776 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.039059 kubelet[2623]: E0904 17:53:46.038917 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.039059 kubelet[2623]: W0904 17:53:46.038925 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], 
error: executable file not found in $PATH, output: "" Sep 4 17:53:46.039059 kubelet[2623]: E0904 17:53:46.038940 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:46.039059 kubelet[2623]: E0904 17:53:46.038998 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:46.040071 kubelet[2623]: E0904 17:53:46.040058 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.040152 kubelet[2623]: W0904 17:53:46.040141 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.040306 kubelet[2623]: E0904 17:53:46.040206 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:46.040306 kubelet[2623]: E0904 17:53:46.040236 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:46.040817 kubelet[2623]: E0904 17:53:46.040631 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.040881 kubelet[2623]: W0904 17:53:46.040869 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.040935 kubelet[2623]: E0904 17:53:46.040927 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:46.041855 kubelet[2623]: E0904 17:53:46.041833 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.041951 kubelet[2623]: W0904 17:53:46.041940 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.042096 kubelet[2623]: E0904 17:53:46.042051 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:46.062167 containerd[1449]: time="2024-09-04T17:53:46.057961062Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 4 17:53:46.062167 containerd[1449]: time="2024-09-04T17:53:46.058043867Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 4 17:53:46.062167 containerd[1449]: time="2024-09-04T17:53:46.058075947Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 4 17:53:46.062167 containerd[1449]: time="2024-09-04T17:53:46.060070077Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 4 17:53:46.063454 kubelet[2623]: E0904 17:53:46.063439 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:46.063588 kubelet[2623]: W0904 17:53:46.063534 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:46.063588 kubelet[2623]: E0904 17:53:46.063559 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:46.095044 systemd[1]: Started cri-containerd-d71a643ab9c9826353c58c93119fff32ad7ae7e623b8fee1c605ffb31c0c26c0.scope - libcontainer container d71a643ab9c9826353c58c93119fff32ad7ae7e623b8fee1c605ffb31c0c26c0. 
Sep 4 17:53:46.106982 containerd[1449]: time="2024-09-04T17:53:46.106130945Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7cf89f49dc-pqj8f,Uid:2b729b5f-ebd2-4eea-be0d-5f0bd3f4cbe8,Namespace:calico-system,Attempt:0,} returns sandbox id \"c855f8431987af81e5b88032a027dbb7e5d0cb4b5c9c716602911ce9eac6c7ba\"" Sep 4 17:53:46.121485 containerd[1449]: time="2024-09-04T17:53:46.121442429Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.28.1\"" Sep 4 17:53:46.146050 containerd[1449]: time="2024-09-04T17:53:46.145946082Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-88pjh,Uid:e4ba0508-e8ea-451a-83dc-7955cfbe4d70,Namespace:calico-system,Attempt:0,} returns sandbox id \"d71a643ab9c9826353c58c93119fff32ad7ae7e623b8fee1c605ffb31c0c26c0\"" Sep 4 17:53:47.485475 kubelet[2623]: E0904 17:53:47.485423 2623 pod_workers.go:1300] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w4vzr" podUID="5658969e-2b8a-4734-8694-fff3696c8a14" Sep 4 17:53:49.479869 kubelet[2623]: E0904 17:53:49.479621 2623 pod_workers.go:1300] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w4vzr" podUID="5658969e-2b8a-4734-8694-fff3696c8a14" Sep 4 17:53:49.732568 containerd[1449]: time="2024-09-04T17:53:49.731659183Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:53:49.733906 containerd[1449]: time="2024-09-04T17:53:49.733752389Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.28.1: active requests=0, bytes read=29471335" Sep 4 17:53:49.737009 
containerd[1449]: time="2024-09-04T17:53:49.736950616Z" level=info msg="ImageCreate event name:\"sha256:a19ab150adede78dd36481226e260735eb3b811481c6765aec79e8da6ae78b7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:53:49.749825 containerd[1449]: time="2024-09-04T17:53:49.749642708Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d97114d8e1e5186f1180fc8ef5f1309e0a8bf97efce35e0a0223d057d78d95fb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:53:49.751883 containerd[1449]: time="2024-09-04T17:53:49.751647979Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.28.1\" with image id \"sha256:a19ab150adede78dd36481226e260735eb3b811481c6765aec79e8da6ae78b7f\", repo tag \"ghcr.io/flatcar/calico/typha:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d97114d8e1e5186f1180fc8ef5f1309e0a8bf97efce35e0a0223d057d78d95fb\", size \"30963728\" in 3.630161337s" Sep 4 17:53:49.751883 containerd[1449]: time="2024-09-04T17:53:49.751681482Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.28.1\" returns image reference \"sha256:a19ab150adede78dd36481226e260735eb3b811481c6765aec79e8da6ae78b7f\"" Sep 4 17:53:49.753219 containerd[1449]: time="2024-09-04T17:53:49.753058224Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.1\"" Sep 4 17:53:49.782092 containerd[1449]: time="2024-09-04T17:53:49.779775167Z" level=info msg="CreateContainer within sandbox \"c855f8431987af81e5b88032a027dbb7e5d0cb4b5c9c716602911ce9eac6c7ba\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 4 17:53:49.809206 containerd[1449]: time="2024-09-04T17:53:49.809144394Z" level=info msg="CreateContainer within sandbox \"c855f8431987af81e5b88032a027dbb7e5d0cb4b5c9c716602911ce9eac6c7ba\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"77f645d3a22c296c7155ef4cb01a2823363923cc68d6b029fd74d50083598463\"" Sep 4 17:53:49.810303 containerd[1449]: 
time="2024-09-04T17:53:49.810272039Z" level=info msg="StartContainer for \"77f645d3a22c296c7155ef4cb01a2823363923cc68d6b029fd74d50083598463\"" Sep 4 17:53:49.881015 systemd[1]: Started cri-containerd-77f645d3a22c296c7155ef4cb01a2823363923cc68d6b029fd74d50083598463.scope - libcontainer container 77f645d3a22c296c7155ef4cb01a2823363923cc68d6b029fd74d50083598463. Sep 4 17:53:50.019346 containerd[1449]: time="2024-09-04T17:53:50.019149788Z" level=info msg="StartContainer for \"77f645d3a22c296c7155ef4cb01a2823363923cc68d6b029fd74d50083598463\" returns successfully" Sep 4 17:53:50.650500 kubelet[2623]: E0904 17:53:50.650440 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.650500 kubelet[2623]: W0904 17:53:50.650465 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.650500 kubelet[2623]: E0904 17:53:50.650502 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:50.652940 kubelet[2623]: E0904 17:53:50.650773 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.652940 kubelet[2623]: W0904 17:53:50.650783 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.652940 kubelet[2623]: E0904 17:53:50.650814 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:50.652940 kubelet[2623]: E0904 17:53:50.651002 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.652940 kubelet[2623]: W0904 17:53:50.651011 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.652940 kubelet[2623]: E0904 17:53:50.651026 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:50.652940 kubelet[2623]: E0904 17:53:50.651316 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.652940 kubelet[2623]: W0904 17:53:50.651326 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.652940 kubelet[2623]: E0904 17:53:50.651339 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:50.652940 kubelet[2623]: E0904 17:53:50.651531 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.653592 kubelet[2623]: W0904 17:53:50.651540 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.653592 kubelet[2623]: E0904 17:53:50.651553 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:50.653592 kubelet[2623]: E0904 17:53:50.651925 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.653592 kubelet[2623]: W0904 17:53:50.651935 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.653592 kubelet[2623]: E0904 17:53:50.651949 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:50.653592 kubelet[2623]: E0904 17:53:50.652073 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.653592 kubelet[2623]: W0904 17:53:50.652081 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.653592 kubelet[2623]: E0904 17:53:50.652093 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:50.653592 kubelet[2623]: E0904 17:53:50.652203 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.653592 kubelet[2623]: W0904 17:53:50.652211 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.654424 kubelet[2623]: E0904 17:53:50.652222 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:50.654424 kubelet[2623]: E0904 17:53:50.652347 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.654424 kubelet[2623]: W0904 17:53:50.652354 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.654424 kubelet[2623]: E0904 17:53:50.652365 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:50.654424 kubelet[2623]: E0904 17:53:50.652476 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.654424 kubelet[2623]: W0904 17:53:50.652484 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.654424 kubelet[2623]: E0904 17:53:50.652495 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:50.654424 kubelet[2623]: E0904 17:53:50.652647 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.654424 kubelet[2623]: W0904 17:53:50.652658 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.654424 kubelet[2623]: E0904 17:53:50.652670 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:50.655329 kubelet[2623]: E0904 17:53:50.652824 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.655329 kubelet[2623]: W0904 17:53:50.652832 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.655329 kubelet[2623]: E0904 17:53:50.652852 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:50.655329 kubelet[2623]: E0904 17:53:50.653030 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.655329 kubelet[2623]: W0904 17:53:50.653039 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.655329 kubelet[2623]: E0904 17:53:50.653051 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:50.655329 kubelet[2623]: E0904 17:53:50.653218 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.655329 kubelet[2623]: W0904 17:53:50.653227 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.655329 kubelet[2623]: E0904 17:53:50.653240 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:50.655329 kubelet[2623]: E0904 17:53:50.653413 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.656461 kubelet[2623]: W0904 17:53:50.653421 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.656461 kubelet[2623]: E0904 17:53:50.653435 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:50.656461 kubelet[2623]: E0904 17:53:50.655019 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.656461 kubelet[2623]: W0904 17:53:50.655030 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.656461 kubelet[2623]: E0904 17:53:50.655044 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:53:50.656461 kubelet[2623]: E0904 17:53:50.656103 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.656461 kubelet[2623]: W0904 17:53:50.656115 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.656461 kubelet[2623]: E0904 17:53:50.656130 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:53:50.656461 kubelet[2623]: E0904 17:53:50.656418 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:53:50.656461 kubelet[2623]: W0904 17:53:50.656427 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:53:50.657445 kubelet[2623]: E0904 17:53:50.656455 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 4 17:53:50.663437 kubelet[2623]: E0904 17:53:50.663407 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:50.663437 kubelet[2623]: W0904 17:53:50.663424 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:50.663437 kubelet[2623]: E0904 17:53:50.663445 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:50.747468 kubelet[2623]: I0904 17:53:50.747313 2623 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-typha-7cf89f49dc-pqj8f" podStartSLOduration=2.114094945 podCreationTimestamp="2024-09-04 17:53:45 +0000 UTC" firstStartedPulling="2024-09-04 17:53:46.119156603 +0000 UTC m=+22.881802658" lastFinishedPulling="2024-09-04 17:53:49.752231734 +0000 UTC m=+26.514877779" observedRunningTime="2024-09-04 17:53:50.74603634 +0000 UTC m=+27.508682435" watchObservedRunningTime="2024-09-04 17:53:50.747170066 +0000 UTC m=+27.509816232"
Sep 4 17:53:51.478563 kubelet[2623]: E0904 17:53:51.478316 2623 pod_workers.go:1300] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w4vzr" podUID="5658969e-2b8a-4734-8694-fff3696c8a14"
Sep 4 17:53:51.615813 kubelet[2623]: I0904 17:53:51.613602 2623 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 4 17:53:51.664455 kubelet[2623]: E0904 17:53:51.664415 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 
17:53:51.665185 kubelet[2623]: W0904 17:53:51.665121 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:51.665371 kubelet[2623]: E0904 17:53:51.665346 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 4 17:53:51.688630 kubelet[2623]: E0904 17:53:51.688612 2623 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 4 17:53:51.688630 kubelet[2623]: W0904 17:53:51.688626 2623 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 4 17:53:51.688719 kubelet[2623]: E0904 17:53:51.688640 2623 plugins.go:723] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 4 17:53:51.826783 containerd[1449]: time="2024-09-04T17:53:51.826101556Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 17:53:51.830173 containerd[1449]: time="2024-09-04T17:53:51.829588977Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.1: active requests=0, bytes read=5141007"
Sep 4 17:53:51.831146 containerd[1449]: time="2024-09-04T17:53:51.830902471Z" level=info msg="ImageCreate event name:\"sha256:00564b1c843430f804fda219f98769c25b538adebc11504477d5ee331fd8f85b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 17:53:51.833522 containerd[1449]: time="2024-09-04T17:53:51.833485434Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:7938ad0cb2b49a32937962cc40dd826ad5858999c603bdf5fbf2092a4d50cf01\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 17:53:51.834330 containerd[1449]: time="2024-09-04T17:53:51.834302827Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.1\" with image id \"sha256:00564b1c843430f804fda219f98769c25b538adebc11504477d5ee331fd8f85b\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:7938ad0cb2b49a32937962cc40dd826ad5858999c603bdf5fbf2092a4d50cf01\", size \"6633368\" in 2.081208776s" Sep 4 17:53:51.834519 containerd[1449]: time="2024-09-04T17:53:51.834433913Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.1\" returns image reference \"sha256:00564b1c843430f804fda219f98769c25b538adebc11504477d5ee331fd8f85b\"" Sep 4 17:53:51.837622 containerd[1449]: time="2024-09-04T17:53:51.837586806Z" level=info msg="CreateContainer within sandbox \"d71a643ab9c9826353c58c93119fff32ad7ae7e623b8fee1c605ffb31c0c26c0\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 4 17:53:51.857660 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount69544476.mount: Deactivated successfully. Sep 4 17:53:51.861817 containerd[1449]: time="2024-09-04T17:53:51.861718511Z" level=info msg="CreateContainer within sandbox \"d71a643ab9c9826353c58c93119fff32ad7ae7e623b8fee1c605ffb31c0c26c0\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ef7340ddd40d2fa6430c30364929637ff2c07d15b98091695b2e4330257d785b\"" Sep 4 17:53:51.863720 containerd[1449]: time="2024-09-04T17:53:51.862769172Z" level=info msg="StartContainer for \"ef7340ddd40d2fa6430c30364929637ff2c07d15b98091695b2e4330257d785b\"" Sep 4 17:53:51.903970 systemd[1]: Started cri-containerd-ef7340ddd40d2fa6430c30364929637ff2c07d15b98091695b2e4330257d785b.scope - libcontainer container ef7340ddd40d2fa6430c30364929637ff2c07d15b98091695b2e4330257d785b. Sep 4 17:53:51.941097 containerd[1449]: time="2024-09-04T17:53:51.941045158Z" level=info msg="StartContainer for \"ef7340ddd40d2fa6430c30364929637ff2c07d15b98091695b2e4330257d785b\" returns successfully" Sep 4 17:53:51.952508 systemd[1]: cri-containerd-ef7340ddd40d2fa6430c30364929637ff2c07d15b98091695b2e4330257d785b.scope: Deactivated successfully. 
Sep 4 17:53:51.981345 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ef7340ddd40d2fa6430c30364929637ff2c07d15b98091695b2e4330257d785b-rootfs.mount: Deactivated successfully. Sep 4 17:53:52.424385 containerd[1449]: time="2024-09-04T17:53:52.424043731Z" level=info msg="shim disconnected" id=ef7340ddd40d2fa6430c30364929637ff2c07d15b98091695b2e4330257d785b namespace=k8s.io Sep 4 17:53:52.424385 containerd[1449]: time="2024-09-04T17:53:52.424170859Z" level=warning msg="cleaning up after shim disconnected" id=ef7340ddd40d2fa6430c30364929637ff2c07d15b98091695b2e4330257d785b namespace=k8s.io Sep 4 17:53:52.424385 containerd[1449]: time="2024-09-04T17:53:52.424208650Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 4 17:53:52.622579 containerd[1449]: time="2024-09-04T17:53:52.622458152Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.28.1\"" Sep 4 17:53:53.479895 kubelet[2623]: E0904 17:53:53.478957 2623 pod_workers.go:1300] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w4vzr" podUID="5658969e-2b8a-4734-8694-fff3696c8a14" Sep 4 17:53:55.482830 kubelet[2623]: E0904 17:53:55.481519 2623 pod_workers.go:1300] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w4vzr" podUID="5658969e-2b8a-4734-8694-fff3696c8a14" Sep 4 17:53:56.482999 kubelet[2623]: I0904 17:53:56.482901 2623 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 4 17:53:57.479998 kubelet[2623]: E0904 17:53:57.479588 2623 pod_workers.go:1300] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w4vzr" podUID="5658969e-2b8a-4734-8694-fff3696c8a14" Sep 4 17:53:58.569671 containerd[1449]: time="2024-09-04T17:53:58.568737081Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:53:58.570384 containerd[1449]: time="2024-09-04T17:53:58.570348693Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.28.1: active requests=0, bytes read=93083736" Sep 4 17:53:58.571536 containerd[1449]: time="2024-09-04T17:53:58.571511561Z" level=info msg="ImageCreate event name:\"sha256:f6d76a1259a8c22fd1c603577ee5bb8109bc40f2b3d0536d39160a027ffe9bab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:53:58.575183 containerd[1449]: time="2024-09-04T17:53:58.575154206Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:1cf32b2159ec9f938e747b82b9b7c74e26e17eb220e002a6a1bd6b5b1266e1fa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:53:58.576588 containerd[1449]: time="2024-09-04T17:53:58.576522818Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.28.1\" with image id \"sha256:f6d76a1259a8c22fd1c603577ee5bb8109bc40f2b3d0536d39160a027ffe9bab\", repo tag \"ghcr.io/flatcar/calico/cni:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:1cf32b2159ec9f938e747b82b9b7c74e26e17eb220e002a6a1bd6b5b1266e1fa\", size \"94576137\" in 5.953973708s" Sep 4 17:53:58.576653 containerd[1449]: time="2024-09-04T17:53:58.576591640Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.28.1\" returns image reference \"sha256:f6d76a1259a8c22fd1c603577ee5bb8109bc40f2b3d0536d39160a027ffe9bab\"" Sep 4 17:53:58.581022 containerd[1449]: time="2024-09-04T17:53:58.580967173Z" level=info msg="CreateContainer within sandbox \"d71a643ab9c9826353c58c93119fff32ad7ae7e623b8fee1c605ffb31c0c26c0\" for 
container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 4 17:53:58.640457 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1374164630.mount: Deactivated successfully. Sep 4 17:53:58.656494 containerd[1449]: time="2024-09-04T17:53:58.656240549Z" level=info msg="CreateContainer within sandbox \"d71a643ab9c9826353c58c93119fff32ad7ae7e623b8fee1c605ffb31c0c26c0\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"9235a1c306fe8d68d96f53825836713ffff7bcc347038cf2d649b07b12ddc8ac\"" Sep 4 17:53:58.659911 containerd[1449]: time="2024-09-04T17:53:58.659631349Z" level=info msg="StartContainer for \"9235a1c306fe8d68d96f53825836713ffff7bcc347038cf2d649b07b12ddc8ac\"" Sep 4 17:53:58.790012 systemd[1]: Started cri-containerd-9235a1c306fe8d68d96f53825836713ffff7bcc347038cf2d649b07b12ddc8ac.scope - libcontainer container 9235a1c306fe8d68d96f53825836713ffff7bcc347038cf2d649b07b12ddc8ac. Sep 4 17:53:58.978222 containerd[1449]: time="2024-09-04T17:53:58.978076048Z" level=info msg="StartContainer for \"9235a1c306fe8d68d96f53825836713ffff7bcc347038cf2d649b07b12ddc8ac\" returns successfully" Sep 4 17:53:59.479049 kubelet[2623]: E0904 17:53:59.478265 2623 pod_workers.go:1300] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-w4vzr" podUID="5658969e-2b8a-4734-8694-fff3696c8a14" Sep 4 17:54:00.803642 systemd[1]: cri-containerd-9235a1c306fe8d68d96f53825836713ffff7bcc347038cf2d649b07b12ddc8ac.scope: Deactivated successfully. Sep 4 17:54:00.864683 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9235a1c306fe8d68d96f53825836713ffff7bcc347038cf2d649b07b12ddc8ac-rootfs.mount: Deactivated successfully. 
Sep 4 17:54:00.960599 kubelet[2623]: I0904 17:54:00.960532 2623 kubelet_node_status.go:493] "Fast updating node status as it just became ready" Sep 4 17:54:01.151414 containerd[1449]: time="2024-09-04T17:54:01.151013700Z" level=info msg="shim disconnected" id=9235a1c306fe8d68d96f53825836713ffff7bcc347038cf2d649b07b12ddc8ac namespace=k8s.io Sep 4 17:54:01.151414 containerd[1449]: time="2024-09-04T17:54:01.151260909Z" level=warning msg="cleaning up after shim disconnected" id=9235a1c306fe8d68d96f53825836713ffff7bcc347038cf2d649b07b12ddc8ac namespace=k8s.io Sep 4 17:54:01.151414 containerd[1449]: time="2024-09-04T17:54:01.151329040Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 4 17:54:01.191949 kubelet[2623]: I0904 17:54:01.191871 2623 topology_manager.go:215] "Topology Admit Handler" podUID="8ad7d24b-d7a2-481f-abff-0aa847f03938" podNamespace="kube-system" podName="coredns-5dd5756b68-4dlsd" Sep 4 17:54:01.196488 kubelet[2623]: I0904 17:54:01.193902 2623 topology_manager.go:215] "Topology Admit Handler" podUID="d75f0bb3-43d7-4901-a7b6-5a7cef236d47" podNamespace="kube-system" podName="coredns-5dd5756b68-zk8pk" Sep 4 17:54:01.222663 containerd[1449]: time="2024-09-04T17:54:01.222320907Z" level=warning msg="cleanup warnings time=\"2024-09-04T17:54:01Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Sep 4 17:54:01.226902 kubelet[2623]: I0904 17:54:01.226853 2623 topology_manager.go:215] "Topology Admit Handler" podUID="82b31caf-8658-42af-9e3a-ea4def2ad1f0" podNamespace="calico-system" podName="calico-kube-controllers-8d44d4874-qmf4t" Sep 4 17:54:01.228156 systemd[1]: Created slice kubepods-burstable-pod8ad7d24b_d7a2_481f_abff_0aa847f03938.slice - libcontainer container kubepods-burstable-pod8ad7d24b_d7a2_481f_abff_0aa847f03938.slice. 
Sep 4 17:54:01.239913 systemd[1]: Created slice kubepods-burstable-podd75f0bb3_43d7_4901_a7b6_5a7cef236d47.slice - libcontainer container kubepods-burstable-podd75f0bb3_43d7_4901_a7b6_5a7cef236d47.slice. Sep 4 17:54:01.252203 systemd[1]: Created slice kubepods-besteffort-pod82b31caf_8658_42af_9e3a_ea4def2ad1f0.slice - libcontainer container kubepods-besteffort-pod82b31caf_8658_42af_9e3a_ea4def2ad1f0.slice. Sep 4 17:54:01.255621 kubelet[2623]: I0904 17:54:01.255397 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwvzq\" (UniqueName: \"kubernetes.io/projected/82b31caf-8658-42af-9e3a-ea4def2ad1f0-kube-api-access-zwvzq\") pod \"calico-kube-controllers-8d44d4874-qmf4t\" (UID: \"82b31caf-8658-42af-9e3a-ea4def2ad1f0\") " pod="calico-system/calico-kube-controllers-8d44d4874-qmf4t" Sep 4 17:54:01.255621 kubelet[2623]: I0904 17:54:01.255459 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82b31caf-8658-42af-9e3a-ea4def2ad1f0-tigera-ca-bundle\") pod \"calico-kube-controllers-8d44d4874-qmf4t\" (UID: \"82b31caf-8658-42af-9e3a-ea4def2ad1f0\") " pod="calico-system/calico-kube-controllers-8d44d4874-qmf4t" Sep 4 17:54:01.255621 kubelet[2623]: I0904 17:54:01.255494 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kslx\" (UniqueName: \"kubernetes.io/projected/8ad7d24b-d7a2-481f-abff-0aa847f03938-kube-api-access-2kslx\") pod \"coredns-5dd5756b68-4dlsd\" (UID: \"8ad7d24b-d7a2-481f-abff-0aa847f03938\") " pod="kube-system/coredns-5dd5756b68-4dlsd" Sep 4 17:54:01.255621 kubelet[2623]: I0904 17:54:01.255535 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d75f0bb3-43d7-4901-a7b6-5a7cef236d47-config-volume\") pod 
\"coredns-5dd5756b68-zk8pk\" (UID: \"d75f0bb3-43d7-4901-a7b6-5a7cef236d47\") " pod="kube-system/coredns-5dd5756b68-zk8pk" Sep 4 17:54:01.255621 kubelet[2623]: I0904 17:54:01.255569 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx88v\" (UniqueName: \"kubernetes.io/projected/d75f0bb3-43d7-4901-a7b6-5a7cef236d47-kube-api-access-kx88v\") pod \"coredns-5dd5756b68-zk8pk\" (UID: \"d75f0bb3-43d7-4901-a7b6-5a7cef236d47\") " pod="kube-system/coredns-5dd5756b68-zk8pk" Sep 4 17:54:01.255881 kubelet[2623]: I0904 17:54:01.255600 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ad7d24b-d7a2-481f-abff-0aa847f03938-config-volume\") pod \"coredns-5dd5756b68-4dlsd\" (UID: \"8ad7d24b-d7a2-481f-abff-0aa847f03938\") " pod="kube-system/coredns-5dd5756b68-4dlsd" Sep 4 17:54:01.492153 systemd[1]: Created slice kubepods-besteffort-pod5658969e_2b8a_4734_8694_fff3696c8a14.slice - libcontainer container kubepods-besteffort-pod5658969e_2b8a_4734_8694_fff3696c8a14.slice. 
Sep 4 17:54:01.519705 containerd[1449]: time="2024-09-04T17:54:01.519589302Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-w4vzr,Uid:5658969e-2b8a-4734-8694-fff3696c8a14,Namespace:calico-system,Attempt:0,}" Sep 4 17:54:01.536792 containerd[1449]: time="2024-09-04T17:54:01.536710792Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-5dd5756b68-4dlsd,Uid:8ad7d24b-d7a2-481f-abff-0aa847f03938,Namespace:kube-system,Attempt:0,}" Sep 4 17:54:01.556350 containerd[1449]: time="2024-09-04T17:54:01.556270017Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-5dd5756b68-zk8pk,Uid:d75f0bb3-43d7-4901-a7b6-5a7cef236d47,Namespace:kube-system,Attempt:0,}" Sep 4 17:54:01.566156 containerd[1449]: time="2024-09-04T17:54:01.565207778Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8d44d4874-qmf4t,Uid:82b31caf-8658-42af-9e3a-ea4def2ad1f0,Namespace:calico-system,Attempt:0,}" Sep 4 17:54:01.782478 containerd[1449]: time="2024-09-04T17:54:01.782335630Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.28.1\"" Sep 4 17:54:02.001216 containerd[1449]: time="2024-09-04T17:54:02.000999223Z" level=error msg="Failed to destroy network for sandbox \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.001216 containerd[1449]: time="2024-09-04T17:54:02.001073726Z" level=error msg="Failed to destroy network for sandbox \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.003488 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d-shm.mount: Deactivated successfully. Sep 4 17:54:02.003621 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63-shm.mount: Deactivated successfully. Sep 4 17:54:02.009380 containerd[1449]: time="2024-09-04T17:54:02.009023972Z" level=error msg="encountered an error cleaning up failed sandbox \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.009380 containerd[1449]: time="2024-09-04T17:54:02.009146590Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-5dd5756b68-zk8pk,Uid:d75f0bb3-43d7-4901-a7b6-5a7cef236d47,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.010101 containerd[1449]: time="2024-09-04T17:54:02.010051241Z" level=error msg="encountered an error cleaning up failed sandbox \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.010188 containerd[1449]: time="2024-09-04T17:54:02.010148204Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-w4vzr,Uid:5658969e-2b8a-4734-8694-fff3696c8a14,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.017669 containerd[1449]: time="2024-09-04T17:54:02.017492110Z" level=error msg="Failed to destroy network for sandbox \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.019087 containerd[1449]: time="2024-09-04T17:54:02.018308583Z" level=error msg="encountered an error cleaning up failed sandbox \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.019087 containerd[1449]: time="2024-09-04T17:54:02.018411556Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8d44d4874-qmf4t,Uid:82b31caf-8658-42af-9e3a-ea4def2ad1f0,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.019087 containerd[1449]: time="2024-09-04T17:54:02.018919040Z" level=error msg="Failed to destroy network for sandbox 
\"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.020268 kubelet[2623]: E0904 17:54:02.019454 2623 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.020268 kubelet[2623]: E0904 17:54:02.019526 2623 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8d44d4874-qmf4t" Sep 4 17:54:02.020268 kubelet[2623]: E0904 17:54:02.019558 2623 kuberuntime_manager.go:1171] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8d44d4874-qmf4t" Sep 4 17:54:02.022215 kubelet[2623]: E0904 17:54:02.019626 2623 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8d44d4874-qmf4t_calico-system(82b31caf-8658-42af-9e3a-ea4def2ad1f0)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8d44d4874-qmf4t_calico-system(82b31caf-8658-42af-9e3a-ea4def2ad1f0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8d44d4874-qmf4t" podUID="82b31caf-8658-42af-9e3a-ea4def2ad1f0" Sep 4 17:54:02.022215 kubelet[2623]: E0904 17:54:02.019979 2623 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.022215 kubelet[2623]: E0904 17:54:02.020009 2623 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-5dd5756b68-zk8pk" Sep 4 17:54:02.025313 containerd[1449]: time="2024-09-04T17:54:02.021217933Z" level=error msg="encountered an error cleaning up failed sandbox \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.025313 
containerd[1449]: time="2024-09-04T17:54:02.021303255Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-5dd5756b68-4dlsd,Uid:8ad7d24b-d7a2-481f-abff-0aa847f03938,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.024174 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3-shm.mount: Deactivated successfully. Sep 4 17:54:02.026095 kubelet[2623]: E0904 17:54:02.020033 2623 kuberuntime_manager.go:1171] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-5dd5756b68-zk8pk" Sep 4 17:54:02.026095 kubelet[2623]: E0904 17:54:02.020089 2623 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-5dd5756b68-zk8pk_kube-system(d75f0bb3-43d7-4901-a7b6-5a7cef236d47)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-5dd5756b68-zk8pk_kube-system(d75f0bb3-43d7-4901-a7b6-5a7cef236d47)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5dd5756b68-zk8pk" podUID="d75f0bb3-43d7-4901-a7b6-5a7cef236d47" Sep 
4 17:54:02.026095 kubelet[2623]: E0904 17:54:02.020130 2623 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.026230 kubelet[2623]: E0904 17:54:02.020152 2623 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-w4vzr" Sep 4 17:54:02.026230 kubelet[2623]: E0904 17:54:02.020172 2623 kuberuntime_manager.go:1171] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-w4vzr" Sep 4 17:54:02.026230 kubelet[2623]: E0904 17:54:02.020212 2623 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-w4vzr_calico-system(5658969e-2b8a-4734-8694-fff3696c8a14)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-w4vzr_calico-system(5658969e-2b8a-4734-8694-fff3696c8a14)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w4vzr" podUID="5658969e-2b8a-4734-8694-fff3696c8a14" Sep 4 17:54:02.026345 kubelet[2623]: E0904 17:54:02.022157 2623 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.026345 kubelet[2623]: E0904 17:54:02.024990 2623 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-5dd5756b68-4dlsd" Sep 4 17:54:02.026345 kubelet[2623]: E0904 17:54:02.025059 2623 kuberuntime_manager.go:1171] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-5dd5756b68-4dlsd" Sep 4 17:54:02.026431 kubelet[2623]: E0904 17:54:02.025490 2623 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-5dd5756b68-4dlsd_kube-system(8ad7d24b-d7a2-481f-abff-0aa847f03938)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"coredns-5dd5756b68-4dlsd_kube-system(8ad7d24b-d7a2-481f-abff-0aa847f03938)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5dd5756b68-4dlsd" podUID="8ad7d24b-d7a2-481f-abff-0aa847f03938" Sep 4 17:54:02.027829 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd-shm.mount: Deactivated successfully. Sep 4 17:54:02.788871 kubelet[2623]: I0904 17:54:02.786089 2623 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Sep 4 17:54:02.789989 containerd[1449]: time="2024-09-04T17:54:02.789862900Z" level=info msg="StopPodSandbox for \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\"" Sep 4 17:54:02.795729 containerd[1449]: time="2024-09-04T17:54:02.794434119Z" level=info msg="Ensure that sandbox 7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63 in task-service has been cleanup successfully" Sep 4 17:54:02.799556 kubelet[2623]: I0904 17:54:02.798601 2623 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Sep 4 17:54:02.803897 containerd[1449]: time="2024-09-04T17:54:02.803790699Z" level=info msg="StopPodSandbox for \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\"" Sep 4 17:54:02.804261 kubelet[2623]: I0904 17:54:02.802145 2623 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Sep 4 17:54:02.804482 containerd[1449]: time="2024-09-04T17:54:02.804168030Z" 
level=info msg="Ensure that sandbox 602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3 in task-service has been cleanup successfully" Sep 4 17:54:02.805184 containerd[1449]: time="2024-09-04T17:54:02.805096684Z" level=info msg="StopPodSandbox for \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\"" Sep 4 17:54:02.805556 containerd[1449]: time="2024-09-04T17:54:02.805466753Z" level=info msg="Ensure that sandbox 2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d in task-service has been cleanup successfully" Sep 4 17:54:02.821965 kubelet[2623]: I0904 17:54:02.819244 2623 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Sep 4 17:54:02.824140 containerd[1449]: time="2024-09-04T17:54:02.823872555Z" level=info msg="StopPodSandbox for \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\"" Sep 4 17:54:02.828055 containerd[1449]: time="2024-09-04T17:54:02.827243027Z" level=info msg="Ensure that sandbox cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd in task-service has been cleanup successfully" Sep 4 17:54:02.900096 containerd[1449]: time="2024-09-04T17:54:02.900038951Z" level=error msg="StopPodSandbox for \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\" failed" error="failed to destroy network for sandbox \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.902355 kubelet[2623]: E0904 17:54:02.902146 2623 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\": plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Sep 4 17:54:02.902355 kubelet[2623]: E0904 17:54:02.902241 2623 kuberuntime_manager.go:1380] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd"} Sep 4 17:54:02.902355 kubelet[2623]: E0904 17:54:02.902290 2623 kuberuntime_manager.go:1080] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8ad7d24b-d7a2-481f-abff-0aa847f03938\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 4 17:54:02.902355 kubelet[2623]: E0904 17:54:02.902327 2623 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8ad7d24b-d7a2-481f-abff-0aa847f03938\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5dd5756b68-4dlsd" podUID="8ad7d24b-d7a2-481f-abff-0aa847f03938" Sep 4 17:54:02.902877 containerd[1449]: time="2024-09-04T17:54:02.902742766Z" level=error msg="StopPodSandbox for \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\" failed" error="failed to destroy network for sandbox \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\": plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.902877 containerd[1449]: time="2024-09-04T17:54:02.902780813Z" level=error msg="StopPodSandbox for \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\" failed" error="failed to destroy network for sandbox \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.903219 kubelet[2623]: E0904 17:54:02.903031 2623 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Sep 4 17:54:02.903219 kubelet[2623]: E0904 17:54:02.903063 2623 kuberuntime_manager.go:1380] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63"} Sep 4 17:54:02.903219 kubelet[2623]: E0904 17:54:02.903099 2623 kuberuntime_manager.go:1080] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"5658969e-2b8a-4734-8694-fff3696c8a14\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 4 17:54:02.903219 kubelet[2623]: E0904 
17:54:02.903120 2623 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Sep 4 17:54:02.903357 kubelet[2623]: E0904 17:54:02.903132 2623 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"5658969e-2b8a-4734-8694-fff3696c8a14\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-w4vzr" podUID="5658969e-2b8a-4734-8694-fff3696c8a14" Sep 4 17:54:02.903357 kubelet[2623]: E0904 17:54:02.903162 2623 kuberuntime_manager.go:1380] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3"} Sep 4 17:54:02.903357 kubelet[2623]: E0904 17:54:02.903216 2623 kuberuntime_manager.go:1080] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"82b31caf-8658-42af-9e3a-ea4def2ad1f0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 4 17:54:02.903357 kubelet[2623]: E0904 17:54:02.903256 2623 
pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"82b31caf-8658-42af-9e3a-ea4def2ad1f0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8d44d4874-qmf4t" podUID="82b31caf-8658-42af-9e3a-ea4def2ad1f0" Sep 4 17:54:02.908443 containerd[1449]: time="2024-09-04T17:54:02.907984599Z" level=error msg="StopPodSandbox for \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\" failed" error="failed to destroy network for sandbox \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:54:02.908575 kubelet[2623]: E0904 17:54:02.908266 2623 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Sep 4 17:54:02.908575 kubelet[2623]: E0904 17:54:02.908312 2623 kuberuntime_manager.go:1380] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d"} Sep 4 17:54:02.908575 kubelet[2623]: E0904 17:54:02.908355 2623 kuberuntime_manager.go:1080] "killPodWithSyncResult failed" err="failed to 
\"KillPodSandbox\" for \"d75f0bb3-43d7-4901-a7b6-5a7cef236d47\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 4 17:54:02.908575 kubelet[2623]: E0904 17:54:02.908388 2623 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d75f0bb3-43d7-4901-a7b6-5a7cef236d47\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-5dd5756b68-zk8pk" podUID="d75f0bb3-43d7-4901-a7b6-5a7cef236d47" Sep 4 17:54:10.044512 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1480461720.mount: Deactivated successfully. 
Sep 4 17:54:10.097994 containerd[1449]: time="2024-09-04T17:54:10.097836609Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:54:10.113823 containerd[1449]: time="2024-09-04T17:54:10.100060330Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.28.1: active requests=0, bytes read=117873564" Sep 4 17:54:10.113823 containerd[1449]: time="2024-09-04T17:54:10.112667848Z" level=info msg="ImageCreate event name:\"sha256:8bbeb9e1ee3287b8f750c10383f53fa1ec6f942aaea2a900f666d5e4e63cf4cc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:54:10.117894 containerd[1449]: time="2024-09-04T17:54:10.117860589Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:47908d8b3046dadd6fbea273ac5b0b9bb803cc7b58b9114c50bf7591767d2744\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:54:10.119540 containerd[1449]: time="2024-09-04T17:54:10.119475214Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.28.1\" with image id \"sha256:8bbeb9e1ee3287b8f750c10383f53fa1ec6f942aaea2a900f666d5e4e63cf4cc\", repo tag \"ghcr.io/flatcar/calico/node:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:47908d8b3046dadd6fbea273ac5b0b9bb803cc7b58b9114c50bf7591767d2744\", size \"117873426\" in 8.337062676s" Sep 4 17:54:10.119540 containerd[1449]: time="2024-09-04T17:54:10.119539409Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.28.1\" returns image reference \"sha256:8bbeb9e1ee3287b8f750c10383f53fa1ec6f942aaea2a900f666d5e4e63cf4cc\"" Sep 4 17:54:10.156100 containerd[1449]: time="2024-09-04T17:54:10.156055874Z" level=info msg="CreateContainer within sandbox \"d71a643ab9c9826353c58c93119fff32ad7ae7e623b8fee1c605ffb31c0c26c0\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 4 17:54:10.185858 containerd[1449]: time="2024-09-04T17:54:10.185773771Z" level=info msg="CreateContainer 
within sandbox \"d71a643ab9c9826353c58c93119fff32ad7ae7e623b8fee1c605ffb31c0c26c0\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"c83f0c9004580aee850b032b930c583362e2ef701167bce4a589f618dc358fe0\"" Sep 4 17:54:10.186883 containerd[1449]: time="2024-09-04T17:54:10.186860736Z" level=info msg="StartContainer for \"c83f0c9004580aee850b032b930c583362e2ef701167bce4a589f618dc358fe0\"" Sep 4 17:54:10.236250 systemd[1]: Started cri-containerd-c83f0c9004580aee850b032b930c583362e2ef701167bce4a589f618dc358fe0.scope - libcontainer container c83f0c9004580aee850b032b930c583362e2ef701167bce4a589f618dc358fe0. Sep 4 17:54:10.512134 containerd[1449]: time="2024-09-04T17:54:10.509912760Z" level=info msg="StartContainer for \"c83f0c9004580aee850b032b930c583362e2ef701167bce4a589f618dc358fe0\" returns successfully" Sep 4 17:54:10.599909 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 4 17:54:10.601002 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 4 17:54:10.915949 kubelet[2623]: I0904 17:54:10.913171 2623 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-node-88pjh" podStartSLOduration=1.9420630810000001 podCreationTimestamp="2024-09-04 17:53:45 +0000 UTC" firstStartedPulling="2024-09-04 17:53:46.148914227 +0000 UTC m=+22.911560272" lastFinishedPulling="2024-09-04 17:54:10.119972918 +0000 UTC m=+46.882618963" observedRunningTime="2024-09-04 17:54:10.90921278 +0000 UTC m=+47.671858835" watchObservedRunningTime="2024-09-04 17:54:10.913121772 +0000 UTC m=+47.675767817" Sep 4 17:54:11.902098 systemd[1]: run-containerd-runc-k8s.io-c83f0c9004580aee850b032b930c583362e2ef701167bce4a589f618dc358fe0-runc.oiovWS.mount: Deactivated successfully. 
Sep 4 17:54:12.709849 kernel: bpftool[3871]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Sep 4 17:54:13.015972 systemd-networkd[1348]: vxlan.calico: Link UP Sep 4 17:54:13.015980 systemd-networkd[1348]: vxlan.calico: Gained carrier Sep 4 17:54:13.514458 containerd[1449]: time="2024-09-04T17:54:13.513820287Z" level=info msg="StopPodSandbox for \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\"" Sep 4 17:54:14.069785 containerd[1449]: 2024-09-04 17:54:13.638 [INFO][3954] k8s.go 608: Cleaning up netns ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Sep 4 17:54:14.069785 containerd[1449]: 2024-09-04 17:54:13.639 [INFO][3954] dataplane_linux.go 530: Deleting workload's device in netns. ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" iface="eth0" netns="/var/run/netns/cni-fb6f3788-8429-a5ad-f472-cedca2426059" Sep 4 17:54:14.069785 containerd[1449]: 2024-09-04 17:54:13.639 [INFO][3954] dataplane_linux.go 541: Entered netns, deleting veth. ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" iface="eth0" netns="/var/run/netns/cni-fb6f3788-8429-a5ad-f472-cedca2426059" Sep 4 17:54:14.069785 containerd[1449]: 2024-09-04 17:54:13.644 [INFO][3954] dataplane_linux.go 568: Workload's veth was already gone. Nothing to do. 
ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" iface="eth0" netns="/var/run/netns/cni-fb6f3788-8429-a5ad-f472-cedca2426059" Sep 4 17:54:14.069785 containerd[1449]: 2024-09-04 17:54:13.644 [INFO][3954] k8s.go 615: Releasing IP address(es) ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Sep 4 17:54:14.069785 containerd[1449]: 2024-09-04 17:54:13.644 [INFO][3954] utils.go 188: Calico CNI releasing IP address ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Sep 4 17:54:14.069785 containerd[1449]: 2024-09-04 17:54:14.023 [INFO][3960] ipam_plugin.go 417: Releasing address using handleID ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" HandleID="k8s-pod-network.2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" Sep 4 17:54:14.069785 containerd[1449]: 2024-09-04 17:54:14.025 [INFO][3960] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:14.069785 containerd[1449]: 2024-09-04 17:54:14.026 [INFO][3960] ipam_plugin.go 373: Acquired host-wide IPAM lock. Sep 4 17:54:14.069785 containerd[1449]: 2024-09-04 17:54:14.056 [WARNING][3960] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" HandleID="k8s-pod-network.2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" Sep 4 17:54:14.069785 containerd[1449]: 2024-09-04 17:54:14.056 [INFO][3960] ipam_plugin.go 445: Releasing address using workloadID ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" HandleID="k8s-pod-network.2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" Sep 4 17:54:14.069785 containerd[1449]: 2024-09-04 17:54:14.062 [INFO][3960] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:14.069785 containerd[1449]: 2024-09-04 17:54:14.066 [INFO][3954] k8s.go 621: Teardown processing complete. ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Sep 4 17:54:14.076737 containerd[1449]: time="2024-09-04T17:54:14.071578147Z" level=info msg="TearDown network for sandbox \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\" successfully" Sep 4 17:54:14.076737 containerd[1449]: time="2024-09-04T17:54:14.071631102Z" level=info msg="StopPodSandbox for \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\" returns successfully" Sep 4 17:54:14.076737 containerd[1449]: time="2024-09-04T17:54:14.074029484Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-5dd5756b68-zk8pk,Uid:d75f0bb3-43d7-4901-a7b6-5a7cef236d47,Namespace:kube-system,Attempt:1,}" Sep 4 17:54:14.081635 systemd[1]: run-netns-cni\x2dfb6f3788\x2d8429\x2da5ad\x2df472\x2dcedca2426059.mount: Deactivated successfully. 
Sep 4 17:54:14.302326 systemd-networkd[1348]: cali4eb98d68eda: Link UP Sep 4 17:54:14.302898 systemd-networkd[1348]: cali4eb98d68eda: Gained carrier Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.192 [INFO][3966] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0 coredns-5dd5756b68- kube-system d75f0bb3-43d7-4901-a7b6-5a7cef236d47 725 0 2024-09-04 17:53:38 +0000 UTC map[k8s-app:kube-dns pod-template-hash:5dd5756b68 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4054-1-0-c-33e05803e0.novalocal coredns-5dd5756b68-zk8pk eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4eb98d68eda [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" Namespace="kube-system" Pod="coredns-5dd5756b68-zk8pk" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-" Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.193 [INFO][3966] k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" Namespace="kube-system" Pod="coredns-5dd5756b68-zk8pk" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.236 [INFO][3979] ipam_plugin.go 230: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" HandleID="k8s-pod-network.3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.250 [INFO][3979] ipam_plugin.go 270: Auto assigning IP 
ContainerID="3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" HandleID="k8s-pod-network.3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000265de0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4054-1-0-c-33e05803e0.novalocal", "pod":"coredns-5dd5756b68-zk8pk", "timestamp":"2024-09-04 17:54:14.236887961 +0000 UTC"}, Hostname:"ci-4054-1-0-c-33e05803e0.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.250 [INFO][3979] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.250 [INFO][3979] ipam_plugin.go 373: Acquired host-wide IPAM lock. 
Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.250 [INFO][3979] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4054-1-0-c-33e05803e0.novalocal' Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.254 [INFO][3979] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.264 [INFO][3979] ipam.go 372: Looking up existing affinities for host host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.273 [INFO][3979] ipam.go 489: Trying affinity for 192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.276 [INFO][3979] ipam.go 155: Attempting to load block cidr=192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.282 [INFO][3979] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.282 [INFO][3979] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.13.192/26 handle="k8s-pod-network.3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.285 [INFO][3979] ipam.go 1685: Creating new handle: k8s-pod-network.3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.289 [INFO][3979] ipam.go 1203: Writing block in order to claim IPs block=192.168.13.192/26 handle="k8s-pod-network.3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.295 [INFO][3979] 
ipam.go 1216: Successfully claimed IPs: [192.168.13.193/26] block=192.168.13.192/26 handle="k8s-pod-network.3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.295 [INFO][3979] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.13.193/26] handle="k8s-pod-network.3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.295 [INFO][3979] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:14.322662 containerd[1449]: 2024-09-04 17:54:14.295 [INFO][3979] ipam_plugin.go 288: Calico CNI IPAM assigned addresses IPv4=[192.168.13.193/26] IPv6=[] ContainerID="3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" HandleID="k8s-pod-network.3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" Sep 4 17:54:14.324175 containerd[1449]: 2024-09-04 17:54:14.298 [INFO][3966] k8s.go 386: Populated endpoint ContainerID="3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" Namespace="kube-system" Pod="coredns-5dd5756b68-zk8pk" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0", GenerateName:"coredns-5dd5756b68-", Namespace:"kube-system", SelfLink:"", UID:"d75f0bb3-43d7-4901-a7b6-5a7cef236d47", ResourceVersion:"725", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 53, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"5dd5756b68", 
"projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"", Pod:"coredns-5dd5756b68-zk8pk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.13.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4eb98d68eda", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:14.324175 containerd[1449]: 2024-09-04 17:54:14.299 [INFO][3966] k8s.go 387: Calico CNI using IPs: [192.168.13.193/32] ContainerID="3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" Namespace="kube-system" Pod="coredns-5dd5756b68-zk8pk" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" Sep 4 17:54:14.324175 containerd[1449]: 2024-09-04 17:54:14.299 [INFO][3966] dataplane_linux.go 68: Setting the host side veth name to cali4eb98d68eda ContainerID="3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" Namespace="kube-system" Pod="coredns-5dd5756b68-zk8pk" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" Sep 4 17:54:14.324175 containerd[1449]: 2024-09-04 17:54:14.303 
[INFO][3966] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" Namespace="kube-system" Pod="coredns-5dd5756b68-zk8pk" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" Sep 4 17:54:14.324175 containerd[1449]: 2024-09-04 17:54:14.304 [INFO][3966] k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" Namespace="kube-system" Pod="coredns-5dd5756b68-zk8pk" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0", GenerateName:"coredns-5dd5756b68-", Namespace:"kube-system", SelfLink:"", UID:"d75f0bb3-43d7-4901-a7b6-5a7cef236d47", ResourceVersion:"725", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 53, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"5dd5756b68", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a", Pod:"coredns-5dd5756b68-zk8pk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.13.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali4eb98d68eda", MAC:"86:88:55:43:4a:44", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:14.324175 containerd[1449]: 2024-09-04 17:54:14.318 [INFO][3966] k8s.go 500: Wrote updated endpoint to datastore ContainerID="3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a" Namespace="kube-system" Pod="coredns-5dd5756b68-zk8pk" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" Sep 4 17:54:14.378617 containerd[1449]: time="2024-09-04T17:54:14.378502974Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 4 17:54:14.378617 containerd[1449]: time="2024-09-04T17:54:14.378587336Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 4 17:54:14.378978 containerd[1449]: time="2024-09-04T17:54:14.378601622Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 4 17:54:14.378978 containerd[1449]: time="2024-09-04T17:54:14.378722349Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 4 17:54:14.407130 systemd[1]: Started cri-containerd-3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a.scope - libcontainer container 3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a. 
Sep 4 17:54:14.460528 containerd[1449]: time="2024-09-04T17:54:14.460403413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-5dd5756b68-zk8pk,Uid:d75f0bb3-43d7-4901-a7b6-5a7cef236d47,Namespace:kube-system,Attempt:1,} returns sandbox id \"3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a\"" Sep 4 17:54:14.517097 containerd[1449]: time="2024-09-04T17:54:14.517049504Z" level=info msg="CreateContainer within sandbox \"3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 4 17:54:14.674520 containerd[1449]: time="2024-09-04T17:54:14.674323342Z" level=info msg="CreateContainer within sandbox \"3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a338f9d32f0daa1307f4fc9f3a27ca1b4d8ca15197129144d1e0136eb7eb5e30\"" Sep 4 17:54:14.675374 containerd[1449]: time="2024-09-04T17:54:14.675330662Z" level=info msg="StartContainer for \"a338f9d32f0daa1307f4fc9f3a27ca1b4d8ca15197129144d1e0136eb7eb5e30\"" Sep 4 17:54:14.697039 systemd-networkd[1348]: vxlan.calico: Gained IPv6LL Sep 4 17:54:14.712939 systemd[1]: Started cri-containerd-a338f9d32f0daa1307f4fc9f3a27ca1b4d8ca15197129144d1e0136eb7eb5e30.scope - libcontainer container a338f9d32f0daa1307f4fc9f3a27ca1b4d8ca15197129144d1e0136eb7eb5e30. Sep 4 17:54:14.752345 containerd[1449]: time="2024-09-04T17:54:14.752273195Z" level=info msg="StartContainer for \"a338f9d32f0daa1307f4fc9f3a27ca1b4d8ca15197129144d1e0136eb7eb5e30\" returns successfully" Sep 4 17:54:15.074949 systemd[1]: run-containerd-runc-k8s.io-3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a-runc.u7jzqD.mount: Deactivated successfully. 
Sep 4 17:54:15.481882 containerd[1449]: time="2024-09-04T17:54:15.481254392Z" level=info msg="StopPodSandbox for \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\"" Sep 4 17:54:15.704853 kubelet[2623]: I0904 17:54:15.704253 2623 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-5dd5756b68-zk8pk" podStartSLOduration=37.686891297 podCreationTimestamp="2024-09-04 17:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-09-04 17:54:14.926778981 +0000 UTC m=+51.689425026" watchObservedRunningTime="2024-09-04 17:54:15.686891297 +0000 UTC m=+52.449537452" Sep 4 17:54:15.765429 containerd[1449]: 2024-09-04 17:54:15.681 [INFO][4091] k8s.go 608: Cleaning up netns ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Sep 4 17:54:15.765429 containerd[1449]: 2024-09-04 17:54:15.682 [INFO][4091] dataplane_linux.go 530: Deleting workload's device in netns. ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" iface="eth0" netns="/var/run/netns/cni-4a615fed-594f-e282-81a7-9a2635d296b0" Sep 4 17:54:15.765429 containerd[1449]: 2024-09-04 17:54:15.683 [INFO][4091] dataplane_linux.go 541: Entered netns, deleting veth. ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" iface="eth0" netns="/var/run/netns/cni-4a615fed-594f-e282-81a7-9a2635d296b0" Sep 4 17:54:15.765429 containerd[1449]: 2024-09-04 17:54:15.684 [INFO][4091] dataplane_linux.go 568: Workload's veth was already gone. Nothing to do. 
ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" iface="eth0" netns="/var/run/netns/cni-4a615fed-594f-e282-81a7-9a2635d296b0" Sep 4 17:54:15.765429 containerd[1449]: 2024-09-04 17:54:15.684 [INFO][4091] k8s.go 615: Releasing IP address(es) ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Sep 4 17:54:15.765429 containerd[1449]: 2024-09-04 17:54:15.685 [INFO][4091] utils.go 188: Calico CNI releasing IP address ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Sep 4 17:54:15.765429 containerd[1449]: 2024-09-04 17:54:15.744 [INFO][4097] ipam_plugin.go 417: Releasing address using handleID ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" HandleID="k8s-pod-network.602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" Sep 4 17:54:15.765429 containerd[1449]: 2024-09-04 17:54:15.744 [INFO][4097] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:15.765429 containerd[1449]: 2024-09-04 17:54:15.744 [INFO][4097] ipam_plugin.go 373: Acquired host-wide IPAM lock. Sep 4 17:54:15.765429 containerd[1449]: 2024-09-04 17:54:15.758 [WARNING][4097] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" HandleID="k8s-pod-network.602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" Sep 4 17:54:15.765429 containerd[1449]: 2024-09-04 17:54:15.758 [INFO][4097] ipam_plugin.go 445: Releasing address using workloadID ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" HandleID="k8s-pod-network.602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" Sep 4 17:54:15.765429 containerd[1449]: 2024-09-04 17:54:15.760 [INFO][4097] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:15.765429 containerd[1449]: 2024-09-04 17:54:15.763 [INFO][4091] k8s.go 621: Teardown processing complete. ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Sep 4 17:54:15.770196 containerd[1449]: time="2024-09-04T17:54:15.765709826Z" level=info msg="TearDown network for sandbox \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\" successfully" Sep 4 17:54:15.770196 containerd[1449]: time="2024-09-04T17:54:15.765761019Z" level=info msg="StopPodSandbox for \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\" returns successfully" Sep 4 17:54:15.770196 containerd[1449]: time="2024-09-04T17:54:15.768530847Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8d44d4874-qmf4t,Uid:82b31caf-8658-42af-9e3a-ea4def2ad1f0,Namespace:calico-system,Attempt:1,}" Sep 4 17:54:15.771602 systemd[1]: run-netns-cni\x2d4a615fed\x2d594f\x2de282\x2d81a7\x2d9a2635d296b0.mount: Deactivated successfully. 
Sep 4 17:54:15.977972 systemd-networkd[1348]: cali4eb98d68eda: Gained IPv6LL Sep 4 17:54:16.009037 systemd-networkd[1348]: calief57fd3ec64: Link UP Sep 4 17:54:16.010370 systemd-networkd[1348]: calief57fd3ec64: Gained carrier Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:15.847 [INFO][4108] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0 calico-kube-controllers-8d44d4874- calico-system 82b31caf-8658-42af-9e3a-ea4def2ad1f0 741 0 2024-09-04 17:53:45 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:8d44d4874 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4054-1-0-c-33e05803e0.novalocal calico-kube-controllers-8d44d4874-qmf4t eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calief57fd3ec64 [] []}} ContainerID="fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" Namespace="calico-system" Pod="calico-kube-controllers-8d44d4874-qmf4t" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-" Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:15.847 [INFO][4108] k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" Namespace="calico-system" Pod="calico-kube-controllers-8d44d4874-qmf4t" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:15.897 [INFO][4115] ipam_plugin.go 230: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" 
HandleID="k8s-pod-network.fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:15.915 [INFO][4115] ipam_plugin.go 270: Auto assigning IP ContainerID="fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" HandleID="k8s-pod-network.fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000611600), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4054-1-0-c-33e05803e0.novalocal", "pod":"calico-kube-controllers-8d44d4874-qmf4t", "timestamp":"2024-09-04 17:54:15.897703199 +0000 UTC"}, Hostname:"ci-4054-1-0-c-33e05803e0.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:15.915 [INFO][4115] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:15.915 [INFO][4115] ipam_plugin.go 373: Acquired host-wide IPAM lock. 
Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:15.915 [INFO][4115] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4054-1-0-c-33e05803e0.novalocal' Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:15.919 [INFO][4115] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:15.943 [INFO][4115] ipam.go 372: Looking up existing affinities for host host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:15.958 [INFO][4115] ipam.go 489: Trying affinity for 192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:15.971 [INFO][4115] ipam.go 155: Attempting to load block cidr=192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:15.976 [INFO][4115] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:15.976 [INFO][4115] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.13.192/26 handle="k8s-pod-network.fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:15.982 [INFO][4115] ipam.go 1685: Creating new handle: k8s-pod-network.fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183 Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:15.995 [INFO][4115] ipam.go 1203: Writing block in order to claim IPs block=192.168.13.192/26 handle="k8s-pod-network.fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:16.002 [INFO][4115] 
ipam.go 1216: Successfully claimed IPs: [192.168.13.194/26] block=192.168.13.192/26 handle="k8s-pod-network.fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:16.002 [INFO][4115] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.13.194/26] handle="k8s-pod-network.fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:16.002 [INFO][4115] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:16.032218 containerd[1449]: 2024-09-04 17:54:16.002 [INFO][4115] ipam_plugin.go 288: Calico CNI IPAM assigned addresses IPv4=[192.168.13.194/26] IPv6=[] ContainerID="fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" HandleID="k8s-pod-network.fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" Sep 4 17:54:16.032901 containerd[1449]: 2024-09-04 17:54:16.005 [INFO][4108] k8s.go 386: Populated endpoint ContainerID="fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" Namespace="calico-system" Pod="calico-kube-controllers-8d44d4874-qmf4t" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0", GenerateName:"calico-kube-controllers-8d44d4874-", Namespace:"calico-system", SelfLink:"", UID:"82b31caf-8658-42af-9e3a-ea4def2ad1f0", ResourceVersion:"741", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 53, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8d44d4874", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"", Pod:"calico-kube-controllers-8d44d4874-qmf4t", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.13.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calief57fd3ec64", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:16.032901 containerd[1449]: 2024-09-04 17:54:16.005 [INFO][4108] k8s.go 387: Calico CNI using IPs: [192.168.13.194/32] ContainerID="fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" Namespace="calico-system" Pod="calico-kube-controllers-8d44d4874-qmf4t" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" Sep 4 17:54:16.032901 containerd[1449]: 2024-09-04 17:54:16.005 [INFO][4108] dataplane_linux.go 68: Setting the host side veth name to calief57fd3ec64 ContainerID="fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" Namespace="calico-system" Pod="calico-kube-controllers-8d44d4874-qmf4t" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" Sep 4 17:54:16.032901 containerd[1449]: 2024-09-04 17:54:16.009 [INFO][4108] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" 
Namespace="calico-system" Pod="calico-kube-controllers-8d44d4874-qmf4t" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" Sep 4 17:54:16.032901 containerd[1449]: 2024-09-04 17:54:16.010 [INFO][4108] k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" Namespace="calico-system" Pod="calico-kube-controllers-8d44d4874-qmf4t" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0", GenerateName:"calico-kube-controllers-8d44d4874-", Namespace:"calico-system", SelfLink:"", UID:"82b31caf-8658-42af-9e3a-ea4def2ad1f0", ResourceVersion:"741", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 53, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8d44d4874", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183", Pod:"calico-kube-controllers-8d44d4874-qmf4t", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.13.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calief57fd3ec64", MAC:"32:c6:70:ff:03:21", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:16.032901 containerd[1449]: 2024-09-04 17:54:16.023 [INFO][4108] k8s.go 500: Wrote updated endpoint to datastore ContainerID="fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183" Namespace="calico-system" Pod="calico-kube-controllers-8d44d4874-qmf4t" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" Sep 4 17:54:16.100821 containerd[1449]: time="2024-09-04T17:54:16.099361818Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 4 17:54:16.100821 containerd[1449]: time="2024-09-04T17:54:16.099429009Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 4 17:54:16.100821 containerd[1449]: time="2024-09-04T17:54:16.099457971Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 4 17:54:16.100821 containerd[1449]: time="2024-09-04T17:54:16.099555719Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 4 17:54:16.141049 systemd[1]: Started cri-containerd-fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183.scope - libcontainer container fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183. 
Sep 4 17:54:16.193419 containerd[1449]: time="2024-09-04T17:54:16.193361464Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8d44d4874-qmf4t,Uid:82b31caf-8658-42af-9e3a-ea4def2ad1f0,Namespace:calico-system,Attempt:1,} returns sandbox id \"fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183\"" Sep 4 17:54:16.204335 containerd[1449]: time="2024-09-04T17:54:16.204037976Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.28.1\"" Sep 4 17:54:17.387236 systemd-networkd[1348]: calief57fd3ec64: Gained IPv6LL Sep 4 17:54:17.481454 containerd[1449]: time="2024-09-04T17:54:17.481135489Z" level=info msg="StopPodSandbox for \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\"" Sep 4 17:54:17.598506 containerd[1449]: 2024-09-04 17:54:17.538 [INFO][4191] k8s.go 608: Cleaning up netns ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Sep 4 17:54:17.598506 containerd[1449]: 2024-09-04 17:54:17.538 [INFO][4191] dataplane_linux.go 530: Deleting workload's device in netns. ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" iface="eth0" netns="/var/run/netns/cni-9287c8e7-3a90-c1ad-c04e-f2c46c406d69" Sep 4 17:54:17.598506 containerd[1449]: 2024-09-04 17:54:17.539 [INFO][4191] dataplane_linux.go 541: Entered netns, deleting veth. ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" iface="eth0" netns="/var/run/netns/cni-9287c8e7-3a90-c1ad-c04e-f2c46c406d69" Sep 4 17:54:17.598506 containerd[1449]: 2024-09-04 17:54:17.539 [INFO][4191] dataplane_linux.go 568: Workload's veth was already gone. Nothing to do. 
ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" iface="eth0" netns="/var/run/netns/cni-9287c8e7-3a90-c1ad-c04e-f2c46c406d69" Sep 4 17:54:17.598506 containerd[1449]: 2024-09-04 17:54:17.539 [INFO][4191] k8s.go 615: Releasing IP address(es) ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Sep 4 17:54:17.598506 containerd[1449]: 2024-09-04 17:54:17.539 [INFO][4191] utils.go 188: Calico CNI releasing IP address ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Sep 4 17:54:17.598506 containerd[1449]: 2024-09-04 17:54:17.580 [INFO][4198] ipam_plugin.go 417: Releasing address using handleID ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" HandleID="k8s-pod-network.cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" Sep 4 17:54:17.598506 containerd[1449]: 2024-09-04 17:54:17.580 [INFO][4198] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:17.598506 containerd[1449]: 2024-09-04 17:54:17.581 [INFO][4198] ipam_plugin.go 373: Acquired host-wide IPAM lock. Sep 4 17:54:17.598506 containerd[1449]: 2024-09-04 17:54:17.591 [WARNING][4198] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" HandleID="k8s-pod-network.cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" Sep 4 17:54:17.598506 containerd[1449]: 2024-09-04 17:54:17.591 [INFO][4198] ipam_plugin.go 445: Releasing address using workloadID ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" HandleID="k8s-pod-network.cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" Sep 4 17:54:17.598506 containerd[1449]: 2024-09-04 17:54:17.593 [INFO][4198] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:17.598506 containerd[1449]: 2024-09-04 17:54:17.594 [INFO][4191] k8s.go 621: Teardown processing complete. ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Sep 4 17:54:17.602293 containerd[1449]: time="2024-09-04T17:54:17.599913331Z" level=info msg="TearDown network for sandbox \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\" successfully" Sep 4 17:54:17.602293 containerd[1449]: time="2024-09-04T17:54:17.599949227Z" level=info msg="StopPodSandbox for \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\" returns successfully" Sep 4 17:54:17.604784 containerd[1449]: time="2024-09-04T17:54:17.602717451Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-5dd5756b68-4dlsd,Uid:8ad7d24b-d7a2-481f-abff-0aa847f03938,Namespace:kube-system,Attempt:1,}" Sep 4 17:54:17.603230 systemd[1]: run-netns-cni\x2d9287c8e7\x2d3a90\x2dc1ad\x2dc04e\x2df2c46c406d69.mount: Deactivated successfully. 
Sep 4 17:54:17.787545 systemd-networkd[1348]: cali9dcacce7f7a: Link UP Sep 4 17:54:17.787732 systemd-networkd[1348]: cali9dcacce7f7a: Gained carrier Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.687 [INFO][4205] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0 coredns-5dd5756b68- kube-system 8ad7d24b-d7a2-481f-abff-0aa847f03938 754 0 2024-09-04 17:53:38 +0000 UTC map[k8s-app:kube-dns pod-template-hash:5dd5756b68 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4054-1-0-c-33e05803e0.novalocal coredns-5dd5756b68-4dlsd eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9dcacce7f7a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" Namespace="kube-system" Pod="coredns-5dd5756b68-4dlsd" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-" Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.687 [INFO][4205] k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" Namespace="kube-system" Pod="coredns-5dd5756b68-4dlsd" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.731 [INFO][4215] ipam_plugin.go 230: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" HandleID="k8s-pod-network.3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.742 [INFO][4215] ipam_plugin.go 270: Auto assigning IP 
ContainerID="3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" HandleID="k8s-pod-network.3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000265150), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4054-1-0-c-33e05803e0.novalocal", "pod":"coredns-5dd5756b68-4dlsd", "timestamp":"2024-09-04 17:54:17.731048858 +0000 UTC"}, Hostname:"ci-4054-1-0-c-33e05803e0.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.742 [INFO][4215] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.742 [INFO][4215] ipam_plugin.go 373: Acquired host-wide IPAM lock. 
Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.742 [INFO][4215] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4054-1-0-c-33e05803e0.novalocal' Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.745 [INFO][4215] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.750 [INFO][4215] ipam.go 372: Looking up existing affinities for host host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.755 [INFO][4215] ipam.go 489: Trying affinity for 192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.757 [INFO][4215] ipam.go 155: Attempting to load block cidr=192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.759 [INFO][4215] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.759 [INFO][4215] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.13.192/26 handle="k8s-pod-network.3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.761 [INFO][4215] ipam.go 1685: Creating new handle: k8s-pod-network.3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.769 [INFO][4215] ipam.go 1203: Writing block in order to claim IPs block=192.168.13.192/26 handle="k8s-pod-network.3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.778 [INFO][4215] 
ipam.go 1216: Successfully claimed IPs: [192.168.13.195/26] block=192.168.13.192/26 handle="k8s-pod-network.3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.778 [INFO][4215] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.13.195/26] handle="k8s-pod-network.3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.778 [INFO][4215] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:17.807744 containerd[1449]: 2024-09-04 17:54:17.778 [INFO][4215] ipam_plugin.go 288: Calico CNI IPAM assigned addresses IPv4=[192.168.13.195/26] IPv6=[] ContainerID="3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" HandleID="k8s-pod-network.3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" Sep 4 17:54:17.809873 containerd[1449]: 2024-09-04 17:54:17.781 [INFO][4205] k8s.go 386: Populated endpoint ContainerID="3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" Namespace="kube-system" Pod="coredns-5dd5756b68-4dlsd" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0", GenerateName:"coredns-5dd5756b68-", Namespace:"kube-system", SelfLink:"", UID:"8ad7d24b-d7a2-481f-abff-0aa847f03938", ResourceVersion:"754", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 53, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"5dd5756b68", 
"projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"", Pod:"coredns-5dd5756b68-4dlsd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.13.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9dcacce7f7a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:17.809873 containerd[1449]: 2024-09-04 17:54:17.782 [INFO][4205] k8s.go 387: Calico CNI using IPs: [192.168.13.195/32] ContainerID="3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" Namespace="kube-system" Pod="coredns-5dd5756b68-4dlsd" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" Sep 4 17:54:17.809873 containerd[1449]: 2024-09-04 17:54:17.782 [INFO][4205] dataplane_linux.go 68: Setting the host side veth name to cali9dcacce7f7a ContainerID="3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" Namespace="kube-system" Pod="coredns-5dd5756b68-4dlsd" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" Sep 4 17:54:17.809873 containerd[1449]: 2024-09-04 17:54:17.785 
[INFO][4205] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" Namespace="kube-system" Pod="coredns-5dd5756b68-4dlsd" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" Sep 4 17:54:17.809873 containerd[1449]: 2024-09-04 17:54:17.785 [INFO][4205] k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" Namespace="kube-system" Pod="coredns-5dd5756b68-4dlsd" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0", GenerateName:"coredns-5dd5756b68-", Namespace:"kube-system", SelfLink:"", UID:"8ad7d24b-d7a2-481f-abff-0aa847f03938", ResourceVersion:"754", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 53, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"5dd5756b68", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e", Pod:"coredns-5dd5756b68-4dlsd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.13.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali9dcacce7f7a", MAC:"b2:56:46:01:3d:83", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:17.809873 containerd[1449]: 2024-09-04 17:54:17.800 [INFO][4205] k8s.go 500: Wrote updated endpoint to datastore ContainerID="3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e" Namespace="kube-system" Pod="coredns-5dd5756b68-4dlsd" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" Sep 4 17:54:17.876958 containerd[1449]: time="2024-09-04T17:54:17.876107966Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 4 17:54:17.876958 containerd[1449]: time="2024-09-04T17:54:17.876167253Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 4 17:54:17.876958 containerd[1449]: time="2024-09-04T17:54:17.876198359Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 4 17:54:17.876958 containerd[1449]: time="2024-09-04T17:54:17.876290636Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 4 17:54:17.925094 systemd[1]: Started cri-containerd-3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e.scope - libcontainer container 3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e. 
Sep 4 17:54:17.987573 containerd[1449]: time="2024-09-04T17:54:17.987492205Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-5dd5756b68-4dlsd,Uid:8ad7d24b-d7a2-481f-abff-0aa847f03938,Namespace:kube-system,Attempt:1,} returns sandbox id \"3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e\"" Sep 4 17:54:17.992580 containerd[1449]: time="2024-09-04T17:54:17.992153770Z" level=info msg="CreateContainer within sandbox \"3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 4 17:54:18.052182 containerd[1449]: time="2024-09-04T17:54:18.051471807Z" level=info msg="CreateContainer within sandbox \"3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ac27e0156f24538955cd4e2fafaf0b3add35134d145dc6521cb64b4d48dfb8c7\"" Sep 4 17:54:18.054374 containerd[1449]: time="2024-09-04T17:54:18.053212603Z" level=info msg="StartContainer for \"ac27e0156f24538955cd4e2fafaf0b3add35134d145dc6521cb64b4d48dfb8c7\"" Sep 4 17:54:18.110953 systemd[1]: Started cri-containerd-ac27e0156f24538955cd4e2fafaf0b3add35134d145dc6521cb64b4d48dfb8c7.scope - libcontainer container ac27e0156f24538955cd4e2fafaf0b3add35134d145dc6521cb64b4d48dfb8c7. 
Sep 4 17:54:18.168514 containerd[1449]: time="2024-09-04T17:54:18.168270750Z" level=info msg="StartContainer for \"ac27e0156f24538955cd4e2fafaf0b3add35134d145dc6521cb64b4d48dfb8c7\" returns successfully" Sep 4 17:54:18.480920 containerd[1449]: time="2024-09-04T17:54:18.480828696Z" level=info msg="StopPodSandbox for \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\"" Sep 4 17:54:18.587438 containerd[1449]: 2024-09-04 17:54:18.543 [INFO][4334] k8s.go 608: Cleaning up netns ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Sep 4 17:54:18.587438 containerd[1449]: 2024-09-04 17:54:18.543 [INFO][4334] dataplane_linux.go 530: Deleting workload's device in netns. ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" iface="eth0" netns="/var/run/netns/cni-3e57bec4-fb71-5898-6e4d-1abfed3d6044" Sep 4 17:54:18.587438 containerd[1449]: 2024-09-04 17:54:18.543 [INFO][4334] dataplane_linux.go 541: Entered netns, deleting veth. ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" iface="eth0" netns="/var/run/netns/cni-3e57bec4-fb71-5898-6e4d-1abfed3d6044" Sep 4 17:54:18.587438 containerd[1449]: 2024-09-04 17:54:18.544 [INFO][4334] dataplane_linux.go 568: Workload's veth was already gone. Nothing to do. 
ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" iface="eth0" netns="/var/run/netns/cni-3e57bec4-fb71-5898-6e4d-1abfed3d6044" Sep 4 17:54:18.587438 containerd[1449]: 2024-09-04 17:54:18.544 [INFO][4334] k8s.go 615: Releasing IP address(es) ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Sep 4 17:54:18.587438 containerd[1449]: 2024-09-04 17:54:18.544 [INFO][4334] utils.go 188: Calico CNI releasing IP address ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Sep 4 17:54:18.587438 containerd[1449]: 2024-09-04 17:54:18.570 [INFO][4341] ipam_plugin.go 417: Releasing address using handleID ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" HandleID="k8s-pod-network.7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" Sep 4 17:54:18.587438 containerd[1449]: 2024-09-04 17:54:18.570 [INFO][4341] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:18.587438 containerd[1449]: 2024-09-04 17:54:18.570 [INFO][4341] ipam_plugin.go 373: Acquired host-wide IPAM lock. Sep 4 17:54:18.587438 containerd[1449]: 2024-09-04 17:54:18.578 [WARNING][4341] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" HandleID="k8s-pod-network.7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" Sep 4 17:54:18.587438 containerd[1449]: 2024-09-04 17:54:18.578 [INFO][4341] ipam_plugin.go 445: Releasing address using workloadID ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" HandleID="k8s-pod-network.7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" Sep 4 17:54:18.587438 containerd[1449]: 2024-09-04 17:54:18.581 [INFO][4341] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:18.587438 containerd[1449]: 2024-09-04 17:54:18.584 [INFO][4334] k8s.go 621: Teardown processing complete. ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Sep 4 17:54:18.588940 containerd[1449]: time="2024-09-04T17:54:18.587907005Z" level=info msg="TearDown network for sandbox \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\" successfully" Sep 4 17:54:18.588940 containerd[1449]: time="2024-09-04T17:54:18.587940836Z" level=info msg="StopPodSandbox for \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\" returns successfully" Sep 4 17:54:18.589312 containerd[1449]: time="2024-09-04T17:54:18.589288719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-w4vzr,Uid:5658969e-2b8a-4734-8694-fff3696c8a14,Namespace:calico-system,Attempt:1,}" Sep 4 17:54:18.608322 systemd[1]: run-containerd-runc-k8s.io-3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e-runc.Pp35ob.mount: Deactivated successfully. Sep 4 17:54:18.608448 systemd[1]: run-netns-cni\x2d3e57bec4\x2dfb71\x2d5898\x2d6e4d\x2d1abfed3d6044.mount: Deactivated successfully. 
Sep 4 17:54:18.778897 systemd-networkd[1348]: cali827b29a1f6d: Link UP Sep 4 17:54:18.779143 systemd-networkd[1348]: cali827b29a1f6d: Gained carrier Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.674 [INFO][4348] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0 csi-node-driver- calico-system 5658969e-2b8a-4734-8694-fff3696c8a14 763 0 2024-09-04 17:53:45 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:78cd84fb8c k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s ci-4054-1-0-c-33e05803e0.novalocal csi-node-driver-w4vzr eth0 default [] [] [kns.calico-system ksa.calico-system.default] cali827b29a1f6d [] []}} ContainerID="379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" Namespace="calico-system" Pod="csi-node-driver-w4vzr" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-" Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.674 [INFO][4348] k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" Namespace="calico-system" Pod="csi-node-driver-w4vzr" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.725 [INFO][4358] ipam_plugin.go 230: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" HandleID="k8s-pod-network.379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.738 [INFO][4358] ipam_plugin.go 270: 
Auto assigning IP ContainerID="379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" HandleID="k8s-pod-network.379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001fd860), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4054-1-0-c-33e05803e0.novalocal", "pod":"csi-node-driver-w4vzr", "timestamp":"2024-09-04 17:54:18.725253499 +0000 UTC"}, Hostname:"ci-4054-1-0-c-33e05803e0.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.739 [INFO][4358] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.739 [INFO][4358] ipam_plugin.go 373: Acquired host-wide IPAM lock. 
Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.739 [INFO][4358] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4054-1-0-c-33e05803e0.novalocal' Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.741 [INFO][4358] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.746 [INFO][4358] ipam.go 372: Looking up existing affinities for host host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.751 [INFO][4358] ipam.go 489: Trying affinity for 192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.753 [INFO][4358] ipam.go 155: Attempting to load block cidr=192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.755 [INFO][4358] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.756 [INFO][4358] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.13.192/26 handle="k8s-pod-network.379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.757 [INFO][4358] ipam.go 1685: Creating new handle: k8s-pod-network.379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.763 [INFO][4358] ipam.go 1203: Writing block in order to claim IPs block=192.168.13.192/26 handle="k8s-pod-network.379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.769 [INFO][4358] 
ipam.go 1216: Successfully claimed IPs: [192.168.13.196/26] block=192.168.13.192/26 handle="k8s-pod-network.379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.770 [INFO][4358] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.13.196/26] handle="k8s-pod-network.379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.770 [INFO][4358] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:18.811096 containerd[1449]: 2024-09-04 17:54:18.770 [INFO][4358] ipam_plugin.go 288: Calico CNI IPAM assigned addresses IPv4=[192.168.13.196/26] IPv6=[] ContainerID="379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" HandleID="k8s-pod-network.379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" Sep 4 17:54:18.812494 containerd[1449]: 2024-09-04 17:54:18.772 [INFO][4348] k8s.go 386: Populated endpoint ContainerID="379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" Namespace="calico-system" Pod="csi-node-driver-w4vzr" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"5658969e-2b8a-4734-8694-fff3696c8a14", ResourceVersion:"763", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 53, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78cd84fb8c", 
"k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"", Pod:"csi-node-driver-w4vzr", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.13.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.default"}, InterfaceName:"cali827b29a1f6d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:18.812494 containerd[1449]: 2024-09-04 17:54:18.773 [INFO][4348] k8s.go 387: Calico CNI using IPs: [192.168.13.196/32] ContainerID="379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" Namespace="calico-system" Pod="csi-node-driver-w4vzr" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" Sep 4 17:54:18.812494 containerd[1449]: 2024-09-04 17:54:18.773 [INFO][4348] dataplane_linux.go 68: Setting the host side veth name to cali827b29a1f6d ContainerID="379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" Namespace="calico-system" Pod="csi-node-driver-w4vzr" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" Sep 4 17:54:18.812494 containerd[1449]: 2024-09-04 17:54:18.783 [INFO][4348] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" Namespace="calico-system" Pod="csi-node-driver-w4vzr" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" Sep 4 17:54:18.812494 containerd[1449]: 2024-09-04 
17:54:18.783 [INFO][4348] k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" Namespace="calico-system" Pod="csi-node-driver-w4vzr" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"5658969e-2b8a-4734-8694-fff3696c8a14", ResourceVersion:"763", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 53, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78cd84fb8c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad", Pod:"csi-node-driver-w4vzr", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.13.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.default"}, InterfaceName:"cali827b29a1f6d", MAC:"56:a0:e2:d8:9b:f9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:18.812494 containerd[1449]: 2024-09-04 17:54:18.805 [INFO][4348] k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad" Namespace="calico-system" Pod="csi-node-driver-w4vzr" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" Sep 4 17:54:18.869762 containerd[1449]: time="2024-09-04T17:54:18.867784312Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 4 17:54:18.869762 containerd[1449]: time="2024-09-04T17:54:18.868535013Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 4 17:54:18.869762 containerd[1449]: time="2024-09-04T17:54:18.868578893Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 4 17:54:18.869762 containerd[1449]: time="2024-09-04T17:54:18.868834296Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 4 17:54:18.918971 systemd[1]: Started cri-containerd-379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad.scope - libcontainer container 379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad. 
Sep 4 17:54:18.979502 containerd[1449]: time="2024-09-04T17:54:18.979355085Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-w4vzr,Uid:5658969e-2b8a-4734-8694-fff3696c8a14,Namespace:calico-system,Attempt:1,} returns sandbox id \"379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad\"" Sep 4 17:54:19.015262 kubelet[2623]: I0904 17:54:19.015225 2623 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-5dd5756b68-4dlsd" podStartSLOduration=41.015179967 podCreationTimestamp="2024-09-04 17:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-09-04 17:54:18.992946046 +0000 UTC m=+55.755592131" watchObservedRunningTime="2024-09-04 17:54:19.015179967 +0000 UTC m=+55.777826012" Sep 4 17:54:19.048975 systemd-networkd[1348]: cali9dcacce7f7a: Gained IPv6LL Sep 4 17:54:20.457165 systemd-networkd[1348]: cali827b29a1f6d: Gained IPv6LL Sep 4 17:54:20.737928 containerd[1449]: time="2024-09-04T17:54:20.736925919Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:54:20.739328 containerd[1449]: time="2024-09-04T17:54:20.739292827Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.28.1: active requests=0, bytes read=33507125" Sep 4 17:54:20.740664 containerd[1449]: time="2024-09-04T17:54:20.740619446Z" level=info msg="ImageCreate event name:\"sha256:9d19dff735fa0889ad6e741790dd1ff35dc4443f14c95bd61459ff0b9162252e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:54:20.744262 containerd[1449]: time="2024-09-04T17:54:20.744221397Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:9a7338f7187d4d2352fe49eedee44b191ac92557a2e71aa3de3527ed85c1641b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 
17:54:20.745488 containerd[1449]: time="2024-09-04T17:54:20.745464054Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.28.1\" with image id \"sha256:9d19dff735fa0889ad6e741790dd1ff35dc4443f14c95bd61459ff0b9162252e\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:9a7338f7187d4d2352fe49eedee44b191ac92557a2e71aa3de3527ed85c1641b\", size \"34999494\" in 4.541351483s" Sep 4 17:54:20.745649 containerd[1449]: time="2024-09-04T17:54:20.745558436Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.28.1\" returns image reference \"sha256:9d19dff735fa0889ad6e741790dd1ff35dc4443f14c95bd61459ff0b9162252e\"" Sep 4 17:54:20.746171 containerd[1449]: time="2024-09-04T17:54:20.746152975Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.28.1\"" Sep 4 17:54:20.774147 containerd[1449]: time="2024-09-04T17:54:20.773894313Z" level=info msg="CreateContainer within sandbox \"fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 4 17:54:20.812415 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount810379542.mount: Deactivated successfully. 
Sep 4 17:54:20.814563 containerd[1449]: time="2024-09-04T17:54:20.814507696Z" level=info msg="CreateContainer within sandbox \"fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"4c28a73d12bfea09e75fa62342c42c61bd470cd82148661bdd413cfbec4f44f1\"" Sep 4 17:54:20.816653 containerd[1449]: time="2024-09-04T17:54:20.815275350Z" level=info msg="StartContainer for \"4c28a73d12bfea09e75fa62342c42c61bd470cd82148661bdd413cfbec4f44f1\"" Sep 4 17:54:21.032995 systemd[1]: Started cri-containerd-4c28a73d12bfea09e75fa62342c42c61bd470cd82148661bdd413cfbec4f44f1.scope - libcontainer container 4c28a73d12bfea09e75fa62342c42c61bd470cd82148661bdd413cfbec4f44f1. Sep 4 17:54:21.255942 containerd[1449]: time="2024-09-04T17:54:21.255719523Z" level=info msg="StartContainer for \"4c28a73d12bfea09e75fa62342c42c61bd470cd82148661bdd413cfbec4f44f1\" returns successfully" Sep 4 17:54:22.130494 kubelet[2623]: I0904 17:54:22.130387 2623 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-8d44d4874-qmf4t" podStartSLOduration=32.580032337 podCreationTimestamp="2024-09-04 17:53:45 +0000 UTC" firstStartedPulling="2024-09-04 17:54:16.19563583 +0000 UTC m=+52.958281895" lastFinishedPulling="2024-09-04 17:54:20.745930532 +0000 UTC m=+57.508576586" observedRunningTime="2024-09-04 17:54:22.037543182 +0000 UTC m=+58.800189278" watchObservedRunningTime="2024-09-04 17:54:22.130327028 +0000 UTC m=+58.892973114" Sep 4 17:54:23.680003 containerd[1449]: time="2024-09-04T17:54:23.679703441Z" level=info msg="StopPodSandbox for \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\"" Sep 4 17:54:23.717481 containerd[1449]: time="2024-09-04T17:54:23.717324798Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:54:23.718725 containerd[1449]: 
time="2024-09-04T17:54:23.718533207Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.28.1: active requests=0, bytes read=7642081" Sep 4 17:54:23.720623 containerd[1449]: time="2024-09-04T17:54:23.720585322Z" level=info msg="ImageCreate event name:\"sha256:d0c7782dfd1af19483b1da01b3d6692a92c2a570a3c8c6059128fda84c838a61\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:54:23.729963 containerd[1449]: time="2024-09-04T17:54:23.729841921Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:01e16d03dd0c29a8e1e302455eb15c2d0326c49cbaca4bbe8dc0e2d5308c5add\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:54:23.730807 containerd[1449]: time="2024-09-04T17:54:23.730501701Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.28.1\" with image id \"sha256:d0c7782dfd1af19483b1da01b3d6692a92c2a570a3c8c6059128fda84c838a61\", repo tag \"ghcr.io/flatcar/calico/csi:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:01e16d03dd0c29a8e1e302455eb15c2d0326c49cbaca4bbe8dc0e2d5308c5add\", size \"9134482\" in 2.984237324s" Sep 4 17:54:23.730807 containerd[1449]: time="2024-09-04T17:54:23.730541233Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.28.1\" returns image reference \"sha256:d0c7782dfd1af19483b1da01b3d6692a92c2a570a3c8c6059128fda84c838a61\"" Sep 4 17:54:23.736377 containerd[1449]: time="2024-09-04T17:54:23.736218947Z" level=info msg="CreateContainer within sandbox \"379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 4 17:54:23.787680 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3265723845.mount: Deactivated successfully. 
Sep 4 17:54:23.798771 containerd[1449]: time="2024-09-04T17:54:23.798709791Z" level=info msg="CreateContainer within sandbox \"379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"fd531c7e5e15bf0ec9499b05acb86e869166036b62c32cb642bd039acac3f65b\"" Sep 4 17:54:23.800463 containerd[1449]: time="2024-09-04T17:54:23.799523722Z" level=info msg="StartContainer for \"fd531c7e5e15bf0ec9499b05acb86e869166036b62c32cb642bd039acac3f65b\"" Sep 4 17:54:23.808875 containerd[1449]: 2024-09-04 17:54:23.749 [WARNING][4518] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0", GenerateName:"coredns-5dd5756b68-", Namespace:"kube-system", SelfLink:"", UID:"d75f0bb3-43d7-4901-a7b6-5a7cef236d47", ResourceVersion:"742", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 53, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"5dd5756b68", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a", Pod:"coredns-5dd5756b68-zk8pk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.13.193/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4eb98d68eda", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:23.808875 containerd[1449]: 2024-09-04 17:54:23.750 [INFO][4518] k8s.go 608: Cleaning up netns ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Sep 4 17:54:23.808875 containerd[1449]: 2024-09-04 17:54:23.750 [INFO][4518] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" iface="eth0" netns="" Sep 4 17:54:23.808875 containerd[1449]: 2024-09-04 17:54:23.750 [INFO][4518] k8s.go 615: Releasing IP address(es) ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Sep 4 17:54:23.808875 containerd[1449]: 2024-09-04 17:54:23.750 [INFO][4518] utils.go 188: Calico CNI releasing IP address ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Sep 4 17:54:23.808875 containerd[1449]: 2024-09-04 17:54:23.784 [INFO][4524] ipam_plugin.go 417: Releasing address using handleID ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" HandleID="k8s-pod-network.2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" Sep 4 17:54:23.808875 containerd[1449]: 2024-09-04 17:54:23.784 [INFO][4524] ipam_plugin.go 358: About to acquire host-wide IPAM lock. 
Sep 4 17:54:23.808875 containerd[1449]: 2024-09-04 17:54:23.784 [INFO][4524] ipam_plugin.go 373: Acquired host-wide IPAM lock. Sep 4 17:54:23.808875 containerd[1449]: 2024-09-04 17:54:23.797 [WARNING][4524] ipam_plugin.go 434: Asked to release address but it doesn't exist. Ignoring ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" HandleID="k8s-pod-network.2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" Sep 4 17:54:23.808875 containerd[1449]: 2024-09-04 17:54:23.797 [INFO][4524] ipam_plugin.go 445: Releasing address using workloadID ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" HandleID="k8s-pod-network.2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" Sep 4 17:54:23.808875 containerd[1449]: 2024-09-04 17:54:23.800 [INFO][4524] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:23.808875 containerd[1449]: 2024-09-04 17:54:23.803 [INFO][4518] k8s.go 621: Teardown processing complete. 
ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Sep 4 17:54:23.811788 containerd[1449]: time="2024-09-04T17:54:23.808945321Z" level=info msg="TearDown network for sandbox \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\" successfully" Sep 4 17:54:23.811788 containerd[1449]: time="2024-09-04T17:54:23.808969835Z" level=info msg="StopPodSandbox for \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\" returns successfully" Sep 4 17:54:23.812340 containerd[1449]: time="2024-09-04T17:54:23.812313732Z" level=info msg="RemovePodSandbox for \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\"" Sep 4 17:54:23.812409 containerd[1449]: time="2024-09-04T17:54:23.812346692Z" level=info msg="Forcibly stopping sandbox \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\"" Sep 4 17:54:23.851996 systemd[1]: Started cri-containerd-fd531c7e5e15bf0ec9499b05acb86e869166036b62c32cb642bd039acac3f65b.scope - libcontainer container fd531c7e5e15bf0ec9499b05acb86e869166036b62c32cb642bd039acac3f65b. Sep 4 17:54:23.908391 containerd[1449]: time="2024-09-04T17:54:23.908260330Z" level=info msg="StartContainer for \"fd531c7e5e15bf0ec9499b05acb86e869166036b62c32cb642bd039acac3f65b\" returns successfully" Sep 4 17:54:23.913340 containerd[1449]: time="2024-09-04T17:54:23.912890187Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.28.1\"" Sep 4 17:54:23.941950 containerd[1449]: 2024-09-04 17:54:23.885 [WARNING][4558] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0", GenerateName:"coredns-5dd5756b68-", Namespace:"kube-system", SelfLink:"", UID:"d75f0bb3-43d7-4901-a7b6-5a7cef236d47", ResourceVersion:"742", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 53, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"5dd5756b68", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"3b4ecd181c417864f42566a82a05c1675d4e10dff51c23d306bf14565369552a", Pod:"coredns-5dd5756b68-zk8pk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.13.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4eb98d68eda", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:23.941950 containerd[1449]: 2024-09-04 17:54:23.885 
[INFO][4558] k8s.go 608: Cleaning up netns ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Sep 4 17:54:23.941950 containerd[1449]: 2024-09-04 17:54:23.886 [INFO][4558] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" iface="eth0" netns="" Sep 4 17:54:23.941950 containerd[1449]: 2024-09-04 17:54:23.886 [INFO][4558] k8s.go 615: Releasing IP address(es) ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Sep 4 17:54:23.941950 containerd[1449]: 2024-09-04 17:54:23.886 [INFO][4558] utils.go 188: Calico CNI releasing IP address ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Sep 4 17:54:23.941950 containerd[1449]: 2024-09-04 17:54:23.923 [INFO][4579] ipam_plugin.go 417: Releasing address using handleID ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" HandleID="k8s-pod-network.2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" Sep 4 17:54:23.941950 containerd[1449]: 2024-09-04 17:54:23.923 [INFO][4579] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:23.941950 containerd[1449]: 2024-09-04 17:54:23.923 [INFO][4579] ipam_plugin.go 373: Acquired host-wide IPAM lock. Sep 4 17:54:23.941950 containerd[1449]: 2024-09-04 17:54:23.933 [WARNING][4579] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" HandleID="k8s-pod-network.2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" Sep 4 17:54:23.941950 containerd[1449]: 2024-09-04 17:54:23.933 [INFO][4579] ipam_plugin.go 445: Releasing address using workloadID ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" HandleID="k8s-pod-network.2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--zk8pk-eth0" Sep 4 17:54:23.941950 containerd[1449]: 2024-09-04 17:54:23.936 [INFO][4579] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:23.941950 containerd[1449]: 2024-09-04 17:54:23.938 [INFO][4558] k8s.go 621: Teardown processing complete. ContainerID="2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d" Sep 4 17:54:23.941950 containerd[1449]: time="2024-09-04T17:54:23.941140567Z" level=info msg="TearDown network for sandbox \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\" successfully" Sep 4 17:54:23.968406 containerd[1449]: time="2024-09-04T17:54:23.968370780Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 4 17:54:23.968715 containerd[1449]: time="2024-09-04T17:54:23.968543845Z" level=info msg="RemovePodSandbox \"2155b738e168e70cbe3fa297824e9867b5934ea483cabbabd20af4c52f94710d\" returns successfully" Sep 4 17:54:23.969441 containerd[1449]: time="2024-09-04T17:54:23.969173591Z" level=info msg="StopPodSandbox for \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\"" Sep 4 17:54:24.079685 containerd[1449]: 2024-09-04 17:54:24.035 [WARNING][4606] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0", GenerateName:"coredns-5dd5756b68-", Namespace:"kube-system", SelfLink:"", UID:"8ad7d24b-d7a2-481f-abff-0aa847f03938", ResourceVersion:"772", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 53, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"5dd5756b68", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e", Pod:"coredns-5dd5756b68-4dlsd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.13.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9dcacce7f7a", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:24.079685 containerd[1449]: 2024-09-04 17:54:24.035 [INFO][4606] k8s.go 608: Cleaning up netns ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Sep 4 17:54:24.079685 containerd[1449]: 2024-09-04 17:54:24.035 [INFO][4606] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" iface="eth0" netns="" Sep 4 17:54:24.079685 containerd[1449]: 2024-09-04 17:54:24.035 [INFO][4606] k8s.go 615: Releasing IP address(es) ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Sep 4 17:54:24.079685 containerd[1449]: 2024-09-04 17:54:24.035 [INFO][4606] utils.go 188: Calico CNI releasing IP address ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Sep 4 17:54:24.079685 containerd[1449]: 2024-09-04 17:54:24.062 [INFO][4613] ipam_plugin.go 417: Releasing address using handleID ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" HandleID="k8s-pod-network.cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" Sep 4 17:54:24.079685 containerd[1449]: 2024-09-04 17:54:24.062 [INFO][4613] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:24.079685 containerd[1449]: 2024-09-04 17:54:24.062 [INFO][4613] ipam_plugin.go 373: Acquired host-wide IPAM lock. 
Sep 4 17:54:24.079685 containerd[1449]: 2024-09-04 17:54:24.071 [WARNING][4613] ipam_plugin.go 434: Asked to release address but it doesn't exist. Ignoring ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" HandleID="k8s-pod-network.cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" Sep 4 17:54:24.079685 containerd[1449]: 2024-09-04 17:54:24.071 [INFO][4613] ipam_plugin.go 445: Releasing address using workloadID ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" HandleID="k8s-pod-network.cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" Sep 4 17:54:24.079685 containerd[1449]: 2024-09-04 17:54:24.077 [INFO][4613] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:24.079685 containerd[1449]: 2024-09-04 17:54:24.078 [INFO][4606] k8s.go 621: Teardown processing complete. 
ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Sep 4 17:54:24.081213 containerd[1449]: time="2024-09-04T17:54:24.080004475Z" level=info msg="TearDown network for sandbox \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\" successfully" Sep 4 17:54:24.081213 containerd[1449]: time="2024-09-04T17:54:24.080028818Z" level=info msg="StopPodSandbox for \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\" returns successfully" Sep 4 17:54:24.081213 containerd[1449]: time="2024-09-04T17:54:24.080630525Z" level=info msg="RemovePodSandbox for \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\"" Sep 4 17:54:24.081213 containerd[1449]: time="2024-09-04T17:54:24.080664797Z" level=info msg="Forcibly stopping sandbox \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\"" Sep 4 17:54:24.152373 containerd[1449]: 2024-09-04 17:54:24.119 [WARNING][4631] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0", GenerateName:"coredns-5dd5756b68-", Namespace:"kube-system", SelfLink:"", UID:"8ad7d24b-d7a2-481f-abff-0aa847f03938", ResourceVersion:"772", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 53, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"5dd5756b68", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"3c0dffa8ac2a74eb72e5dbe12b9953f0412d3bfff76b6db0c4d3347a1e70dd3e", Pod:"coredns-5dd5756b68-4dlsd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.13.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9dcacce7f7a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:24.152373 containerd[1449]: 2024-09-04 17:54:24.119 
[INFO][4631] k8s.go 608: Cleaning up netns ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Sep 4 17:54:24.152373 containerd[1449]: 2024-09-04 17:54:24.119 [INFO][4631] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" iface="eth0" netns="" Sep 4 17:54:24.152373 containerd[1449]: 2024-09-04 17:54:24.119 [INFO][4631] k8s.go 615: Releasing IP address(es) ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Sep 4 17:54:24.152373 containerd[1449]: 2024-09-04 17:54:24.119 [INFO][4631] utils.go 188: Calico CNI releasing IP address ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Sep 4 17:54:24.152373 containerd[1449]: 2024-09-04 17:54:24.140 [INFO][4637] ipam_plugin.go 417: Releasing address using handleID ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" HandleID="k8s-pod-network.cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" Sep 4 17:54:24.152373 containerd[1449]: 2024-09-04 17:54:24.140 [INFO][4637] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:24.152373 containerd[1449]: 2024-09-04 17:54:24.140 [INFO][4637] ipam_plugin.go 373: Acquired host-wide IPAM lock. Sep 4 17:54:24.152373 containerd[1449]: 2024-09-04 17:54:24.147 [WARNING][4637] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" HandleID="k8s-pod-network.cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" Sep 4 17:54:24.152373 containerd[1449]: 2024-09-04 17:54:24.147 [INFO][4637] ipam_plugin.go 445: Releasing address using workloadID ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" HandleID="k8s-pod-network.cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-coredns--5dd5756b68--4dlsd-eth0" Sep 4 17:54:24.152373 containerd[1449]: 2024-09-04 17:54:24.149 [INFO][4637] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:24.152373 containerd[1449]: 2024-09-04 17:54:24.150 [INFO][4631] k8s.go 621: Teardown processing complete. ContainerID="cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd" Sep 4 17:54:24.152373 containerd[1449]: time="2024-09-04T17:54:24.152183506Z" level=info msg="TearDown network for sandbox \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\" successfully" Sep 4 17:54:24.177128 containerd[1449]: time="2024-09-04T17:54:24.177066519Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 4 17:54:24.177575 containerd[1449]: time="2024-09-04T17:54:24.177457271Z" level=info msg="RemovePodSandbox \"cd5e4f174c4523d0675a56ba5fe21baacbee384ddaff9ae5a759f9cb178de4bd\" returns successfully" Sep 4 17:54:24.178023 containerd[1449]: time="2024-09-04T17:54:24.177972019Z" level=info msg="StopPodSandbox for \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\"" Sep 4 17:54:24.253175 containerd[1449]: 2024-09-04 17:54:24.212 [WARNING][4655] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0", GenerateName:"calico-kube-controllers-8d44d4874-", Namespace:"calico-system", SelfLink:"", UID:"82b31caf-8658-42af-9e3a-ea4def2ad1f0", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 53, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8d44d4874", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183", Pod:"calico-kube-controllers-8d44d4874-qmf4t", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.13.194/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calief57fd3ec64", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:24.253175 containerd[1449]: 2024-09-04 17:54:24.213 [INFO][4655] k8s.go 608: Cleaning up netns ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Sep 4 17:54:24.253175 containerd[1449]: 2024-09-04 17:54:24.213 [INFO][4655] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" iface="eth0" netns="" Sep 4 17:54:24.253175 containerd[1449]: 2024-09-04 17:54:24.213 [INFO][4655] k8s.go 615: Releasing IP address(es) ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Sep 4 17:54:24.253175 containerd[1449]: 2024-09-04 17:54:24.213 [INFO][4655] utils.go 188: Calico CNI releasing IP address ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Sep 4 17:54:24.253175 containerd[1449]: 2024-09-04 17:54:24.234 [INFO][4661] ipam_plugin.go 417: Releasing address using handleID ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" HandleID="k8s-pod-network.602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" Sep 4 17:54:24.253175 containerd[1449]: 2024-09-04 17:54:24.234 [INFO][4661] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:24.253175 containerd[1449]: 2024-09-04 17:54:24.234 [INFO][4661] ipam_plugin.go 373: Acquired host-wide IPAM lock. Sep 4 17:54:24.253175 containerd[1449]: 2024-09-04 17:54:24.246 [WARNING][4661] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" HandleID="k8s-pod-network.602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" Sep 4 17:54:24.253175 containerd[1449]: 2024-09-04 17:54:24.247 [INFO][4661] ipam_plugin.go 445: Releasing address using workloadID ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" HandleID="k8s-pod-network.602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" Sep 4 17:54:24.253175 containerd[1449]: 2024-09-04 17:54:24.249 [INFO][4661] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:24.253175 containerd[1449]: 2024-09-04 17:54:24.250 [INFO][4655] k8s.go 621: Teardown processing complete. ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Sep 4 17:54:24.253175 containerd[1449]: time="2024-09-04T17:54:24.252808460Z" level=info msg="TearDown network for sandbox \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\" successfully" Sep 4 17:54:24.253175 containerd[1449]: time="2024-09-04T17:54:24.252832133Z" level=info msg="StopPodSandbox for \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\" returns successfully" Sep 4 17:54:24.254907 containerd[1449]: time="2024-09-04T17:54:24.253877657Z" level=info msg="RemovePodSandbox for \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\"" Sep 4 17:54:24.254907 containerd[1449]: time="2024-09-04T17:54:24.253905208Z" level=info msg="Forcibly stopping sandbox \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\"" Sep 4 17:54:24.369005 containerd[1449]: 2024-09-04 17:54:24.315 [WARNING][4679] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0", GenerateName:"calico-kube-controllers-8d44d4874-", Namespace:"calico-system", SelfLink:"", UID:"82b31caf-8658-42af-9e3a-ea4def2ad1f0", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 53, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8d44d4874", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"fe3232a6e53247c086cea188e29533f384b51ed67fdcc3cfa3b10c5e4cbe9183", Pod:"calico-kube-controllers-8d44d4874-qmf4t", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.13.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calief57fd3ec64", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:24.369005 containerd[1449]: 2024-09-04 17:54:24.315 [INFO][4679] k8s.go 608: Cleaning up netns ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Sep 4 17:54:24.369005 containerd[1449]: 2024-09-04 17:54:24.315 [INFO][4679] dataplane_linux.go 526: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" iface="eth0" netns="" Sep 4 17:54:24.369005 containerd[1449]: 2024-09-04 17:54:24.316 [INFO][4679] k8s.go 615: Releasing IP address(es) ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Sep 4 17:54:24.369005 containerd[1449]: 2024-09-04 17:54:24.316 [INFO][4679] utils.go 188: Calico CNI releasing IP address ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Sep 4 17:54:24.369005 containerd[1449]: 2024-09-04 17:54:24.355 [INFO][4685] ipam_plugin.go 417: Releasing address using handleID ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" HandleID="k8s-pod-network.602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" Sep 4 17:54:24.369005 containerd[1449]: 2024-09-04 17:54:24.355 [INFO][4685] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:24.369005 containerd[1449]: 2024-09-04 17:54:24.355 [INFO][4685] ipam_plugin.go 373: Acquired host-wide IPAM lock. Sep 4 17:54:24.369005 containerd[1449]: 2024-09-04 17:54:24.362 [WARNING][4685] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" HandleID="k8s-pod-network.602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" Sep 4 17:54:24.369005 containerd[1449]: 2024-09-04 17:54:24.362 [INFO][4685] ipam_plugin.go 445: Releasing address using workloadID ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" HandleID="k8s-pod-network.602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--kube--controllers--8d44d4874--qmf4t-eth0" Sep 4 17:54:24.369005 containerd[1449]: 2024-09-04 17:54:24.364 [INFO][4685] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:24.369005 containerd[1449]: 2024-09-04 17:54:24.365 [INFO][4679] k8s.go 621: Teardown processing complete. ContainerID="602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3" Sep 4 17:54:24.370384 containerd[1449]: time="2024-09-04T17:54:24.369532940Z" level=info msg="TearDown network for sandbox \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\" successfully" Sep 4 17:54:24.381944 containerd[1449]: time="2024-09-04T17:54:24.381014923Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 4 17:54:24.382273 containerd[1449]: time="2024-09-04T17:54:24.382171620Z" level=info msg="RemovePodSandbox \"602afb199695e23504393b0ce8ccc4a71a009dfdb5bcb8e0aa5edcb7b00b94b3\" returns successfully" Sep 4 17:54:24.385071 containerd[1449]: time="2024-09-04T17:54:24.385015921Z" level=info msg="StopPodSandbox for \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\"" Sep 4 17:54:24.481887 containerd[1449]: 2024-09-04 17:54:24.429 [WARNING][4704] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"5658969e-2b8a-4734-8694-fff3696c8a14", ResourceVersion:"767", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 53, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78cd84fb8c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad", Pod:"csi-node-driver-w4vzr", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.13.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.default"}, InterfaceName:"cali827b29a1f6d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:24.481887 containerd[1449]: 2024-09-04 17:54:24.430 [INFO][4704] k8s.go 608: Cleaning up netns ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Sep 4 17:54:24.481887 containerd[1449]: 2024-09-04 17:54:24.430 [INFO][4704] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" iface="eth0" netns="" Sep 4 17:54:24.481887 containerd[1449]: 2024-09-04 17:54:24.430 [INFO][4704] k8s.go 615: Releasing IP address(es) ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Sep 4 17:54:24.481887 containerd[1449]: 2024-09-04 17:54:24.430 [INFO][4704] utils.go 188: Calico CNI releasing IP address ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Sep 4 17:54:24.481887 containerd[1449]: 2024-09-04 17:54:24.466 [INFO][4710] ipam_plugin.go 417: Releasing address using handleID ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" HandleID="k8s-pod-network.7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" Sep 4 17:54:24.481887 containerd[1449]: 2024-09-04 17:54:24.466 [INFO][4710] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:24.481887 containerd[1449]: 2024-09-04 17:54:24.467 [INFO][4710] ipam_plugin.go 373: Acquired host-wide IPAM lock. Sep 4 17:54:24.481887 containerd[1449]: 2024-09-04 17:54:24.474 [WARNING][4710] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" HandleID="k8s-pod-network.7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" Sep 4 17:54:24.481887 containerd[1449]: 2024-09-04 17:54:24.474 [INFO][4710] ipam_plugin.go 445: Releasing address using workloadID ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" HandleID="k8s-pod-network.7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" Sep 4 17:54:24.481887 containerd[1449]: 2024-09-04 17:54:24.477 [INFO][4710] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:24.481887 containerd[1449]: 2024-09-04 17:54:24.479 [INFO][4704] k8s.go 621: Teardown processing complete. ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Sep 4 17:54:24.482847 containerd[1449]: time="2024-09-04T17:54:24.481995516Z" level=info msg="TearDown network for sandbox \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\" successfully" Sep 4 17:54:24.482847 containerd[1449]: time="2024-09-04T17:54:24.482452077Z" level=info msg="StopPodSandbox for \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\" returns successfully" Sep 4 17:54:24.483200 containerd[1449]: time="2024-09-04T17:54:24.483135301Z" level=info msg="RemovePodSandbox for \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\"" Sep 4 17:54:24.483371 containerd[1449]: time="2024-09-04T17:54:24.483165537Z" level=info msg="Forcibly stopping sandbox \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\"" Sep 4 17:54:24.590616 containerd[1449]: 2024-09-04 17:54:24.538 [WARNING][4728] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"5658969e-2b8a-4734-8694-fff3696c8a14", ResourceVersion:"767", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 53, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78cd84fb8c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad", Pod:"csi-node-driver-w4vzr", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.13.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.default"}, InterfaceName:"cali827b29a1f6d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:24.590616 containerd[1449]: 2024-09-04 17:54:24.538 [INFO][4728] k8s.go 608: Cleaning up netns ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Sep 4 17:54:24.590616 containerd[1449]: 2024-09-04 17:54:24.538 [INFO][4728] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" iface="eth0" netns="" Sep 4 17:54:24.590616 containerd[1449]: 2024-09-04 17:54:24.538 [INFO][4728] k8s.go 615: Releasing IP address(es) ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Sep 4 17:54:24.590616 containerd[1449]: 2024-09-04 17:54:24.538 [INFO][4728] utils.go 188: Calico CNI releasing IP address ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Sep 4 17:54:24.590616 containerd[1449]: 2024-09-04 17:54:24.572 [INFO][4734] ipam_plugin.go 417: Releasing address using handleID ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" HandleID="k8s-pod-network.7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" Sep 4 17:54:24.590616 containerd[1449]: 2024-09-04 17:54:24.572 [INFO][4734] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:24.590616 containerd[1449]: 2024-09-04 17:54:24.572 [INFO][4734] ipam_plugin.go 373: Acquired host-wide IPAM lock. Sep 4 17:54:24.590616 containerd[1449]: 2024-09-04 17:54:24.581 [WARNING][4734] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" HandleID="k8s-pod-network.7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" Sep 4 17:54:24.590616 containerd[1449]: 2024-09-04 17:54:24.582 [INFO][4734] ipam_plugin.go 445: Releasing address using workloadID ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" HandleID="k8s-pod-network.7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-csi--node--driver--w4vzr-eth0" Sep 4 17:54:24.590616 containerd[1449]: 2024-09-04 17:54:24.584 [INFO][4734] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:24.590616 containerd[1449]: 2024-09-04 17:54:24.588 [INFO][4728] k8s.go 621: Teardown processing complete. ContainerID="7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63" Sep 4 17:54:24.592353 containerd[1449]: time="2024-09-04T17:54:24.591828643Z" level=info msg="TearDown network for sandbox \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\" successfully" Sep 4 17:54:24.597647 containerd[1449]: time="2024-09-04T17:54:24.597601317Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 4 17:54:24.597750 containerd[1449]: time="2024-09-04T17:54:24.597670803Z" level=info msg="RemovePodSandbox \"7a478b48904e4e4845eaf6c9666e218b35b3880141dec1ec5f324d92d7714e63\" returns successfully" Sep 4 17:54:26.130173 containerd[1449]: time="2024-09-04T17:54:26.129858441Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:54:26.159669 containerd[1449]: time="2024-09-04T17:54:26.159064182Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.28.1: active requests=0, bytes read=12907822" Sep 4 17:54:26.206061 containerd[1449]: time="2024-09-04T17:54:26.206002685Z" level=info msg="ImageCreate event name:\"sha256:d1ca8f023879d2e9a9a7c98dbb3252886c5b7676be9529ddb5200aa2789b233e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:54:26.238637 containerd[1449]: time="2024-09-04T17:54:26.238512422Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:682cc97e4580d25b7314032c008a552bb05182fac34eba82cc389113c7767076\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:54:26.240937 containerd[1449]: time="2024-09-04T17:54:26.240530913Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.28.1\" with image id \"sha256:d1ca8f023879d2e9a9a7c98dbb3252886c5b7676be9529ddb5200aa2789b233e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:682cc97e4580d25b7314032c008a552bb05182fac34eba82cc389113c7767076\", size \"14400175\" in 2.327580637s" Sep 4 17:54:26.240937 containerd[1449]: time="2024-09-04T17:54:26.240616850Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.28.1\" returns image reference \"sha256:d1ca8f023879d2e9a9a7c98dbb3252886c5b7676be9529ddb5200aa2789b233e\"" Sep 4 17:54:26.247266 containerd[1449]: 
time="2024-09-04T17:54:26.247059201Z" level=info msg="CreateContainer within sandbox \"379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 4 17:54:26.401131 containerd[1449]: time="2024-09-04T17:54:26.400931376Z" level=info msg="CreateContainer within sandbox \"379c30f8e195cb612690ea1ea997519e4e28bc3abddc6d08a950d51c875edbad\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"77ae4cb02ea84c6de6009433009501880ec8c83f15c4ad7965ec499a342ca0d2\"" Sep 4 17:54:26.402476 containerd[1449]: time="2024-09-04T17:54:26.402356836Z" level=info msg="StartContainer for \"77ae4cb02ea84c6de6009433009501880ec8c83f15c4ad7965ec499a342ca0d2\"" Sep 4 17:54:26.407706 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3252888386.mount: Deactivated successfully. Sep 4 17:54:26.478976 systemd[1]: Started cri-containerd-77ae4cb02ea84c6de6009433009501880ec8c83f15c4ad7965ec499a342ca0d2.scope - libcontainer container 77ae4cb02ea84c6de6009433009501880ec8c83f15c4ad7965ec499a342ca0d2. 
Sep 4 17:54:26.516892 containerd[1449]: time="2024-09-04T17:54:26.516855165Z" level=info msg="StartContainer for \"77ae4cb02ea84c6de6009433009501880ec8c83f15c4ad7965ec499a342ca0d2\" returns successfully" Sep 4 17:54:26.735573 kubelet[2623]: I0904 17:54:26.734863 2623 csi_plugin.go:99] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 4 17:54:26.735573 kubelet[2623]: I0904 17:54:26.734957 2623 csi_plugin.go:112] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 4 17:54:27.064017 kubelet[2623]: I0904 17:54:27.063744 2623 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/csi-node-driver-w4vzr" podStartSLOduration=34.804680807 podCreationTimestamp="2024-09-04 17:53:45 +0000 UTC" firstStartedPulling="2024-09-04 17:54:18.982717512 +0000 UTC m=+55.745363567" lastFinishedPulling="2024-09-04 17:54:26.241690919 +0000 UTC m=+63.004337014" observedRunningTime="2024-09-04 17:54:27.063189014 +0000 UTC m=+63.825835109" watchObservedRunningTime="2024-09-04 17:54:27.063654254 +0000 UTC m=+63.826300349" Sep 4 17:54:30.241544 kubelet[2623]: I0904 17:54:30.241496 2623 topology_manager.go:215] "Topology Admit Handler" podUID="2e4d8747-8070-490d-892f-41ec4617d704" podNamespace="calico-apiserver" podName="calico-apiserver-b86bb844c-sq7hw" Sep 4 17:54:30.248593 kubelet[2623]: I0904 17:54:30.248546 2623 topology_manager.go:215] "Topology Admit Handler" podUID="bae89692-f49e-4614-8c76-93ce7a1d597c" podNamespace="calico-apiserver" podName="calico-apiserver-b86bb844c-75d2q" Sep 4 17:54:30.305713 systemd[1]: Created slice kubepods-besteffort-pod2e4d8747_8070_490d_892f_41ec4617d704.slice - libcontainer container kubepods-besteffort-pod2e4d8747_8070_490d_892f_41ec4617d704.slice. 
Sep 4 17:54:30.310225 systemd[1]: Created slice kubepods-besteffort-podbae89692_f49e_4614_8c76_93ce7a1d597c.slice - libcontainer container kubepods-besteffort-podbae89692_f49e_4614_8c76_93ce7a1d597c.slice. Sep 4 17:54:30.388919 kubelet[2623]: I0904 17:54:30.388871 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/bae89692-f49e-4614-8c76-93ce7a1d597c-calico-apiserver-certs\") pod \"calico-apiserver-b86bb844c-75d2q\" (UID: \"bae89692-f49e-4614-8c76-93ce7a1d597c\") " pod="calico-apiserver/calico-apiserver-b86bb844c-75d2q" Sep 4 17:54:30.393496 kubelet[2623]: I0904 17:54:30.393461 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qc8r\" (UniqueName: \"kubernetes.io/projected/bae89692-f49e-4614-8c76-93ce7a1d597c-kube-api-access-2qc8r\") pod \"calico-apiserver-b86bb844c-75d2q\" (UID: \"bae89692-f49e-4614-8c76-93ce7a1d597c\") " pod="calico-apiserver/calico-apiserver-b86bb844c-75d2q" Sep 4 17:54:30.393592 kubelet[2623]: I0904 17:54:30.393572 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsxgl\" (UniqueName: \"kubernetes.io/projected/2e4d8747-8070-490d-892f-41ec4617d704-kube-api-access-qsxgl\") pod \"calico-apiserver-b86bb844c-sq7hw\" (UID: \"2e4d8747-8070-490d-892f-41ec4617d704\") " pod="calico-apiserver/calico-apiserver-b86bb844c-sq7hw" Sep 4 17:54:30.393675 kubelet[2623]: I0904 17:54:30.393654 2623 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/2e4d8747-8070-490d-892f-41ec4617d704-calico-apiserver-certs\") pod \"calico-apiserver-b86bb844c-sq7hw\" (UID: \"2e4d8747-8070-490d-892f-41ec4617d704\") " pod="calico-apiserver/calico-apiserver-b86bb844c-sq7hw" Sep 4 17:54:30.619259 containerd[1449]: 
time="2024-09-04T17:54:30.619111744Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b86bb844c-75d2q,Uid:bae89692-f49e-4614-8c76-93ce7a1d597c,Namespace:calico-apiserver,Attempt:0,}" Sep 4 17:54:30.620160 containerd[1449]: time="2024-09-04T17:54:30.619748980Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b86bb844c-sq7hw,Uid:2e4d8747-8070-490d-892f-41ec4617d704,Namespace:calico-apiserver,Attempt:0,}" Sep 4 17:54:30.853161 systemd-networkd[1348]: cali0da40b41650: Link UP Sep 4 17:54:30.853851 systemd-networkd[1348]: cali0da40b41650: Gained carrier Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.729 [INFO][4817] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--sq7hw-eth0 calico-apiserver-b86bb844c- calico-apiserver 2e4d8747-8070-490d-892f-41ec4617d704 871 0 2024-09-04 17:54:30 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b86bb844c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4054-1-0-c-33e05803e0.novalocal calico-apiserver-b86bb844c-sq7hw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali0da40b41650 [] []}} ContainerID="38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" Namespace="calico-apiserver" Pod="calico-apiserver-b86bb844c-sq7hw" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--sq7hw-" Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.729 [INFO][4817] k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" Namespace="calico-apiserver" Pod="calico-apiserver-b86bb844c-sq7hw" 
WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--sq7hw-eth0" Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.778 [INFO][4841] ipam_plugin.go 230: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" HandleID="k8s-pod-network.38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--sq7hw-eth0" Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.799 [INFO][4841] ipam_plugin.go 270: Auto assigning IP ContainerID="38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" HandleID="k8s-pod-network.38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--sq7hw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000318400), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4054-1-0-c-33e05803e0.novalocal", "pod":"calico-apiserver-b86bb844c-sq7hw", "timestamp":"2024-09-04 17:54:30.778787209 +0000 UTC"}, Hostname:"ci-4054-1-0-c-33e05803e0.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.799 [INFO][4841] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.799 [INFO][4841] ipam_plugin.go 373: Acquired host-wide IPAM lock. 
Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.799 [INFO][4841] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4054-1-0-c-33e05803e0.novalocal' Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.802 [INFO][4841] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.808 [INFO][4841] ipam.go 372: Looking up existing affinities for host host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.817 [INFO][4841] ipam.go 489: Trying affinity for 192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.819 [INFO][4841] ipam.go 155: Attempting to load block cidr=192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.824 [INFO][4841] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.824 [INFO][4841] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.13.192/26 handle="k8s-pod-network.38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.827 [INFO][4841] ipam.go 1685: Creating new handle: k8s-pod-network.38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0 Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.832 [INFO][4841] ipam.go 1203: Writing block in order to claim IPs block=192.168.13.192/26 handle="k8s-pod-network.38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.840 [INFO][4841] 
ipam.go 1216: Successfully claimed IPs: [192.168.13.197/26] block=192.168.13.192/26 handle="k8s-pod-network.38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.840 [INFO][4841] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.13.197/26] handle="k8s-pod-network.38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.840 [INFO][4841] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:30.879407 containerd[1449]: 2024-09-04 17:54:30.840 [INFO][4841] ipam_plugin.go 288: Calico CNI IPAM assigned addresses IPv4=[192.168.13.197/26] IPv6=[] ContainerID="38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" HandleID="k8s-pod-network.38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--sq7hw-eth0" Sep 4 17:54:30.882007 containerd[1449]: 2024-09-04 17:54:30.843 [INFO][4817] k8s.go 386: Populated endpoint ContainerID="38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" Namespace="calico-apiserver" Pod="calico-apiserver-b86bb844c-sq7hw" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--sq7hw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--sq7hw-eth0", GenerateName:"calico-apiserver-b86bb844c-", Namespace:"calico-apiserver", SelfLink:"", UID:"2e4d8747-8070-490d-892f-41ec4617d704", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 54, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b86bb844c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"", Pod:"calico-apiserver-b86bb844c-sq7hw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.13.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0da40b41650", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:30.882007 containerd[1449]: 2024-09-04 17:54:30.845 [INFO][4817] k8s.go 387: Calico CNI using IPs: [192.168.13.197/32] ContainerID="38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" Namespace="calico-apiserver" Pod="calico-apiserver-b86bb844c-sq7hw" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--sq7hw-eth0" Sep 4 17:54:30.882007 containerd[1449]: 2024-09-04 17:54:30.846 [INFO][4817] dataplane_linux.go 68: Setting the host side veth name to cali0da40b41650 ContainerID="38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" Namespace="calico-apiserver" Pod="calico-apiserver-b86bb844c-sq7hw" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--sq7hw-eth0" Sep 4 17:54:30.882007 containerd[1449]: 2024-09-04 17:54:30.853 [INFO][4817] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" Namespace="calico-apiserver" Pod="calico-apiserver-b86bb844c-sq7hw" 
WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--sq7hw-eth0" Sep 4 17:54:30.882007 containerd[1449]: 2024-09-04 17:54:30.854 [INFO][4817] k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" Namespace="calico-apiserver" Pod="calico-apiserver-b86bb844c-sq7hw" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--sq7hw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--sq7hw-eth0", GenerateName:"calico-apiserver-b86bb844c-", Namespace:"calico-apiserver", SelfLink:"", UID:"2e4d8747-8070-490d-892f-41ec4617d704", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 54, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b86bb844c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0", Pod:"calico-apiserver-b86bb844c-sq7hw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.13.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0da40b41650", 
MAC:"7e:87:57:45:bf:67", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:30.882007 containerd[1449]: 2024-09-04 17:54:30.875 [INFO][4817] k8s.go 500: Wrote updated endpoint to datastore ContainerID="38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0" Namespace="calico-apiserver" Pod="calico-apiserver-b86bb844c-sq7hw" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--sq7hw-eth0" Sep 4 17:54:30.921915 systemd-networkd[1348]: calid522882e690: Link UP Sep 4 17:54:30.923755 systemd-networkd[1348]: calid522882e690: Gained carrier Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.728 [INFO][4816] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--75d2q-eth0 calico-apiserver-b86bb844c- calico-apiserver bae89692-f49e-4614-8c76-93ce7a1d597c 876 0 2024-09-04 17:54:30 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b86bb844c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4054-1-0-c-33e05803e0.novalocal calico-apiserver-b86bb844c-75d2q eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid522882e690 [] []}} ContainerID="2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" Namespace="calico-apiserver" Pod="calico-apiserver-b86bb844c-75d2q" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--75d2q-" Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.729 [INFO][4816] k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" Namespace="calico-apiserver" Pod="calico-apiserver-b86bb844c-75d2q" 
WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--75d2q-eth0" Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.794 [INFO][4840] ipam_plugin.go 230: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" HandleID="k8s-pod-network.2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--75d2q-eth0" Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.810 [INFO][4840] ipam_plugin.go 270: Auto assigning IP ContainerID="2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" HandleID="k8s-pod-network.2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--75d2q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002efb00), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4054-1-0-c-33e05803e0.novalocal", "pod":"calico-apiserver-b86bb844c-75d2q", "timestamp":"2024-09-04 17:54:30.794943931 +0000 UTC"}, Hostname:"ci-4054-1-0-c-33e05803e0.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.810 [INFO][4840] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.841 [INFO][4840] ipam_plugin.go 373: Acquired host-wide IPAM lock. 
Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.841 [INFO][4840] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4054-1-0-c-33e05803e0.novalocal' Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.844 [INFO][4840] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.861 [INFO][4840] ipam.go 372: Looking up existing affinities for host host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.880 [INFO][4840] ipam.go 489: Trying affinity for 192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.892 [INFO][4840] ipam.go 155: Attempting to load block cidr=192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.898 [INFO][4840] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.13.192/26 host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.898 [INFO][4840] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.13.192/26 handle="k8s-pod-network.2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.901 [INFO][4840] ipam.go 1685: Creating new handle: k8s-pod-network.2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2 Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.906 [INFO][4840] ipam.go 1203: Writing block in order to claim IPs block=192.168.13.192/26 handle="k8s-pod-network.2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.916 [INFO][4840] 
ipam.go 1216: Successfully claimed IPs: [192.168.13.198/26] block=192.168.13.192/26 handle="k8s-pod-network.2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.916 [INFO][4840] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.13.198/26] handle="k8s-pod-network.2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" host="ci-4054-1-0-c-33e05803e0.novalocal" Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.916 [INFO][4840] ipam_plugin.go 379: Released host-wide IPAM lock. Sep 4 17:54:30.958994 containerd[1449]: 2024-09-04 17:54:30.916 [INFO][4840] ipam_plugin.go 288: Calico CNI IPAM assigned addresses IPv4=[192.168.13.198/26] IPv6=[] ContainerID="2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" HandleID="k8s-pod-network.2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" Workload="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--75d2q-eth0" Sep 4 17:54:30.962701 containerd[1449]: 2024-09-04 17:54:30.919 [INFO][4816] k8s.go 386: Populated endpoint ContainerID="2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" Namespace="calico-apiserver" Pod="calico-apiserver-b86bb844c-75d2q" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--75d2q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--75d2q-eth0", GenerateName:"calico-apiserver-b86bb844c-", Namespace:"calico-apiserver", SelfLink:"", UID:"bae89692-f49e-4614-8c76-93ce7a1d597c", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 54, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b86bb844c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"", Pod:"calico-apiserver-b86bb844c-75d2q", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.13.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid522882e690", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:30.962701 containerd[1449]: 2024-09-04 17:54:30.919 [INFO][4816] k8s.go 387: Calico CNI using IPs: [192.168.13.198/32] ContainerID="2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" Namespace="calico-apiserver" Pod="calico-apiserver-b86bb844c-75d2q" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--75d2q-eth0" Sep 4 17:54:30.962701 containerd[1449]: 2024-09-04 17:54:30.919 [INFO][4816] dataplane_linux.go 68: Setting the host side veth name to calid522882e690 ContainerID="2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" Namespace="calico-apiserver" Pod="calico-apiserver-b86bb844c-75d2q" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--75d2q-eth0" Sep 4 17:54:30.962701 containerd[1449]: 2024-09-04 17:54:30.922 [INFO][4816] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" Namespace="calico-apiserver" Pod="calico-apiserver-b86bb844c-75d2q" 
WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--75d2q-eth0" Sep 4 17:54:30.962701 containerd[1449]: 2024-09-04 17:54:30.923 [INFO][4816] k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" Namespace="calico-apiserver" Pod="calico-apiserver-b86bb844c-75d2q" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--75d2q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--75d2q-eth0", GenerateName:"calico-apiserver-b86bb844c-", Namespace:"calico-apiserver", SelfLink:"", UID:"bae89692-f49e-4614-8c76-93ce7a1d597c", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2024, time.September, 4, 17, 54, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b86bb844c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4054-1-0-c-33e05803e0.novalocal", ContainerID:"2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2", Pod:"calico-apiserver-b86bb844c-75d2q", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.13.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid522882e690", 
MAC:"fa:0c:02:e2:b8:06", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Sep 4 17:54:30.962701 containerd[1449]: 2024-09-04 17:54:30.955 [INFO][4816] k8s.go 500: Wrote updated endpoint to datastore ContainerID="2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2" Namespace="calico-apiserver" Pod="calico-apiserver-b86bb844c-75d2q" WorkloadEndpoint="ci--4054--1--0--c--33e05803e0.novalocal-k8s-calico--apiserver--b86bb844c--75d2q-eth0" Sep 4 17:54:30.997285 containerd[1449]: time="2024-09-04T17:54:30.996709216Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 4 17:54:30.997285 containerd[1449]: time="2024-09-04T17:54:30.996952641Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 4 17:54:30.997285 containerd[1449]: time="2024-09-04T17:54:30.996971796Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 4 17:54:30.997840 containerd[1449]: time="2024-09-04T17:54:30.997486958Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 4 17:54:31.011019 containerd[1449]: time="2024-09-04T17:54:31.008063314Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 4 17:54:31.011019 containerd[1449]: time="2024-09-04T17:54:31.008130317Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 4 17:54:31.011019 containerd[1449]: time="2024-09-04T17:54:31.008156124Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 4 17:54:31.011019 containerd[1449]: time="2024-09-04T17:54:31.008255295Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 4 17:54:31.035995 systemd[1]: Started cri-containerd-38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0.scope - libcontainer container 38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0. Sep 4 17:54:31.046751 systemd[1]: Started cri-containerd-2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2.scope - libcontainer container 2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2. Sep 4 17:54:31.139977 containerd[1449]: time="2024-09-04T17:54:31.139031805Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b86bb844c-75d2q,Uid:bae89692-f49e-4614-8c76-93ce7a1d597c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2\"" Sep 4 17:54:31.144954 containerd[1449]: time="2024-09-04T17:54:31.144920849Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.28.1\"" Sep 4 17:54:31.147055 containerd[1449]: time="2024-09-04T17:54:31.147017626Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b86bb844c-sq7hw,Uid:2e4d8747-8070-490d-892f-41ec4617d704,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0\"" Sep 4 17:54:32.042475 systemd-networkd[1348]: cali0da40b41650: Gained IPv6LL Sep 4 17:54:32.105097 systemd-networkd[1348]: calid522882e690: Gained IPv6LL Sep 4 17:54:35.073782 containerd[1449]: time="2024-09-04T17:54:35.073427993Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:54:35.076150 containerd[1449]: time="2024-09-04T17:54:35.075985612Z" level=info 
msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.28.1: active requests=0, bytes read=40419849" Sep 4 17:54:35.077701 containerd[1449]: time="2024-09-04T17:54:35.077660543Z" level=info msg="ImageCreate event name:\"sha256:91dd0fd3dab3f170b52404ec5e67926439207bf71c08b7f54de8f3db6209537b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:54:35.081022 containerd[1449]: time="2024-09-04T17:54:35.080172237Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b4ee1aa27bdeddc34dd200145eb033b716cf598570206c96693a35a317ab4f1e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:54:35.081022 containerd[1449]: time="2024-09-04T17:54:35.080891326Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.28.1\" with image id \"sha256:91dd0fd3dab3f170b52404ec5e67926439207bf71c08b7f54de8f3db6209537b\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b4ee1aa27bdeddc34dd200145eb033b716cf598570206c96693a35a317ab4f1e\", size \"41912266\" in 3.934877356s" Sep 4 17:54:35.081022 containerd[1449]: time="2024-09-04T17:54:35.080918206Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.28.1\" returns image reference \"sha256:91dd0fd3dab3f170b52404ec5e67926439207bf71c08b7f54de8f3db6209537b\"" Sep 4 17:54:35.082614 containerd[1449]: time="2024-09-04T17:54:35.082597564Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.28.1\"" Sep 4 17:54:35.083812 containerd[1449]: time="2024-09-04T17:54:35.083754555Z" level=info msg="CreateContainer within sandbox \"2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 4 17:54:35.104607 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3009119685.mount: Deactivated successfully. 
Sep 4 17:54:35.114231 containerd[1449]: time="2024-09-04T17:54:35.114117634Z" level=info msg="CreateContainer within sandbox \"2240749fc2723b6152771003cb7308d7d614364b41deacab48d087bf9a2a66a2\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"85c6a9ffa47886f7c3c0551f958651a5bb4f118570e8da8b3d1fa32834d89489\"" Sep 4 17:54:35.115097 containerd[1449]: time="2024-09-04T17:54:35.115044582Z" level=info msg="StartContainer for \"85c6a9ffa47886f7c3c0551f958651a5bb4f118570e8da8b3d1fa32834d89489\"" Sep 4 17:54:35.160074 systemd[1]: Started cri-containerd-85c6a9ffa47886f7c3c0551f958651a5bb4f118570e8da8b3d1fa32834d89489.scope - libcontainer container 85c6a9ffa47886f7c3c0551f958651a5bb4f118570e8da8b3d1fa32834d89489. Sep 4 17:54:35.283916 containerd[1449]: time="2024-09-04T17:54:35.283868395Z" level=info msg="StartContainer for \"85c6a9ffa47886f7c3c0551f958651a5bb4f118570e8da8b3d1fa32834d89489\" returns successfully" Sep 4 17:54:35.496642 containerd[1449]: time="2024-09-04T17:54:35.496590869Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:54:35.498686 containerd[1449]: time="2024-09-04T17:54:35.498645686Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.28.1: active requests=0, bytes read=77" Sep 4 17:54:35.500844 containerd[1449]: time="2024-09-04T17:54:35.500777475Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.28.1\" with image id \"sha256:91dd0fd3dab3f170b52404ec5e67926439207bf71c08b7f54de8f3db6209537b\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b4ee1aa27bdeddc34dd200145eb033b716cf598570206c96693a35a317ab4f1e\", size \"41912266\" in 417.502859ms" Sep 4 17:54:35.500899 containerd[1449]: time="2024-09-04T17:54:35.500849417Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.28.1\" returns image reference 
\"sha256:91dd0fd3dab3f170b52404ec5e67926439207bf71c08b7f54de8f3db6209537b\"" Sep 4 17:54:35.502969 containerd[1449]: time="2024-09-04T17:54:35.502938717Z" level=info msg="CreateContainer within sandbox \"38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 4 17:54:35.526219 containerd[1449]: time="2024-09-04T17:54:35.526134234Z" level=info msg="CreateContainer within sandbox \"38dd5f73676ca7e1f8a5a42ec3814d8db2320512c49acff7723b1f148b463cb0\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4572312df71380f902bfa76c112401cb060f6d2c63826430081d580a0c4d6454\"" Sep 4 17:54:35.527338 containerd[1449]: time="2024-09-04T17:54:35.526908323Z" level=info msg="StartContainer for \"4572312df71380f902bfa76c112401cb060f6d2c63826430081d580a0c4d6454\"" Sep 4 17:54:35.568966 systemd[1]: Started cri-containerd-4572312df71380f902bfa76c112401cb060f6d2c63826430081d580a0c4d6454.scope - libcontainer container 4572312df71380f902bfa76c112401cb060f6d2c63826430081d580a0c4d6454. Sep 4 17:54:35.640748 containerd[1449]: time="2024-09-04T17:54:35.640683649Z" level=info msg="StartContainer for \"4572312df71380f902bfa76c112401cb060f6d2c63826430081d580a0c4d6454\" returns successfully" Sep 4 17:54:36.113919 systemd[1]: Started sshd@9-172.24.4.122:22-172.24.4.1:38092.service - OpenSSH per-connection server daemon (172.24.4.1:38092). 
Sep 4 17:54:36.147106 kubelet[2623]: I0904 17:54:36.146756 2623 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-b86bb844c-75d2q" podStartSLOduration=2.209247078 podCreationTimestamp="2024-09-04 17:54:30 +0000 UTC" firstStartedPulling="2024-09-04 17:54:31.144007929 +0000 UTC m=+67.906653984" lastFinishedPulling="2024-09-04 17:54:35.081479444 +0000 UTC m=+71.844125509" observedRunningTime="2024-09-04 17:54:36.143349072 +0000 UTC m=+72.905995127" watchObservedRunningTime="2024-09-04 17:54:36.146718603 +0000 UTC m=+72.909364648" Sep 4 17:54:37.169977 kubelet[2623]: I0904 17:54:37.169909 2623 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-b86bb844c-sq7hw" podStartSLOduration=2.8174942290000002 podCreationTimestamp="2024-09-04 17:54:30 +0000 UTC" firstStartedPulling="2024-09-04 17:54:31.148663527 +0000 UTC m=+67.911309582" lastFinishedPulling="2024-09-04 17:54:35.501040447 +0000 UTC m=+72.263686502" observedRunningTime="2024-09-04 17:54:36.166523891 +0000 UTC m=+72.929169956" watchObservedRunningTime="2024-09-04 17:54:37.169871149 +0000 UTC m=+73.932517204" Sep 4 17:54:37.591026 sshd[5079]: Accepted publickey for core from 172.24.4.1 port 38092 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:54:37.597594 sshd[5079]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:54:37.610680 systemd-logind[1429]: New session 12 of user core. Sep 4 17:54:37.630239 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 4 17:54:39.553055 sshd[5079]: pam_unix(sshd:session): session closed for user core Sep 4 17:54:39.559859 systemd-logind[1429]: Session 12 logged out. Waiting for processes to exit. Sep 4 17:54:39.570088 systemd[1]: sshd@9-172.24.4.122:22-172.24.4.1:38092.service: Deactivated successfully. Sep 4 17:54:39.572589 systemd[1]: session-12.scope: Deactivated successfully. 
Sep 4 17:54:39.574287 systemd-logind[1429]: Removed session 12. Sep 4 17:54:44.580950 systemd[1]: Started sshd@10-172.24.4.122:22-172.24.4.1:43002.service - OpenSSH per-connection server daemon (172.24.4.1:43002). Sep 4 17:54:46.075913 sshd[5108]: Accepted publickey for core from 172.24.4.1 port 43002 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:54:46.079040 sshd[5108]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:54:46.090374 systemd-logind[1429]: New session 13 of user core. Sep 4 17:54:46.100212 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 4 17:54:46.890243 sshd[5108]: pam_unix(sshd:session): session closed for user core Sep 4 17:54:46.903827 systemd[1]: sshd@10-172.24.4.122:22-172.24.4.1:43002.service: Deactivated successfully. Sep 4 17:54:46.908365 systemd[1]: session-13.scope: Deactivated successfully. Sep 4 17:54:46.910039 systemd-logind[1429]: Session 13 logged out. Waiting for processes to exit. Sep 4 17:54:46.912613 systemd-logind[1429]: Removed session 13. Sep 4 17:54:51.914495 systemd[1]: Started sshd@11-172.24.4.122:22-172.24.4.1:43006.service - OpenSSH per-connection server daemon (172.24.4.1:43006). Sep 4 17:54:53.279729 sshd[5150]: Accepted publickey for core from 172.24.4.1 port 43006 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:54:53.283386 sshd[5150]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:54:53.293617 systemd-logind[1429]: New session 14 of user core. Sep 4 17:54:53.301150 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 4 17:54:54.202853 sshd[5150]: pam_unix(sshd:session): session closed for user core Sep 4 17:54:54.219001 systemd[1]: sshd@11-172.24.4.122:22-172.24.4.1:43006.service: Deactivated successfully. Sep 4 17:54:54.223985 systemd[1]: session-14.scope: Deactivated successfully. Sep 4 17:54:54.231090 systemd-logind[1429]: Session 14 logged out. 
Waiting for processes to exit. Sep 4 17:54:54.240394 systemd[1]: Started sshd@12-172.24.4.122:22-172.24.4.1:43014.service - OpenSSH per-connection server daemon (172.24.4.1:43014). Sep 4 17:54:54.241940 systemd-logind[1429]: Removed session 14. Sep 4 17:54:55.761078 sshd[5170]: Accepted publickey for core from 172.24.4.1 port 43014 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:54:55.765504 sshd[5170]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:54:55.780166 systemd-logind[1429]: New session 15 of user core. Sep 4 17:54:55.787204 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 4 17:54:57.630701 systemd[1]: Started sshd@13-172.24.4.122:22-172.24.4.1:57006.service - OpenSSH per-connection server daemon (172.24.4.1:57006). Sep 4 17:54:57.718761 sshd[5170]: pam_unix(sshd:session): session closed for user core Sep 4 17:54:57.736602 systemd[1]: sshd@12-172.24.4.122:22-172.24.4.1:43014.service: Deactivated successfully. Sep 4 17:54:57.737083 systemd-logind[1429]: Session 15 logged out. Waiting for processes to exit. Sep 4 17:54:57.741944 systemd[1]: session-15.scope: Deactivated successfully. Sep 4 17:54:57.753124 systemd-logind[1429]: Removed session 15. Sep 4 17:54:58.960942 sshd[5205]: Accepted publickey for core from 172.24.4.1 port 57006 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:54:58.986764 sshd[5205]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:54:59.007276 systemd-logind[1429]: New session 16 of user core. Sep 4 17:54:59.012304 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 4 17:54:59.776484 sshd[5205]: pam_unix(sshd:session): session closed for user core Sep 4 17:54:59.783081 systemd[1]: sshd@13-172.24.4.122:22-172.24.4.1:57006.service: Deactivated successfully. Sep 4 17:54:59.786989 systemd[1]: session-16.scope: Deactivated successfully. 
Sep 4 17:54:59.788529 systemd-logind[1429]: Session 16 logged out. Waiting for processes to exit. Sep 4 17:54:59.790012 systemd-logind[1429]: Removed session 16. Sep 4 17:55:04.788782 systemd[1]: Started sshd@14-172.24.4.122:22-172.24.4.1:58906.service - OpenSSH per-connection server daemon (172.24.4.1:58906). Sep 4 17:55:06.280913 sshd[5243]: Accepted publickey for core from 172.24.4.1 port 58906 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:55:06.284310 sshd[5243]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:55:06.295181 systemd-logind[1429]: New session 17 of user core. Sep 4 17:55:06.303185 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 4 17:55:07.137918 sshd[5243]: pam_unix(sshd:session): session closed for user core Sep 4 17:55:07.145659 systemd-logind[1429]: Session 17 logged out. Waiting for processes to exit. Sep 4 17:55:07.147420 systemd[1]: sshd@14-172.24.4.122:22-172.24.4.1:58906.service: Deactivated successfully. Sep 4 17:55:07.153937 systemd[1]: session-17.scope: Deactivated successfully. Sep 4 17:55:07.156680 systemd-logind[1429]: Removed session 17. Sep 4 17:55:12.159429 systemd[1]: Started sshd@15-172.24.4.122:22-172.24.4.1:58922.service - OpenSSH per-connection server daemon (172.24.4.1:58922). Sep 4 17:55:13.651231 sshd[5266]: Accepted publickey for core from 172.24.4.1 port 58922 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:55:13.653465 sshd[5266]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:55:13.662115 systemd-logind[1429]: New session 18 of user core. Sep 4 17:55:13.669106 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 4 17:55:14.601631 sshd[5266]: pam_unix(sshd:session): session closed for user core Sep 4 17:55:14.726119 systemd[1]: sshd@15-172.24.4.122:22-172.24.4.1:58922.service: Deactivated successfully. 
Sep 4 17:55:14.730388 systemd[1]: session-18.scope: Deactivated successfully. Sep 4 17:55:14.734732 systemd-logind[1429]: Session 18 logged out. Waiting for processes to exit. Sep 4 17:55:14.737340 systemd-logind[1429]: Removed session 18. Sep 4 17:55:19.624383 systemd[1]: Started sshd@16-172.24.4.122:22-172.24.4.1:47302.service - OpenSSH per-connection server daemon (172.24.4.1:47302). Sep 4 17:55:21.000373 sshd[5285]: Accepted publickey for core from 172.24.4.1 port 47302 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:55:21.003369 sshd[5285]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:55:21.010646 systemd-logind[1429]: New session 19 of user core. Sep 4 17:55:21.014203 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 4 17:55:21.920334 sshd[5285]: pam_unix(sshd:session): session closed for user core Sep 4 17:55:21.944516 systemd[1]: Started sshd@17-172.24.4.122:22-172.24.4.1:47306.service - OpenSSH per-connection server daemon (172.24.4.1:47306). Sep 4 17:55:21.947601 systemd[1]: sshd@16-172.24.4.122:22-172.24.4.1:47302.service: Deactivated successfully. Sep 4 17:55:21.952377 systemd[1]: session-19.scope: Deactivated successfully. Sep 4 17:55:21.959143 systemd-logind[1429]: Session 19 logged out. Waiting for processes to exit. Sep 4 17:55:21.967023 systemd-logind[1429]: Removed session 19. Sep 4 17:55:23.390179 sshd[5296]: Accepted publickey for core from 172.24.4.1 port 47306 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:55:23.391727 sshd[5296]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:55:23.399581 systemd-logind[1429]: New session 20 of user core. Sep 4 17:55:23.409221 systemd[1]: Started session-20.scope - Session 20 of User core. 
Sep 4 17:55:24.772512 sshd[5296]: pam_unix(sshd:session): session closed for user core Sep 4 17:55:24.792533 systemd[1]: Started sshd@18-172.24.4.122:22-172.24.4.1:49614.service - OpenSSH per-connection server daemon (172.24.4.1:49614). Sep 4 17:55:24.794744 systemd[1]: sshd@17-172.24.4.122:22-172.24.4.1:47306.service: Deactivated successfully. Sep 4 17:55:24.800092 systemd[1]: session-20.scope: Deactivated successfully. Sep 4 17:55:24.806193 systemd-logind[1429]: Session 20 logged out. Waiting for processes to exit. Sep 4 17:55:24.814027 systemd-logind[1429]: Removed session 20. Sep 4 17:55:26.174733 sshd[5309]: Accepted publickey for core from 172.24.4.1 port 49614 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:55:26.178461 sshd[5309]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:55:26.190907 systemd-logind[1429]: New session 21 of user core. Sep 4 17:55:26.197127 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 4 17:55:28.519900 sshd[5309]: pam_unix(sshd:session): session closed for user core Sep 4 17:55:28.547724 systemd[1]: sshd@18-172.24.4.122:22-172.24.4.1:49614.service: Deactivated successfully. Sep 4 17:55:28.555075 systemd[1]: session-21.scope: Deactivated successfully. Sep 4 17:55:28.556431 systemd[1]: session-21.scope: Consumed 1.008s CPU time. Sep 4 17:55:28.558985 systemd-logind[1429]: Session 21 logged out. Waiting for processes to exit. Sep 4 17:55:28.570388 systemd[1]: Started sshd@19-172.24.4.122:22-172.24.4.1:49622.service - OpenSSH per-connection server daemon (172.24.4.1:49622). Sep 4 17:55:28.572009 systemd-logind[1429]: Removed session 21. 
Sep 4 17:55:30.083081 sshd[5356]: Accepted publickey for core from 172.24.4.1 port 49622 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:55:30.087347 sshd[5356]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:55:30.098004 systemd-logind[1429]: New session 22 of user core. Sep 4 17:55:30.105996 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 4 17:55:32.822254 sshd[5356]: pam_unix(sshd:session): session closed for user core Sep 4 17:55:32.832333 systemd[1]: Started sshd@20-172.24.4.122:22-172.24.4.1:49626.service - OpenSSH per-connection server daemon (172.24.4.1:49626). Sep 4 17:55:32.833443 systemd[1]: sshd@19-172.24.4.122:22-172.24.4.1:49622.service: Deactivated successfully. Sep 4 17:55:32.838380 systemd[1]: session-22.scope: Deactivated successfully. Sep 4 17:55:32.842501 systemd-logind[1429]: Session 22 logged out. Waiting for processes to exit. Sep 4 17:55:32.845148 systemd-logind[1429]: Removed session 22. Sep 4 17:55:34.090508 sshd[5385]: Accepted publickey for core from 172.24.4.1 port 49626 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:55:34.093439 sshd[5385]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:55:34.103989 systemd-logind[1429]: New session 23 of user core. Sep 4 17:55:34.114106 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 4 17:55:34.868312 sshd[5385]: pam_unix(sshd:session): session closed for user core Sep 4 17:55:34.873146 systemd[1]: sshd@20-172.24.4.122:22-172.24.4.1:49626.service: Deactivated successfully. Sep 4 17:55:34.876402 systemd[1]: session-23.scope: Deactivated successfully. Sep 4 17:55:34.880275 systemd-logind[1429]: Session 23 logged out. Waiting for processes to exit. Sep 4 17:55:34.881958 systemd-logind[1429]: Removed session 23. 
Sep 4 17:55:39.898412 systemd[1]: Started sshd@21-172.24.4.122:22-172.24.4.1:53384.service - OpenSSH per-connection server daemon (172.24.4.1:53384). Sep 4 17:55:41.256213 sshd[5416]: Accepted publickey for core from 172.24.4.1 port 53384 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:55:41.259338 sshd[5416]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:55:41.269687 systemd-logind[1429]: New session 24 of user core. Sep 4 17:55:41.278165 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 4 17:55:42.036998 sshd[5416]: pam_unix(sshd:session): session closed for user core Sep 4 17:55:42.044784 systemd-logind[1429]: Session 24 logged out. Waiting for processes to exit. Sep 4 17:55:42.045491 systemd[1]: sshd@21-172.24.4.122:22-172.24.4.1:53384.service: Deactivated successfully. Sep 4 17:55:42.049151 systemd[1]: session-24.scope: Deactivated successfully. Sep 4 17:55:42.054412 systemd-logind[1429]: Removed session 24. Sep 4 17:55:47.053207 systemd[1]: Started sshd@22-172.24.4.122:22-172.24.4.1:40694.service - OpenSSH per-connection server daemon (172.24.4.1:40694). Sep 4 17:55:48.637503 sshd[5448]: Accepted publickey for core from 172.24.4.1 port 40694 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:55:48.639837 sshd[5448]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:55:48.649354 systemd-logind[1429]: New session 25 of user core. Sep 4 17:55:48.653945 systemd[1]: Started session-25.scope - Session 25 of User core. Sep 4 17:55:49.347647 sshd[5448]: pam_unix(sshd:session): session closed for user core Sep 4 17:55:49.356411 systemd[1]: sshd@22-172.24.4.122:22-172.24.4.1:40694.service: Deactivated successfully. Sep 4 17:55:49.362114 systemd[1]: session-25.scope: Deactivated successfully. Sep 4 17:55:49.364226 systemd-logind[1429]: Session 25 logged out. Waiting for processes to exit. 
Sep 4 17:55:49.367348 systemd-logind[1429]: Removed session 25. Sep 4 17:55:54.373512 systemd[1]: Started sshd@23-172.24.4.122:22-172.24.4.1:40702.service - OpenSSH per-connection server daemon (172.24.4.1:40702). Sep 4 17:55:55.779714 sshd[5479]: Accepted publickey for core from 172.24.4.1 port 40702 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:55:55.783273 sshd[5479]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:55:55.795869 systemd-logind[1429]: New session 26 of user core. Sep 4 17:55:55.802158 systemd[1]: Started session-26.scope - Session 26 of User core. Sep 4 17:55:56.584301 sshd[5479]: pam_unix(sshd:session): session closed for user core Sep 4 17:55:56.589753 systemd[1]: sshd@23-172.24.4.122:22-172.24.4.1:40702.service: Deactivated successfully. Sep 4 17:55:56.591667 systemd[1]: session-26.scope: Deactivated successfully. Sep 4 17:55:56.595620 systemd-logind[1429]: Session 26 logged out. Waiting for processes to exit. Sep 4 17:55:56.600414 systemd-logind[1429]: Removed session 26. Sep 4 17:56:01.612492 systemd[1]: Started sshd@24-172.24.4.122:22-172.24.4.1:36240.service - OpenSSH per-connection server daemon (172.24.4.1:36240). Sep 4 17:56:03.210682 sshd[5531]: Accepted publickey for core from 172.24.4.1 port 36240 ssh2: RSA SHA256:JnA7Fh8lVkr6ENifNOXj431OPLJBOL+/PI8dMas4Eok Sep 4 17:56:03.213220 sshd[5531]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:56:03.225160 systemd-logind[1429]: New session 27 of user core. Sep 4 17:56:03.235073 systemd[1]: Started session-27.scope - Session 27 of User core. Sep 4 17:56:04.061335 sshd[5531]: pam_unix(sshd:session): session closed for user core Sep 4 17:56:04.067100 systemd[1]: sshd@24-172.24.4.122:22-172.24.4.1:36240.service: Deactivated successfully. Sep 4 17:56:04.072458 systemd[1]: session-27.scope: Deactivated successfully. Sep 4 17:56:04.076287 systemd-logind[1429]: Session 27 logged out. 
Waiting for processes to exit. Sep 4 17:56:04.078062 systemd-logind[1429]: Removed session 27.