Mar 20 22:16:37.072688 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Mar 20 19:36:47 -00 2025
Mar 20 22:16:37.072722 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=619bfa043b53ac975036e415994a80721794ae8277072d0a93c174b4f7768019
Mar 20 22:16:37.072734 kernel: BIOS-provided physical RAM map:
Mar 20 22:16:37.072743 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Mar 20 22:16:37.072751 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Mar 20 22:16:37.072777 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Mar 20 22:16:37.072788 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdcfff] usable
Mar 20 22:16:37.072796 kernel: BIOS-e820: [mem 0x00000000bffdd000-0x00000000bfffffff] reserved
Mar 20 22:16:37.072805 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 20 22:16:37.072813 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Mar 20 22:16:37.072822 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000013fffffff] usable
Mar 20 22:16:37.072830 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 20 22:16:37.072838 kernel: NX (Execute Disable) protection: active
Mar 20 22:16:37.072847 kernel: APIC: Static calls initialized
Mar 20 22:16:37.072860 kernel: SMBIOS 3.0.0 present.
Mar 20 22:16:37.072869 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.16.3-debian-1.16.3-2 04/01/2014
Mar 20 22:16:37.072878 kernel: Hypervisor detected: KVM
Mar 20 22:16:37.072886 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 20 22:16:37.072895 kernel: kvm-clock: using sched offset of 3804647320 cycles
Mar 20 22:16:37.072904 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 20 22:16:37.072916 kernel: tsc: Detected 1996.249 MHz processor
Mar 20 22:16:37.072927 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 20 22:16:37.072940 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 20 22:16:37.072952 kernel: last_pfn = 0x140000 max_arch_pfn = 0x400000000
Mar 20 22:16:37.072964 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Mar 20 22:16:37.072976 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 20 22:16:37.072987 kernel: last_pfn = 0xbffdd max_arch_pfn = 0x400000000
Mar 20 22:16:37.072999 kernel: ACPI: Early table checksum verification disabled
Mar 20 22:16:37.073015 kernel: ACPI: RSDP 0x00000000000F51E0 000014 (v00 BOCHS )
Mar 20 22:16:37.073024 kernel: ACPI: RSDT 0x00000000BFFE1B65 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 22:16:37.073034 kernel: ACPI: FACP 0x00000000BFFE1A49 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 22:16:37.073043 kernel: ACPI: DSDT 0x00000000BFFE0040 001A09 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 22:16:37.073053 kernel: ACPI: FACS 0x00000000BFFE0000 000040
Mar 20 22:16:37.073062 kernel: ACPI: APIC 0x00000000BFFE1ABD 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 22:16:37.073071 kernel: ACPI: WAET 0x00000000BFFE1B3D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 22:16:37.073080 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1a49-0xbffe1abc]
Mar 20 22:16:37.073089 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffe0040-0xbffe1a48]
Mar 20 22:16:37.073100 kernel: ACPI: Reserving FACS table memory at [mem 0xbffe0000-0xbffe003f]
Mar 20 22:16:37.073110 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe1abd-0xbffe1b3c]
Mar 20 22:16:37.073119 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1b3d-0xbffe1b64]
Mar 20 22:16:37.073132 kernel: No NUMA configuration found
Mar 20 22:16:37.073142 kernel: Faking a node at [mem 0x0000000000000000-0x000000013fffffff]
Mar 20 22:16:37.073151 kernel: NODE_DATA(0) allocated [mem 0x13fff7000-0x13fffcfff]
Mar 20 22:16:37.073161 kernel: Zone ranges:
Mar 20 22:16:37.073173 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 20 22:16:37.073182 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Mar 20 22:16:37.073192 kernel: Normal [mem 0x0000000100000000-0x000000013fffffff]
Mar 20 22:16:37.073201 kernel: Movable zone start for each node
Mar 20 22:16:37.073211 kernel: Early memory node ranges
Mar 20 22:16:37.073220 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Mar 20 22:16:37.073230 kernel: node 0: [mem 0x0000000000100000-0x00000000bffdcfff]
Mar 20 22:16:37.073239 kernel: node 0: [mem 0x0000000100000000-0x000000013fffffff]
Mar 20 22:16:37.073251 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000013fffffff]
Mar 20 22:16:37.073261 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 20 22:16:37.073270 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Mar 20 22:16:37.073280 kernel: On node 0, zone Normal: 35 pages in unavailable ranges
Mar 20 22:16:37.073289 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 20 22:16:37.073299 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 20 22:16:37.073309 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 20 22:16:37.073318 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 20 22:16:37.073328 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 20 22:16:37.073339 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 20 22:16:37.073349 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 20 22:16:37.073359 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 20 22:16:37.073368 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 20 22:16:37.073378 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Mar 20 22:16:37.073387 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 20 22:16:37.073397 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Mar 20 22:16:37.073406 kernel: Booting paravirtualized kernel on KVM
Mar 20 22:16:37.073416 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 20 22:16:37.073428 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 20 22:16:37.073437 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576
Mar 20 22:16:37.073445 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152
Mar 20 22:16:37.073454 kernel: pcpu-alloc: [0] 0 1
Mar 20 22:16:37.073462 kernel: kvm-guest: PV spinlocks disabled, no host support
Mar 20 22:16:37.073472 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=619bfa043b53ac975036e415994a80721794ae8277072d0a93c174b4f7768019
Mar 20 22:16:37.073481 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 20 22:16:37.073490 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 20 22:16:37.073500 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 20 22:16:37.073509 kernel: Fallback order for Node 0: 0
Mar 20 22:16:37.073517 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1031901
Mar 20 22:16:37.073526 kernel: Policy zone: Normal
Mar 20 22:16:37.073534 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 20 22:16:37.073543 kernel: software IO TLB: area num 2.
Mar 20 22:16:37.073552 kernel: Memory: 3962108K/4193772K available (14336K kernel code, 2304K rwdata, 25060K rodata, 43592K init, 1472K bss, 231404K reserved, 0K cma-reserved)
Mar 20 22:16:37.073560 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 20 22:16:37.073569 kernel: ftrace: allocating 37985 entries in 149 pages
Mar 20 22:16:37.073579 kernel: ftrace: allocated 149 pages with 4 groups
Mar 20 22:16:37.073587 kernel: Dynamic Preempt: voluntary
Mar 20 22:16:37.073596 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 20 22:16:37.073605 kernel: rcu: RCU event tracing is enabled.
Mar 20 22:16:37.073614 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 20 22:16:37.073623 kernel: Trampoline variant of Tasks RCU enabled.
Mar 20 22:16:37.073632 kernel: Rude variant of Tasks RCU enabled.
Mar 20 22:16:37.073640 kernel: Tracing variant of Tasks RCU enabled.
Mar 20 22:16:37.073648 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 20 22:16:37.073659 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 20 22:16:37.073667 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Mar 20 22:16:37.073676 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 20 22:16:37.073684 kernel: Console: colour VGA+ 80x25
Mar 20 22:16:37.073692 kernel: printk: console [tty0] enabled
Mar 20 22:16:37.073701 kernel: printk: console [ttyS0] enabled
Mar 20 22:16:37.073709 kernel: ACPI: Core revision 20230628
Mar 20 22:16:37.073718 kernel: APIC: Switch to symmetric I/O mode setup
Mar 20 22:16:37.073726 kernel: x2apic enabled
Mar 20 22:16:37.073737 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 20 22:16:37.073745 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 20 22:16:37.073754 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Mar 20 22:16:37.075815 kernel: Calibrating delay loop (skipped) preset value.. 3992.49 BogoMIPS (lpj=1996249)
Mar 20 22:16:37.075826 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Mar 20 22:16:37.075835 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Mar 20 22:16:37.075844 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 20 22:16:37.075852 kernel: Spectre V2 : Mitigation: Retpolines
Mar 20 22:16:37.075863 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Mar 20 22:16:37.075876 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Mar 20 22:16:37.075886 kernel: Speculative Store Bypass: Vulnerable
Mar 20 22:16:37.075895 kernel: x86/fpu: x87 FPU will use FXSAVE
Mar 20 22:16:37.075904 kernel: Freeing SMP alternatives memory: 32K
Mar 20 22:16:37.075920 kernel: pid_max: default: 32768 minimum: 301
Mar 20 22:16:37.075932 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 20 22:16:37.075942 kernel: landlock: Up and running.
Mar 20 22:16:37.075952 kernel: SELinux: Initializing.
Mar 20 22:16:37.075961 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 20 22:16:37.075971 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 20 22:16:37.075981 kernel: smpboot: CPU0: AMD Intel Core i7 9xx (Nehalem Class Core i7) (family: 0x6, model: 0x1a, stepping: 0x3)
Mar 20 22:16:37.075991 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 20 22:16:37.076003 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 20 22:16:37.076013 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 20 22:16:37.076023 kernel: Performance Events: AMD PMU driver.
Mar 20 22:16:37.076032 kernel: ... version: 0
Mar 20 22:16:37.076042 kernel: ... bit width: 48
Mar 20 22:16:37.076054 kernel: ... generic registers: 4
Mar 20 22:16:37.076064 kernel: ... value mask: 0000ffffffffffff
Mar 20 22:16:37.076073 kernel: ... max period: 00007fffffffffff
Mar 20 22:16:37.076083 kernel: ... fixed-purpose events: 0
Mar 20 22:16:37.076092 kernel: ... event mask: 000000000000000f
Mar 20 22:16:37.076102 kernel: signal: max sigframe size: 1440
Mar 20 22:16:37.076111 kernel: rcu: Hierarchical SRCU implementation.
Mar 20 22:16:37.076121 kernel: rcu: Max phase no-delay instances is 400.
Mar 20 22:16:37.076131 kernel: smp: Bringing up secondary CPUs ...
Mar 20 22:16:37.076142 kernel: smpboot: x86: Booting SMP configuration:
Mar 20 22:16:37.076152 kernel: .... node #0, CPUs: #1
Mar 20 22:16:37.076162 kernel: smp: Brought up 1 node, 2 CPUs
Mar 20 22:16:37.076171 kernel: smpboot: Max logical packages: 2
Mar 20 22:16:37.076181 kernel: smpboot: Total of 2 processors activated (7984.99 BogoMIPS)
Mar 20 22:16:37.076191 kernel: devtmpfs: initialized
Mar 20 22:16:37.076201 kernel: x86/mm: Memory block size: 128MB
Mar 20 22:16:37.076210 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 20 22:16:37.076220 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 20 22:16:37.076231 kernel: pinctrl core: initialized pinctrl subsystem
Mar 20 22:16:37.076241 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 20 22:16:37.076251 kernel: audit: initializing netlink subsys (disabled)
Mar 20 22:16:37.076260 kernel: audit: type=2000 audit(1742508996.437:1): state=initialized audit_enabled=0 res=1
Mar 20 22:16:37.076270 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 20 22:16:37.076280 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 20 22:16:37.076289 kernel: cpuidle: using governor menu
Mar 20 22:16:37.076299 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 20 22:16:37.076309 kernel: dca service started, version 1.12.1
Mar 20 22:16:37.076320 kernel: PCI: Using configuration type 1 for base access
Mar 20 22:16:37.076330 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 20 22:16:37.076340 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 20 22:16:37.076350 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 20 22:16:37.076359 kernel: ACPI: Added _OSI(Module Device)
Mar 20 22:16:37.076369 kernel: ACPI: Added _OSI(Processor Device)
Mar 20 22:16:37.076378 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 20 22:16:37.076388 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 20 22:16:37.076398 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 20 22:16:37.076409 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 20 22:16:37.076419 kernel: ACPI: Interpreter enabled
Mar 20 22:16:37.076428 kernel: ACPI: PM: (supports S0 S3 S5)
Mar 20 22:16:37.076438 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 20 22:16:37.076447 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 20 22:16:37.076457 kernel: PCI: Using E820 reservations for host bridge windows
Mar 20 22:16:37.076467 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Mar 20 22:16:37.076476 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 20 22:16:37.076627 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
Mar 20 22:16:37.076738 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
Mar 20 22:16:37.079714 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
Mar 20 22:16:37.079735 kernel: acpiphp: Slot [3] registered
Mar 20 22:16:37.079745 kernel: acpiphp: Slot [4] registered
Mar 20 22:16:37.079754 kernel: acpiphp: Slot [5] registered
Mar 20 22:16:37.079780 kernel: acpiphp: Slot [6] registered
Mar 20 22:16:37.079790 kernel: acpiphp: Slot [7] registered
Mar 20 22:16:37.079800 kernel: acpiphp: Slot [8] registered
Mar 20 22:16:37.079815 kernel: acpiphp: Slot [9] registered
Mar 20 22:16:37.079825 kernel: acpiphp: Slot [10] registered
Mar 20 22:16:37.079834 kernel: acpiphp: Slot [11] registered
Mar 20 22:16:37.079844 kernel: acpiphp: Slot [12] registered
Mar 20 22:16:37.079853 kernel: acpiphp: Slot [13] registered
Mar 20 22:16:37.079863 kernel: acpiphp: Slot [14] registered
Mar 20 22:16:37.079873 kernel: acpiphp: Slot [15] registered
Mar 20 22:16:37.079882 kernel: acpiphp: Slot [16] registered
Mar 20 22:16:37.079892 kernel: acpiphp: Slot [17] registered
Mar 20 22:16:37.079903 kernel: acpiphp: Slot [18] registered
Mar 20 22:16:37.079912 kernel: acpiphp: Slot [19] registered
Mar 20 22:16:37.079922 kernel: acpiphp: Slot [20] registered
Mar 20 22:16:37.079931 kernel: acpiphp: Slot [21] registered
Mar 20 22:16:37.079941 kernel: acpiphp: Slot [22] registered
Mar 20 22:16:37.079950 kernel: acpiphp: Slot [23] registered
Mar 20 22:16:37.079960 kernel: acpiphp: Slot [24] registered
Mar 20 22:16:37.079970 kernel: acpiphp: Slot [25] registered
Mar 20 22:16:37.079979 kernel: acpiphp: Slot [26] registered
Mar 20 22:16:37.079989 kernel: acpiphp: Slot [27] registered
Mar 20 22:16:37.080000 kernel: acpiphp: Slot [28] registered
Mar 20 22:16:37.080010 kernel: acpiphp: Slot [29] registered
Mar 20 22:16:37.080019 kernel: acpiphp: Slot [30] registered
Mar 20 22:16:37.080030 kernel: acpiphp: Slot [31] registered
Mar 20 22:16:37.080041 kernel: PCI host bridge to bus 0000:00
Mar 20 22:16:37.080142 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 20 22:16:37.080229 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 20 22:16:37.080314 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 20 22:16:37.080404 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Mar 20 22:16:37.080513 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc07fffffff window]
Mar 20 22:16:37.080597 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 20 22:16:37.080714 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Mar 20 22:16:37.080923 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Mar 20 22:16:37.081034 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Mar 20 22:16:37.081141 kernel: pci 0000:00:01.1: reg 0x20: [io 0xc120-0xc12f]
Mar 20 22:16:37.081241 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Mar 20 22:16:37.081341 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Mar 20 22:16:37.081440 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Mar 20 22:16:37.081539 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Mar 20 22:16:37.081645 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Mar 20 22:16:37.081747 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Mar 20 22:16:37.082121 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Mar 20 22:16:37.082229 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Mar 20 22:16:37.082325 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Mar 20 22:16:37.082420 kernel: pci 0000:00:02.0: reg 0x18: [mem 0xc000000000-0xc000003fff 64bit pref]
Mar 20 22:16:37.082514 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Mar 20 22:16:37.082608 kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Mar 20 22:16:37.082709 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 20 22:16:37.083865 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Mar 20 22:16:37.083969 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf]
Mar 20 22:16:37.084062 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Mar 20 22:16:37.084155 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xc000004000-0xc000007fff 64bit pref]
Mar 20 22:16:37.084249 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Mar 20 22:16:37.084349 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Mar 20 22:16:37.084450 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Mar 20 22:16:37.084544 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Mar 20 22:16:37.084639 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xc000008000-0xc00000bfff 64bit pref]
Mar 20 22:16:37.084741 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Mar 20 22:16:37.085878 kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff]
Mar 20 22:16:37.085977 kernel: pci 0000:00:05.0: reg 0x20: [mem 0xc00000c000-0xc00000ffff 64bit pref]
Mar 20 22:16:37.086082 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Mar 20 22:16:37.086191 kernel: pci 0000:00:06.0: reg 0x10: [io 0xc100-0xc11f]
Mar 20 22:16:37.086291 kernel: pci 0000:00:06.0: reg 0x14: [mem 0xfeb93000-0xfeb93fff]
Mar 20 22:16:37.086390 kernel: pci 0000:00:06.0: reg 0x20: [mem 0xc000010000-0xc000013fff 64bit pref]
Mar 20 22:16:37.086405 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 20 22:16:37.086415 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 20 22:16:37.086425 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 20 22:16:37.086435 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 20 22:16:37.086445 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Mar 20 22:16:37.086458 kernel: iommu: Default domain type: Translated
Mar 20 22:16:37.086468 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 20 22:16:37.086478 kernel: PCI: Using ACPI for IRQ routing
Mar 20 22:16:37.086488 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 20 22:16:37.086498 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Mar 20 22:16:37.086507 kernel: e820: reserve RAM buffer [mem 0xbffdd000-0xbfffffff]
Mar 20 22:16:37.086606 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Mar 20 22:16:37.086706 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Mar 20 22:16:37.087684 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 20 22:16:37.087704 kernel: vgaarb: loaded
Mar 20 22:16:37.087714 kernel: clocksource: Switched to clocksource kvm-clock
Mar 20 22:16:37.087724 kernel: VFS: Disk quotas dquot_6.6.0
Mar 20 22:16:37.087733 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 20 22:16:37.087743 kernel: pnp: PnP ACPI init
Mar 20 22:16:37.087878 kernel: pnp 00:03: [dma 2]
Mar 20 22:16:37.087894 kernel: pnp: PnP ACPI: found 5 devices
Mar 20 22:16:37.087904 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 20 22:16:37.087917 kernel: NET: Registered PF_INET protocol family
Mar 20 22:16:37.087926 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 20 22:16:37.087936 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 20 22:16:37.087945 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 20 22:16:37.087954 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 20 22:16:37.087964 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 20 22:16:37.087973 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 20 22:16:37.087982 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 20 22:16:37.087991 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 20 22:16:37.088002 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 20 22:16:37.088012 kernel: NET: Registered PF_XDP protocol family
Mar 20 22:16:37.088099 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 20 22:16:37.088183 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 20 22:16:37.088263 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 20 22:16:37.088343 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Mar 20 22:16:37.088425 kernel: pci_bus 0000:00: resource 8 [mem 0xc000000000-0xc07fffffff window]
Mar 20 22:16:37.088521 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Mar 20 22:16:37.088622 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Mar 20 22:16:37.088636 kernel: PCI: CLS 0 bytes, default 64
Mar 20 22:16:37.088646 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Mar 20 22:16:37.088655 kernel: software IO TLB: mapped [mem 0x00000000bbfdd000-0x00000000bffdd000] (64MB)
Mar 20 22:16:37.088664 kernel: Initialise system trusted keyrings
Mar 20 22:16:37.088673 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 20 22:16:37.088682 kernel: Key type asymmetric registered
Mar 20 22:16:37.088691 kernel: Asymmetric key parser 'x509' registered
Mar 20 22:16:37.088700 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Mar 20 22:16:37.088713 kernel: io scheduler mq-deadline registered
Mar 20 22:16:37.088722 kernel: io scheduler kyber registered
Mar 20 22:16:37.088731 kernel: io scheduler bfq registered
Mar 20 22:16:37.088740 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Mar 20 22:16:37.088750 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Mar 20 22:16:37.089529 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Mar 20 22:16:37.089543 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Mar 20 22:16:37.089553 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Mar 20 22:16:37.089563 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 20 22:16:37.089577 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Mar 20 22:16:37.089586 kernel: random: crng init done
Mar 20 22:16:37.089596 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Mar 20 22:16:37.089606 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Mar 20 22:16:37.089616 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Mar 20 22:16:37.089722 kernel: rtc_cmos 00:04: RTC can wake from S4
Mar 20 22:16:37.089744 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Mar 20 22:16:37.089869 kernel: rtc_cmos 00:04: registered as rtc0
Mar 20 22:16:37.089966 kernel: rtc_cmos 00:04: setting system clock to 2025-03-20T22:16:36 UTC (1742508996)
Mar 20 22:16:37.090054 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Mar 20 22:16:37.090069 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Mar 20 22:16:37.090080 kernel: NET: Registered PF_INET6 protocol family
Mar 20 22:16:37.090089 kernel: Segment Routing with IPv6
Mar 20 22:16:37.090099 kernel: In-situ OAM (IOAM) with IPv6
Mar 20 22:16:37.090109 kernel: NET: Registered PF_PACKET protocol family
Mar 20 22:16:37.090118 kernel: Key type dns_resolver registered
Mar 20 22:16:37.090128 kernel: IPI shorthand broadcast: enabled
Mar 20 22:16:37.090141 kernel: sched_clock: Marking stable (992028502, 167668681)->(1198771185, -39074002)
Mar 20 22:16:37.090151 kernel: registered taskstats version 1
Mar 20 22:16:37.090161 kernel: Loading compiled-in X.509 certificates
Mar 20 22:16:37.090171 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: 9e7923b67df1c6f0613bc4380f7ea8de9ce851ac'
Mar 20 22:16:37.090180 kernel: Key type .fscrypt registered
Mar 20 22:16:37.090190 kernel: Key type fscrypt-provisioning registered
Mar 20 22:16:37.090200 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 20 22:16:37.090209 kernel: ima: Allocated hash algorithm: sha1
Mar 20 22:16:37.090221 kernel: ima: No architecture policies found
Mar 20 22:16:37.090230 kernel: clk: Disabling unused clocks
Mar 20 22:16:37.090240 kernel: Freeing unused kernel image (initmem) memory: 43592K
Mar 20 22:16:37.090250 kernel: Write protecting the kernel read-only data: 40960k
Mar 20 22:16:37.090260 kernel: Freeing unused kernel image (rodata/data gap) memory: 1564K
Mar 20 22:16:37.090270 kernel: Run /init as init process
Mar 20 22:16:37.090279 kernel: with arguments:
Mar 20 22:16:37.090289 kernel: /init
Mar 20 22:16:37.090298 kernel: with environment:
Mar 20 22:16:37.090308 kernel: HOME=/
Mar 20 22:16:37.090319 kernel: TERM=linux
Mar 20 22:16:37.090329 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Mar 20 22:16:37.090340 systemd[1]: Successfully made /usr/ read-only.
Mar 20 22:16:37.090354 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 20 22:16:37.090366 systemd[1]: Detected virtualization kvm.
Mar 20 22:16:37.090376 systemd[1]: Detected architecture x86-64.
Mar 20 22:16:37.090386 systemd[1]: Running in initrd.
Mar 20 22:16:37.090398 systemd[1]: No hostname configured, using default hostname.
Mar 20 22:16:37.090409 systemd[1]: Hostname set to .
Mar 20 22:16:37.090419 systemd[1]: Initializing machine ID from VM UUID.
Mar 20 22:16:37.090430 systemd[1]: Queued start job for default target initrd.target.
Mar 20 22:16:37.090440 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 20 22:16:37.090451 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 20 22:16:37.090471 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 20 22:16:37.090483 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 20 22:16:37.090496 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 20 22:16:37.090507 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 20 22:16:37.090519 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 20 22:16:37.090531 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 20 22:16:37.090543 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 20 22:16:37.090554 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 20 22:16:37.090565 systemd[1]: Reached target paths.target - Path Units.
Mar 20 22:16:37.090575 systemd[1]: Reached target slices.target - Slice Units.
Mar 20 22:16:37.090586 systemd[1]: Reached target swap.target - Swaps.
Mar 20 22:16:37.090597 systemd[1]: Reached target timers.target - Timer Units.
Mar 20 22:16:37.090607 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 20 22:16:37.090618 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 20 22:16:37.090629 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 20 22:16:37.090642 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 20 22:16:37.090652 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 20 22:16:37.090663 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 20 22:16:37.090674 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 20 22:16:37.090685 systemd[1]: Reached target sockets.target - Socket Units.
Mar 20 22:16:37.090695 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 20 22:16:37.090706 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 20 22:16:37.090717 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 20 22:16:37.090729 systemd[1]: Starting systemd-fsck-usr.service...
Mar 20 22:16:37.090740 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 20 22:16:37.090751 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 20 22:16:37.090806 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 20 22:16:37.090818 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 20 22:16:37.090829 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 20 22:16:37.090867 systemd-journald[183]: Collecting audit messages is disabled.
Mar 20 22:16:37.090895 systemd[1]: Finished systemd-fsck-usr.service.
Mar 20 22:16:37.090908 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 20 22:16:37.090920 systemd-journald[183]: Journal started
Mar 20 22:16:37.090948 systemd-journald[183]: Runtime Journal (/run/log/journal/e0e46e0b46814b22bfeef3f66e61be36) is 8M, max 78.2M, 70.2M free.
Mar 20 22:16:37.091980 systemd-modules-load[185]: Inserted module 'overlay'
Mar 20 22:16:37.096790 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 20 22:16:37.100905 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 20 22:16:37.148117 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 20 22:16:37.148147 kernel: Bridge firewalling registered Mar 20 22:16:37.126783 systemd-modules-load[185]: Inserted module 'br_netfilter' Mar 20 22:16:37.157282 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 20 22:16:37.158589 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 20 22:16:37.159313 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 20 22:16:37.161912 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 20 22:16:37.166139 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 20 22:16:37.169871 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 20 22:16:37.174384 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 20 22:16:37.183059 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 20 22:16:37.184349 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 20 22:16:37.187932 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 20 22:16:37.197947 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 20 22:16:37.199885 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Mar 20 22:16:37.221754 dracut-cmdline[221]: dracut-dracut-053 Mar 20 22:16:37.224078 dracut-cmdline[221]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=619bfa043b53ac975036e415994a80721794ae8277072d0a93c174b4f7768019 Mar 20 22:16:37.235252 systemd-resolved[215]: Positive Trust Anchors: Mar 20 22:16:37.235270 systemd-resolved[215]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 20 22:16:37.235314 systemd-resolved[215]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 20 22:16:37.238594 systemd-resolved[215]: Defaulting to hostname 'linux'. Mar 20 22:16:37.239969 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 20 22:16:37.241295 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 20 22:16:37.306793 kernel: SCSI subsystem initialized Mar 20 22:16:37.323866 kernel: Loading iSCSI transport class v2.0-870. 
Mar 20 22:16:37.340836 kernel: iscsi: registered transport (tcp) Mar 20 22:16:37.367160 kernel: iscsi: registered transport (qla4xxx) Mar 20 22:16:37.367302 kernel: QLogic iSCSI HBA Driver Mar 20 22:16:37.445749 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 20 22:16:37.449704 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 20 22:16:37.529267 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 20 22:16:37.529411 kernel: device-mapper: uevent: version 1.0.3 Mar 20 22:16:37.532820 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 20 22:16:37.606886 kernel: raid6: sse2x4 gen() 5213 MB/s Mar 20 22:16:37.625873 kernel: raid6: sse2x2 gen() 5900 MB/s Mar 20 22:16:37.644253 kernel: raid6: sse2x1 gen() 9002 MB/s Mar 20 22:16:37.644322 kernel: raid6: using algorithm sse2x1 gen() 9002 MB/s Mar 20 22:16:37.663245 kernel: raid6: .... xor() 7288 MB/s, rmw enabled Mar 20 22:16:37.663327 kernel: raid6: using ssse3x2 recovery algorithm Mar 20 22:16:37.686333 kernel: xor: measuring software checksum speed Mar 20 22:16:37.686398 kernel: prefetch64-sse : 18471 MB/sec Mar 20 22:16:37.687590 kernel: generic_sse : 16813 MB/sec Mar 20 22:16:37.687637 kernel: xor: using function: prefetch64-sse (18471 MB/sec) Mar 20 22:16:37.870272 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 20 22:16:37.886689 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 20 22:16:37.890783 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 20 22:16:37.942295 systemd-udevd[403]: Using default interface naming scheme 'v255'. Mar 20 22:16:37.954364 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 20 22:16:37.964085 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Mar 20 22:16:38.003471 dracut-pre-trigger[411]: rd.md=0: removing MD RAID activation Mar 20 22:16:38.056238 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 20 22:16:38.060581 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 20 22:16:38.148061 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 20 22:16:38.153160 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 20 22:16:38.193992 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 20 22:16:38.198971 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 20 22:16:38.200546 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 20 22:16:38.203671 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 20 22:16:38.209056 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 20 22:16:38.241938 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 20 22:16:38.254891 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues Mar 20 22:16:38.293410 kernel: virtio_blk virtio2: [vda] 20971520 512-byte logical blocks (10.7 GB/10.0 GiB) Mar 20 22:16:38.294137 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 20 22:16:38.294153 kernel: GPT:17805311 != 20971519 Mar 20 22:16:38.294165 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 20 22:16:38.294177 kernel: GPT:17805311 != 20971519 Mar 20 22:16:38.294199 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 20 22:16:38.294211 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 20 22:16:38.294785 kernel: libata version 3.00 loaded. 
Mar 20 22:16:38.300790 kernel: ata_piix 0000:00:01.1: version 2.13 Mar 20 22:16:38.327672 kernel: scsi host0: ata_piix Mar 20 22:16:38.327853 kernel: scsi host1: ata_piix Mar 20 22:16:38.327965 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc120 irq 14 Mar 20 22:16:38.327979 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc128 irq 15 Mar 20 22:16:38.321793 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 20 22:16:38.322012 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 20 22:16:38.323819 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 20 22:16:38.326192 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 20 22:16:38.326389 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 20 22:16:38.354919 kernel: BTRFS: device fsid 48a514e8-9ecc-46c2-935b-caca347f921e devid 1 transid 39 /dev/vda3 scanned by (udev-worker) (471) Mar 20 22:16:38.354951 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by (udev-worker) (461) Mar 20 22:16:38.327589 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 20 22:16:38.331480 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 20 22:16:38.351910 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 20 22:16:38.381039 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Mar 20 22:16:38.419675 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Mar 20 22:16:38.421825 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 20 22:16:38.442231 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. 
Mar 20 22:16:38.453095 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Mar 20 22:16:38.470929 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 20 22:16:38.473898 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 20 22:16:38.477322 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 20 22:16:38.506808 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 20 22:16:38.516604 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 20 22:16:38.516652 disk-uuid[503]: Primary Header is updated. Mar 20 22:16:38.516652 disk-uuid[503]: Secondary Entries is updated. Mar 20 22:16:38.516652 disk-uuid[503]: Secondary Header is updated. Mar 20 22:16:39.529829 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 20 22:16:39.531888 disk-uuid[512]: The operation has completed successfully. Mar 20 22:16:39.625839 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 20 22:16:39.625930 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 20 22:16:39.663855 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 20 22:16:39.678839 sh[524]: Success Mar 20 22:16:39.701839 kernel: device-mapper: verity: sha256 using implementation "sha256-ssse3" Mar 20 22:16:39.788020 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 20 22:16:39.789941 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 20 22:16:39.794328 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... 
Mar 20 22:16:39.821248 kernel: BTRFS info (device dm-0): first mount of filesystem 48a514e8-9ecc-46c2-935b-caca347f921e Mar 20 22:16:39.821315 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Mar 20 22:16:39.823174 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 20 22:16:39.825111 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 20 22:16:39.827605 kernel: BTRFS info (device dm-0): using free space tree Mar 20 22:16:39.842243 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 20 22:16:39.844450 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 20 22:16:39.847597 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 20 22:16:39.851996 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 20 22:16:39.896027 kernel: BTRFS info (device vda6): first mount of filesystem c415ef49-5595-4a0b-ba48-8f3e642f303e Mar 20 22:16:39.896110 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 20 22:16:39.902334 kernel: BTRFS info (device vda6): using free space tree Mar 20 22:16:39.910865 kernel: BTRFS info (device vda6): auto enabling async discard Mar 20 22:16:39.917603 kernel: BTRFS info (device vda6): last unmount of filesystem c415ef49-5595-4a0b-ba48-8f3e642f303e Mar 20 22:16:39.925293 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 20 22:16:39.929640 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 20 22:16:39.999862 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 20 22:16:40.002282 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Mar 20 22:16:40.047044 systemd-networkd[703]: lo: Link UP Mar 20 22:16:40.047851 systemd-networkd[703]: lo: Gained carrier Mar 20 22:16:40.049689 systemd-networkd[703]: Enumeration completed Mar 20 22:16:40.049796 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 20 22:16:40.050376 systemd[1]: Reached target network.target - Network. Mar 20 22:16:40.052878 systemd-networkd[703]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 20 22:16:40.052882 systemd-networkd[703]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 20 22:16:40.055716 systemd-networkd[703]: eth0: Link UP Mar 20 22:16:40.055719 systemd-networkd[703]: eth0: Gained carrier Mar 20 22:16:40.055727 systemd-networkd[703]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 20 22:16:40.072813 systemd-networkd[703]: eth0: DHCPv4 address 172.24.4.53/24, gateway 172.24.4.1 acquired from 172.24.4.1 Mar 20 22:16:40.079818 ignition[636]: Ignition 2.20.0 Mar 20 22:16:40.080380 ignition[636]: Stage: fetch-offline Mar 20 22:16:40.080438 ignition[636]: no configs at "/usr/lib/ignition/base.d" Mar 20 22:16:40.080449 ignition[636]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 20 22:16:40.082033 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 20 22:16:40.080559 ignition[636]: parsed url from cmdline: "" Mar 20 22:16:40.080564 ignition[636]: no config URL provided Mar 20 22:16:40.080570 ignition[636]: reading system config file "/usr/lib/ignition/user.ign" Mar 20 22:16:40.080578 ignition[636]: no config at "/usr/lib/ignition/user.ign" Mar 20 22:16:40.084881 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Mar 20 22:16:40.080585 ignition[636]: failed to fetch config: resource requires networking Mar 20 22:16:40.080810 ignition[636]: Ignition finished successfully Mar 20 22:16:40.106217 ignition[714]: Ignition 2.20.0 Mar 20 22:16:40.106229 ignition[714]: Stage: fetch Mar 20 22:16:40.106418 ignition[714]: no configs at "/usr/lib/ignition/base.d" Mar 20 22:16:40.106429 ignition[714]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 20 22:16:40.106557 ignition[714]: parsed url from cmdline: "" Mar 20 22:16:40.106561 ignition[714]: no config URL provided Mar 20 22:16:40.106568 ignition[714]: reading system config file "/usr/lib/ignition/user.ign" Mar 20 22:16:40.106578 ignition[714]: no config at "/usr/lib/ignition/user.ign" Mar 20 22:16:40.106684 ignition[714]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Mar 20 22:16:40.106709 ignition[714]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Mar 20 22:16:40.106739 ignition[714]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Mar 20 22:16:40.351037 systemd-resolved[215]: Detected conflict on linux IN A 172.24.4.53 Mar 20 22:16:40.351076 systemd-resolved[215]: Hostname conflict, changing published hostname from 'linux' to 'linux3'. Mar 20 22:16:40.371387 ignition[714]: GET result: OK Mar 20 22:16:40.371616 ignition[714]: parsing config with SHA512: a36bf6f58da8c09fc2e76e955b6f298c0b9fc58adfad15f452018b710f7699d00c514f2dcb76a89dd002f4389b58d4b42012c0066acfce4a4ed3c79baef889f6 Mar 20 22:16:40.383040 unknown[714]: fetched base config from "system" Mar 20 22:16:40.383109 unknown[714]: fetched base config from "system" Mar 20 22:16:40.384103 ignition[714]: fetch: fetch complete Mar 20 22:16:40.383125 unknown[714]: fetched user config from "openstack" Mar 20 22:16:40.384116 ignition[714]: fetch: fetch passed Mar 20 22:16:40.390252 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). 
Mar 20 22:16:40.384203 ignition[714]: Ignition finished successfully Mar 20 22:16:40.395070 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 20 22:16:40.440579 ignition[720]: Ignition 2.20.0 Mar 20 22:16:40.440604 ignition[720]: Stage: kargs Mar 20 22:16:40.441049 ignition[720]: no configs at "/usr/lib/ignition/base.d" Mar 20 22:16:40.441077 ignition[720]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 20 22:16:40.445817 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 20 22:16:40.443390 ignition[720]: kargs: kargs passed Mar 20 22:16:40.450193 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 20 22:16:40.443498 ignition[720]: Ignition finished successfully Mar 20 22:16:40.485592 ignition[727]: Ignition 2.20.0 Mar 20 22:16:40.485620 ignition[727]: Stage: disks Mar 20 22:16:40.486085 ignition[727]: no configs at "/usr/lib/ignition/base.d" Mar 20 22:16:40.486113 ignition[727]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 20 22:16:40.490436 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 20 22:16:40.488414 ignition[727]: disks: disks passed Mar 20 22:16:40.493970 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 20 22:16:40.488510 ignition[727]: Ignition finished successfully Mar 20 22:16:40.495931 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 20 22:16:40.498275 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 20 22:16:40.501198 systemd[1]: Reached target sysinit.target - System Initialization. Mar 20 22:16:40.503615 systemd[1]: Reached target basic.target - Basic System. Mar 20 22:16:40.509994 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Mar 20 22:16:40.557187 systemd-fsck[735]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Mar 20 22:16:40.571828 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 20 22:16:40.578312 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 20 22:16:40.732809 kernel: EXT4-fs (vda9): mounted filesystem 79cdbe74-6884-4c57-b04d-c9a431509f16 r/w with ordered data mode. Quota mode: none. Mar 20 22:16:40.733275 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 20 22:16:40.734787 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 20 22:16:40.737582 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 20 22:16:40.740823 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 20 22:16:40.741518 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 20 22:16:40.744869 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Mar 20 22:16:40.747920 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 20 22:16:40.748980 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 20 22:16:40.758194 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 20 22:16:40.762109 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Mar 20 22:16:40.775250 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (743) Mar 20 22:16:40.787791 kernel: BTRFS info (device vda6): first mount of filesystem c415ef49-5595-4a0b-ba48-8f3e642f303e Mar 20 22:16:40.791388 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 20 22:16:40.791415 kernel: BTRFS info (device vda6): using free space tree Mar 20 22:16:40.803804 kernel: BTRFS info (device vda6): auto enabling async discard Mar 20 22:16:40.809222 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 20 22:16:40.887570 initrd-setup-root[774]: cut: /sysroot/etc/passwd: No such file or directory Mar 20 22:16:40.893576 initrd-setup-root[781]: cut: /sysroot/etc/group: No such file or directory Mar 20 22:16:40.900717 initrd-setup-root[788]: cut: /sysroot/etc/shadow: No such file or directory Mar 20 22:16:40.905702 initrd-setup-root[795]: cut: /sysroot/etc/gshadow: No such file or directory Mar 20 22:16:41.032557 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 20 22:16:41.036708 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 20 22:16:41.041046 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 20 22:16:41.060362 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 20 22:16:41.068541 kernel: BTRFS info (device vda6): last unmount of filesystem c415ef49-5595-4a0b-ba48-8f3e642f303e Mar 20 22:16:41.096852 ignition[864]: INFO : Ignition 2.20.0 Mar 20 22:16:41.096852 ignition[864]: INFO : Stage: mount Mar 20 22:16:41.103190 ignition[864]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 20 22:16:41.103190 ignition[864]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 20 22:16:41.103190 ignition[864]: INFO : mount: mount passed Mar 20 22:16:41.103190 ignition[864]: INFO : Ignition finished successfully Mar 20 22:16:41.100320 systemd[1]: Finished ignition-mount.service - Ignition (mount). 
Mar 20 22:16:41.110094 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 20 22:16:41.834147 systemd-networkd[703]: eth0: Gained IPv6LL Mar 20 22:16:47.934141 coreos-metadata[745]: Mar 20 22:16:47.934 WARN failed to locate config-drive, using the metadata service API instead Mar 20 22:16:47.975539 coreos-metadata[745]: Mar 20 22:16:47.975 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 20 22:16:47.992085 coreos-metadata[745]: Mar 20 22:16:47.992 INFO Fetch successful Mar 20 22:16:47.993340 coreos-metadata[745]: Mar 20 22:16:47.993 INFO wrote hostname ci-9999-0-2-b-c50fddf147.novalocal to /sysroot/etc/hostname Mar 20 22:16:47.995991 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Mar 20 22:16:47.996217 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Mar 20 22:16:48.004053 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 20 22:16:48.039270 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 20 22:16:48.073870 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/vda6 scanned by mount (879) Mar 20 22:16:48.081135 kernel: BTRFS info (device vda6): first mount of filesystem c415ef49-5595-4a0b-ba48-8f3e642f303e Mar 20 22:16:48.081203 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 20 22:16:48.085310 kernel: BTRFS info (device vda6): using free space tree Mar 20 22:16:48.097874 kernel: BTRFS info (device vda6): auto enabling async discard Mar 20 22:16:48.104367 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 20 22:16:48.148690 ignition[897]: INFO : Ignition 2.20.0 Mar 20 22:16:48.148690 ignition[897]: INFO : Stage: files Mar 20 22:16:48.150368 ignition[897]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 20 22:16:48.150368 ignition[897]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 20 22:16:48.152291 ignition[897]: DEBUG : files: compiled without relabeling support, skipping Mar 20 22:16:48.155129 ignition[897]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 20 22:16:48.155129 ignition[897]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 20 22:16:48.164000 ignition[897]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 20 22:16:48.165402 ignition[897]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 20 22:16:48.166908 unknown[897]: wrote ssh authorized keys file for user: core Mar 20 22:16:48.167923 ignition[897]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 20 22:16:48.170623 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Mar 20 22:16:48.171837 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Mar 20 22:16:48.221991 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 20 22:16:48.547236 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Mar 20 22:16:48.547236 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 20 22:16:48.551932 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 20 22:16:48.551932 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 20 22:16:48.551932 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 20 22:16:48.551932 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 20 22:16:48.551932 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 20 22:16:48.551932 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 20 22:16:48.551932 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 20 22:16:48.566474 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 20 22:16:48.566474 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 20 22:16:48.566474 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 20 22:16:48.566474 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 20 22:16:48.566474 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 20 22:16:48.566474 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 Mar 20 22:16:49.208684 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 20 22:16:50.827560 ignition[897]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 20 22:16:50.827560 ignition[897]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 20 22:16:50.832517 ignition[897]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 20 22:16:50.832517 ignition[897]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 20 22:16:50.832517 ignition[897]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 20 22:16:50.832517 ignition[897]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 20 22:16:50.832517 ignition[897]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Mar 20 22:16:50.832517 ignition[897]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 20 22:16:50.832517 ignition[897]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 20 22:16:50.832517 ignition[897]: INFO : files: files passed Mar 20 22:16:50.832517 ignition[897]: INFO : Ignition finished successfully Mar 20 22:16:50.831141 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 20 22:16:50.837893 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 20 22:16:50.839899 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 20 22:16:50.862255 initrd-setup-root-after-ignition[925]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 20 22:16:50.862255 initrd-setup-root-after-ignition[925]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 20 22:16:50.857969 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 20 22:16:50.868730 initrd-setup-root-after-ignition[930]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 20 22:16:50.858057 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 20 22:16:50.860684 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 20 22:16:50.863204 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 20 22:16:50.865869 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 20 22:16:50.926386 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 20 22:16:50.926574 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 20 22:16:50.930842 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 20 22:16:50.932249 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 20 22:16:50.934073 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 20 22:16:50.935584 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 20 22:16:50.961014 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 20 22:16:50.963372 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 20 22:16:50.994118 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 20 22:16:50.995689 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
Mar 20 22:16:50.997728 systemd[1]: Stopped target timers.target - Timer Units. Mar 20 22:16:51.000253 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 20 22:16:51.000543 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 20 22:16:51.003466 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 20 22:16:51.005205 systemd[1]: Stopped target basic.target - Basic System. Mar 20 22:16:51.007999 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 20 22:16:51.010489 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 20 22:16:51.013014 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 20 22:16:51.015843 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 20 22:16:51.018619 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 20 22:16:51.021631 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 20 22:16:51.024495 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 20 22:16:51.027371 systemd[1]: Stopped target swap.target - Swaps. Mar 20 22:16:51.029953 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 20 22:16:51.030233 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 20 22:16:51.033278 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 20 22:16:51.035181 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 20 22:16:51.037576 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 20 22:16:51.037847 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 20 22:16:51.040649 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 20 22:16:51.040987 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. 
Mar 20 22:16:51.044801 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 20 22:16:51.045127 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 20 22:16:51.048023 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 20 22:16:51.048292 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 20 22:16:51.055133 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 20 22:16:51.056593 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 20 22:16:51.057046 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 20 22:16:51.065093 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 20 22:16:51.066355 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 20 22:16:51.066668 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 20 22:16:51.070349 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 20 22:16:51.070551 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 20 22:16:51.080952 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 20 22:16:51.081044 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 20 22:16:51.095628 ignition[951]: INFO : Ignition 2.20.0
Mar 20 22:16:51.095628 ignition[951]: INFO : Stage: umount
Mar 20 22:16:51.097803 ignition[951]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 20 22:16:51.097803 ignition[951]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 20 22:16:51.097803 ignition[951]: INFO : umount: umount passed
Mar 20 22:16:51.097803 ignition[951]: INFO : Ignition finished successfully
Mar 20 22:16:51.098523 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 20 22:16:51.098622 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 20 22:16:51.099499 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 20 22:16:51.099570 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 20 22:16:51.100279 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 20 22:16:51.100324 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 20 22:16:51.100956 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 20 22:16:51.100998 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 20 22:16:51.102154 systemd[1]: Stopped target network.target - Network.
Mar 20 22:16:51.103003 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 20 22:16:51.103048 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 20 22:16:51.104078 systemd[1]: Stopped target paths.target - Path Units.
Mar 20 22:16:51.105520 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 20 22:16:51.112674 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 20 22:16:51.113348 systemd[1]: Stopped target slices.target - Slice Units.
Mar 20 22:16:51.114467 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 20 22:16:51.116844 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 20 22:16:51.116883 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 20 22:16:51.117866 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 20 22:16:51.117900 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 20 22:16:51.118404 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 20 22:16:51.118449 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 20 22:16:51.118960 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 20 22:16:51.119636 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 20 22:16:51.120333 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 20 22:16:51.122208 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 20 22:16:51.124875 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 20 22:16:51.130208 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 20 22:16:51.130751 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 20 22:16:51.134002 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 20 22:16:51.134218 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 20 22:16:51.134299 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 20 22:16:51.136204 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 20 22:16:51.136826 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 20 22:16:51.136998 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 20 22:16:51.144368 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 20 22:16:51.144942 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 20 22:16:51.144993 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 20 22:16:51.145996 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 20 22:16:51.146037 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 20 22:16:51.148875 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 20 22:16:51.148917 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 20 22:16:51.150719 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 20 22:16:51.150777 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 20 22:16:51.152045 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 20 22:16:51.153732 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 20 22:16:51.155153 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 20 22:16:51.158161 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 20 22:16:51.158376 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 20 22:16:51.161017 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 20 22:16:51.161987 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 20 22:16:51.163180 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 20 22:16:51.163213 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 20 22:16:51.164894 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 20 22:16:51.164941 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 20 22:16:51.166390 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 20 22:16:51.166435 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 20 22:16:51.168019 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 20 22:16:51.168065 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 20 22:16:51.171887 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 20 22:16:51.173118 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 20 22:16:51.173816 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 20 22:16:51.175130 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 20 22:16:51.175170 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 22:16:51.178039 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Mar 20 22:16:51.178099 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 20 22:16:51.183005 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 20 22:16:51.183106 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 20 22:16:51.187921 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 20 22:16:51.188018 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 20 22:16:51.223567 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 20 22:16:51.223783 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 20 22:16:51.226060 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 20 22:16:51.227457 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 20 22:16:51.227543 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 20 22:16:51.231923 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 20 22:16:51.264819 systemd[1]: Switching root.
Mar 20 22:16:51.306362 systemd-journald[183]: Journal stopped
Mar 20 22:16:53.197083 systemd-journald[183]: Received SIGTERM from PID 1 (systemd).
Mar 20 22:16:53.197131 kernel: SELinux: policy capability network_peer_controls=1
Mar 20 22:16:53.197152 kernel: SELinux: policy capability open_perms=1
Mar 20 22:16:53.197170 kernel: SELinux: policy capability extended_socket_class=1
Mar 20 22:16:53.197182 kernel: SELinux: policy capability always_check_network=0
Mar 20 22:16:53.197196 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 20 22:16:53.197209 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 20 22:16:53.197220 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 20 22:16:53.197232 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 20 22:16:53.197244 kernel: audit: type=1403 audit(1742509012.024:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 20 22:16:53.197259 systemd[1]: Successfully loaded SELinux policy in 74.893ms.
Mar 20 22:16:53.197282 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 31.509ms.
Mar 20 22:16:53.197299 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 20 22:16:53.197312 systemd[1]: Detected virtualization kvm.
Mar 20 22:16:53.197325 systemd[1]: Detected architecture x86-64.
Mar 20 22:16:53.197338 systemd[1]: Detected first boot.
Mar 20 22:16:53.197351 systemd[1]: Hostname set to .
Mar 20 22:16:53.197364 systemd[1]: Initializing machine ID from VM UUID.
Mar 20 22:16:53.197377 zram_generator::config[997]: No configuration found.
Mar 20 22:16:53.197390 kernel: Guest personality initialized and is inactive
Mar 20 22:16:53.197404 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Mar 20 22:16:53.197416 kernel: Initialized host personality
Mar 20 22:16:53.197428 kernel: NET: Registered PF_VSOCK protocol family
Mar 20 22:16:53.197440 systemd[1]: Populated /etc with preset unit settings.
Mar 20 22:16:53.197454 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 20 22:16:53.197468 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 20 22:16:53.197481 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 20 22:16:53.197494 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 20 22:16:53.197509 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 20 22:16:53.197522 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 20 22:16:53.197535 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 20 22:16:53.197548 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 20 22:16:53.197561 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 20 22:16:53.197574 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 20 22:16:53.197587 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 20 22:16:53.197600 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 20 22:16:53.197613 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 20 22:16:53.197628 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 20 22:16:53.197641 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 20 22:16:53.197654 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 20 22:16:53.197668 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 20 22:16:53.197681 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 20 22:16:53.197693 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 20 22:16:53.197708 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 20 22:16:53.197721 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 20 22:16:53.197734 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 20 22:16:53.197748 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 20 22:16:53.198813 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 20 22:16:53.198837 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 20 22:16:53.198852 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 20 22:16:53.198865 systemd[1]: Reached target slices.target - Slice Units.
Mar 20 22:16:53.198879 systemd[1]: Reached target swap.target - Swaps.
Mar 20 22:16:53.198897 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 20 22:16:53.198910 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 20 22:16:53.198924 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 20 22:16:53.198937 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 20 22:16:53.198950 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 20 22:16:53.198963 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 20 22:16:53.198975 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 20 22:16:53.198988 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 20 22:16:53.199001 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 20 22:16:53.199019 systemd[1]: Mounting media.mount - External Media Directory...
Mar 20 22:16:53.199032 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 22:16:53.199064 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 20 22:16:53.199078 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 20 22:16:53.199091 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 20 22:16:53.199104 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 20 22:16:53.199117 systemd[1]: Reached target machines.target - Containers.
Mar 20 22:16:53.199130 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 20 22:16:53.199143 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 20 22:16:53.199159 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 20 22:16:53.199172 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 20 22:16:53.199185 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 20 22:16:53.199199 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 20 22:16:53.199212 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 20 22:16:53.199225 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 20 22:16:53.199238 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 20 22:16:53.199251 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 20 22:16:53.199266 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 20 22:16:53.199279 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 20 22:16:53.199292 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 20 22:16:53.199305 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 20 22:16:53.199318 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 20 22:16:53.199331 kernel: fuse: init (API version 7.39)
Mar 20 22:16:53.199344 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 20 22:16:53.199357 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 20 22:16:53.199370 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 20 22:16:53.199385 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 20 22:16:53.199398 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 20 22:16:53.199410 kernel: ACPI: bus type drm_connector registered
Mar 20 22:16:53.199423 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 20 22:16:53.199436 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 20 22:16:53.199451 systemd[1]: Stopped verity-setup.service.
Mar 20 22:16:53.199465 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 22:16:53.199478 kernel: loop: module loaded
Mar 20 22:16:53.199490 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 20 22:16:53.199502 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 20 22:16:53.199518 systemd[1]: Mounted media.mount - External Media Directory.
Mar 20 22:16:53.199530 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 20 22:16:53.199541 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 20 22:16:53.199572 systemd-journald[1086]: Collecting audit messages is disabled.
Mar 20 22:16:53.199599 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 20 22:16:53.199612 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 20 22:16:53.199624 systemd-journald[1086]: Journal started
Mar 20 22:16:53.199652 systemd-journald[1086]: Runtime Journal (/run/log/journal/e0e46e0b46814b22bfeef3f66e61be36) is 8M, max 78.2M, 70.2M free.
Mar 20 22:16:52.833284 systemd[1]: Queued start job for default target multi-user.target.
Mar 20 22:16:53.206358 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 20 22:16:52.841896 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Mar 20 22:16:52.842271 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 20 22:16:53.205179 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 20 22:16:53.205347 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 20 22:16:53.206087 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 20 22:16:53.206824 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 20 22:16:53.208180 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 20 22:16:53.208315 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 20 22:16:53.209947 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 20 22:16:53.210892 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 20 22:16:53.211680 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 20 22:16:53.212685 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 20 22:16:53.216331 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 20 22:16:53.216865 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 20 22:16:53.218637 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 20 22:16:53.220925 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 20 22:16:53.240885 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 20 22:16:53.245862 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 20 22:16:53.247366 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 20 22:16:53.247399 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 20 22:16:53.253425 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 20 22:16:53.257882 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 20 22:16:53.261898 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 20 22:16:53.267981 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 20 22:16:53.272350 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 20 22:16:53.275873 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 20 22:16:53.280267 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 20 22:16:53.285522 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 20 22:16:53.286479 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 20 22:16:53.290498 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 20 22:16:53.292880 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 20 22:16:53.293899 systemd-journald[1086]: Time spent on flushing to /var/log/journal/e0e46e0b46814b22bfeef3f66e61be36 is 37.654ms for 952 entries.
Mar 20 22:16:53.293899 systemd-journald[1086]: System Journal (/var/log/journal/e0e46e0b46814b22bfeef3f66e61be36) is 8M, max 584.8M, 576.8M free.
Mar 20 22:16:53.347331 systemd-journald[1086]: Received client request to flush runtime journal.
Mar 20 22:16:53.347370 kernel: loop0: detected capacity change from 0 to 8
Mar 20 22:16:53.347386 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 20 22:16:53.347407 kernel: loop1: detected capacity change from 0 to 210664
Mar 20 22:16:53.300400 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 20 22:16:53.301457 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 20 22:16:53.302415 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 20 22:16:53.303781 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 20 22:16:53.305245 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 20 22:16:53.306136 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 20 22:16:53.307013 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 20 22:16:53.315181 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 20 22:16:53.335917 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 20 22:16:53.353581 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 20 22:16:53.363710 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 20 22:16:53.364600 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 20 22:16:53.368193 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 20 22:16:53.373192 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 20 22:16:53.374145 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 20 22:16:53.396408 udevadm[1145]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Mar 20 22:16:53.437776 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 20 22:16:53.439863 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 20 22:16:53.445914 kernel: loop2: detected capacity change from 0 to 151640
Mar 20 22:16:53.452821 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 20 22:16:53.476700 systemd-tmpfiles[1155]: ACLs are not supported, ignoring.
Mar 20 22:16:53.476719 systemd-tmpfiles[1155]: ACLs are not supported, ignoring.
Mar 20 22:16:53.481556 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 20 22:16:53.531810 kernel: loop3: detected capacity change from 0 to 109808
Mar 20 22:16:53.587335 kernel: loop4: detected capacity change from 0 to 8
Mar 20 22:16:53.589808 kernel: loop5: detected capacity change from 0 to 210664
Mar 20 22:16:53.651184 kernel: loop6: detected capacity change from 0 to 151640
Mar 20 22:16:53.777467 kernel: loop7: detected capacity change from 0 to 109808
Mar 20 22:16:53.811794 (sd-merge)[1161]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'.
Mar 20 22:16:53.813265 (sd-merge)[1161]: Merged extensions into '/usr'.
Mar 20 22:16:53.827371 systemd[1]: Reload requested from client PID 1133 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 20 22:16:53.827427 systemd[1]: Reloading...
Mar 20 22:16:53.926103 zram_generator::config[1189]: No configuration found.
Mar 20 22:16:54.134670 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 20 22:16:54.227619 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 20 22:16:54.228127 systemd[1]: Reloading finished in 399 ms.
Mar 20 22:16:54.248689 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 20 22:16:54.249619 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 20 22:16:54.265883 systemd[1]: Starting ensure-sysext.service...
Mar 20 22:16:54.267506 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 20 22:16:54.272023 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 20 22:16:54.299834 systemd[1]: Reload requested from client PID 1245 ('systemctl') (unit ensure-sysext.service)...
Mar 20 22:16:54.299850 systemd[1]: Reloading...
Mar 20 22:16:54.313677 systemd-tmpfiles[1246]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 20 22:16:54.314322 systemd-tmpfiles[1246]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 20 22:16:54.315463 systemd-tmpfiles[1246]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 20 22:16:54.315936 systemd-tmpfiles[1246]: ACLs are not supported, ignoring.
Mar 20 22:16:54.316068 systemd-tmpfiles[1246]: ACLs are not supported, ignoring.
Mar 20 22:16:54.321914 systemd-tmpfiles[1246]: Detected autofs mount point /boot during canonicalization of boot.
Mar 20 22:16:54.322296 systemd-tmpfiles[1246]: Skipping /boot
Mar 20 22:16:54.324885 systemd-udevd[1247]: Using default interface naming scheme 'v255'.
Mar 20 22:16:54.339302 systemd-tmpfiles[1246]: Detected autofs mount point /boot during canonicalization of boot.
Mar 20 22:16:54.339318 systemd-tmpfiles[1246]: Skipping /boot
Mar 20 22:16:54.358014 ldconfig[1125]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 20 22:16:54.426821 zram_generator::config[1298]: No configuration found.
Mar 20 22:16:54.517940 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1275)
Mar 20 22:16:54.518019 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Mar 20 22:16:54.520485 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2
Mar 20 22:16:54.529978 kernel: ACPI: button: Power Button [PWRF]
Mar 20 22:16:54.592788 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
Mar 20 22:16:54.643152 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Mar 20 22:16:54.643228 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Mar 20 22:16:54.660393 kernel: Console: switching to colour dummy device 80x25
Mar 20 22:16:54.660461 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Mar 20 22:16:54.660481 kernel: [drm] features: -context_init
Mar 20 22:16:54.666306 kernel: [drm] number of scanouts: 1
Mar 20 22:16:54.666353 kernel: [drm] number of cap sets: 0
Mar 20 22:16:54.668778 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:02.0 on minor 0
Mar 20 22:16:54.686141 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Mar 20 22:16:54.686216 kernel: Console: switching to colour frame buffer device 160x50
Mar 20 22:16:54.688785 kernel: mousedev: PS/2 mouse device common for all mice
Mar 20 22:16:54.701600 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 20 22:16:54.702782 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Mar 20 22:16:54.808580 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 20 22:16:54.808630 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 20 22:16:54.810554 systemd[1]: Reloading finished in 510 ms.
Mar 20 22:16:54.825072 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 20 22:16:54.825573 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 20 22:16:54.834686 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 20 22:16:54.864284 systemd[1]: Finished ensure-sysext.service.
Mar 20 22:16:54.872497 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 20 22:16:54.889101 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 22:16:54.890484 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 20 22:16:54.908430 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 20 22:16:54.909165 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 20 22:16:54.914500 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 20 22:16:54.920180 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 20 22:16:54.926182 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 20 22:16:54.934281 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 20 22:16:54.943313 lvm[1371]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 20 22:16:54.947294 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 20 22:16:54.949295 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 20 22:16:54.954297 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 20 22:16:54.955803 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 20 22:16:54.960489 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 20 22:16:54.969853 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 20 22:16:54.978410 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 20 22:16:54.985989 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 20 22:16:54.988971 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 20 22:16:54.991949 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 20 22:16:54.992947 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 22:16:54.993953 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 20 22:16:54.994267 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 20 22:16:54.994425 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 20 22:16:54.999075 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 20 22:16:54.999328 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 20 22:16:55.000334 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 20 22:16:55.000510 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 20 22:16:55.006639 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 20 22:16:55.006900 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 20 22:16:55.018330 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 20 22:16:55.022361 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 20 22:16:55.025073 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 20 22:16:55.025149 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 20 22:16:55.033246 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 20 22:16:55.042455 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 20 22:16:55.052036 lvm[1405]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 20 22:16:55.057850 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 20 22:16:55.064473 augenrules[1414]: No rules
Mar 20 22:16:55.065207 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 20 22:16:55.067821 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 20 22:16:55.106864 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 20 22:16:55.109704 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 20 22:16:55.128530 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 20 22:16:55.133888 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 20 22:16:55.165997 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 20 22:16:55.231439 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 20 22:16:55.233073 systemd[1]: Reached target time-set.target - System Time Set.
Mar 20 22:16:55.284186 systemd-resolved[1387]: Positive Trust Anchors:
Mar 20 22:16:55.284529 systemd-resolved[1387]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 20 22:16:55.284568 systemd-networkd[1384]: lo: Link UP
Mar 20 22:16:55.284575 systemd-resolved[1387]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 20 22:16:55.284576 systemd-networkd[1384]: lo: Gained carrier
Mar 20 22:16:55.287384 systemd-networkd[1384]: Enumeration completed
Mar 20 22:16:55.287548 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 20 22:16:55.288169 systemd-networkd[1384]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 20 22:16:55.288180 systemd-networkd[1384]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 20 22:16:55.290373 systemd-networkd[1384]: eth0: Link UP
Mar 20 22:16:55.290380 systemd-networkd[1384]: eth0: Gained carrier
Mar 20 22:16:55.290410 systemd-networkd[1384]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 20 22:16:55.293117 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 20 22:16:55.298838 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 20 22:16:55.303880 systemd-networkd[1384]: eth0: DHCPv4 address 172.24.4.53/24, gateway 172.24.4.1 acquired from 172.24.4.1
Mar 20 22:16:55.304854 systemd-timesyncd[1390]: Network configuration changed, trying to establish connection.
Mar 20 22:16:55.306191 systemd-timesyncd[1390]: Network configuration changed, trying to establish connection.
Mar 20 22:16:55.312172 systemd-resolved[1387]: Using system hostname 'ci-9999-0-2-b-c50fddf147.novalocal'.
Mar 20 22:16:55.315393 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 20 22:16:55.316061 systemd[1]: Reached target network.target - Network.
Mar 20 22:16:55.316533 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 20 22:16:55.332630 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 20 22:16:55.359064 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 22:16:55.361849 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 20 22:16:55.364042 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 20 22:16:55.364101 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 20 22:16:55.365914 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 20 22:16:55.366830 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 20 22:16:55.368093 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 20 22:16:55.368870 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 20 22:16:55.370229 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 20 22:16:55.371503 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 20 22:16:55.371530 systemd[1]: Reached target paths.target - Path Units.
Mar 20 22:16:55.372260 systemd[1]: Reached target timers.target - Timer Units.
Mar 20 22:16:55.375893 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 20 22:16:55.378325 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 20 22:16:55.382524 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 20 22:16:55.384245 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 20 22:16:55.385571 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 20 22:16:55.389123 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 20 22:16:55.391025 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 20 22:16:55.393306 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 20 22:16:55.394808 systemd[1]: Reached target sockets.target - Socket Units.
Mar 20 22:16:55.396245 systemd[1]: Reached target basic.target - Basic System.
Mar 20 22:16:55.397107 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 20 22:16:55.397212 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 20 22:16:55.400018 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 20 22:16:55.404162 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 20 22:16:55.411886 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 20 22:16:55.419251 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 20 22:16:55.425886 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 20 22:16:55.426538 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 20 22:16:55.430346 jq[1448]: false
Mar 20 22:16:55.430819 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 20 22:16:55.435002 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 20 22:16:55.440941 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 20 22:16:55.451177 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 20 22:16:55.461056 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 20 22:16:55.462617 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 20 22:16:55.465304 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 20 22:16:55.467499 systemd[1]: Starting update-engine.service - Update Engine...
Mar 20 22:16:55.474097 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 20 22:16:55.480554 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 20 22:16:55.481123 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 20 22:16:55.481391 systemd[1]: motdgen.service: Deactivated successfully.
Mar 20 22:16:55.481881 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 20 22:16:55.493248 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 20 22:16:55.493821 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 20 22:16:55.503022 extend-filesystems[1449]: Found loop4
Mar 20 22:16:55.506983 dbus-daemon[1445]: [system] SELinux support is enabled
Mar 20 22:16:55.514969 extend-filesystems[1449]: Found loop5
Mar 20 22:16:55.514969 extend-filesystems[1449]: Found loop6
Mar 20 22:16:55.514969 extend-filesystems[1449]: Found loop7
Mar 20 22:16:55.514969 extend-filesystems[1449]: Found vda
Mar 20 22:16:55.514969 extend-filesystems[1449]: Found vda1
Mar 20 22:16:55.514969 extend-filesystems[1449]: Found vda2
Mar 20 22:16:55.514969 extend-filesystems[1449]: Found vda3
Mar 20 22:16:55.514969 extend-filesystems[1449]: Found usr
Mar 20 22:16:55.514969 extend-filesystems[1449]: Found vda4
Mar 20 22:16:55.514969 extend-filesystems[1449]: Found vda6
Mar 20 22:16:55.514969 extend-filesystems[1449]: Found vda7
Mar 20 22:16:55.514969 extend-filesystems[1449]: Found vda9
Mar 20 22:16:55.514969 extend-filesystems[1449]: Checking size of /dev/vda9
Mar 20 22:16:55.511131 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 20 22:16:55.531383 update_engine[1461]: I20250320 22:16:55.517510 1461 main.cc:92] Flatcar Update Engine starting
Mar 20 22:16:55.531383 update_engine[1461]: I20250320 22:16:55.520696 1461 update_check_scheduler.cc:74] Next update check in 7m19s
Mar 20 22:16:55.533978 jq[1462]: true
Mar 20 22:16:55.538862 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 20 22:16:55.538917 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 20 22:16:55.539589 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 20 22:16:55.539611 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 20 22:16:55.544982 systemd[1]: Started update-engine.service - Update Engine.
Mar 20 22:16:55.550944 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 20 22:16:55.558951 (ntainerd)[1475]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 20 22:16:55.565232 extend-filesystems[1449]: Resized partition /dev/vda9
Mar 20 22:16:55.590651 jq[1473]: true
Mar 20 22:16:55.590747 tar[1465]: linux-amd64/helm
Mar 20 22:16:55.596367 extend-filesystems[1485]: resize2fs 1.47.2 (1-Jan-2025)
Mar 20 22:16:55.615813 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 2014203 blocks
Mar 20 22:16:55.607402 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 20 22:16:55.631442 kernel: EXT4-fs (vda9): resized filesystem to 2014203
Mar 20 22:16:55.710753 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1293)
Mar 20 22:16:55.638681 systemd-logind[1460]: New seat seat0.
Mar 20 22:16:55.708661 systemd-logind[1460]: Watching system buttons on /dev/input/event1 (Power Button)
Mar 20 22:16:55.708679 systemd-logind[1460]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 20 22:16:55.712196 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 20 22:16:55.736322 extend-filesystems[1485]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Mar 20 22:16:55.736322 extend-filesystems[1485]: old_desc_blocks = 1, new_desc_blocks = 1
Mar 20 22:16:55.736322 extend-filesystems[1485]: The filesystem on /dev/vda9 is now 2014203 (4k) blocks long.
Mar 20 22:16:55.727286 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 20 22:16:55.748451 extend-filesystems[1449]: Resized filesystem in /dev/vda9
Mar 20 22:16:55.727499 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 20 22:16:55.762995 bash[1501]: Updated "/home/core/.ssh/authorized_keys"
Mar 20 22:16:55.756105 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 20 22:16:55.773570 systemd[1]: Starting sshkeys.service...
Mar 20 22:16:55.808468 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Mar 20 22:16:55.815067 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Mar 20 22:16:55.859377 locksmithd[1478]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 20 22:16:56.083430 containerd[1475]: time="2025-03-20T22:16:56Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 20 22:16:56.085186 containerd[1475]: time="2025-03-20T22:16:56.085160734Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1
Mar 20 22:16:56.109221 containerd[1475]: time="2025-03-20T22:16:56.109183538Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.182µs"
Mar 20 22:16:56.109271 containerd[1475]: time="2025-03-20T22:16:56.109219526Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 20 22:16:56.109271 containerd[1475]: time="2025-03-20T22:16:56.109240956Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 20 22:16:56.109433 containerd[1475]: time="2025-03-20T22:16:56.109409733Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 20 22:16:56.109481 containerd[1475]: time="2025-03-20T22:16:56.109443295Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 20 22:16:56.109481 containerd[1475]: time="2025-03-20T22:16:56.109473753Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 20 22:16:56.110768 containerd[1475]: time="2025-03-20T22:16:56.109535589Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 20 22:16:56.110768 containerd[1475]: time="2025-03-20T22:16:56.109556037Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 20 22:16:56.110768 containerd[1475]: time="2025-03-20T22:16:56.109795997Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 20 22:16:56.110768 containerd[1475]: time="2025-03-20T22:16:56.109815203Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 20 22:16:56.110768 containerd[1475]: time="2025-03-20T22:16:56.109829029Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 20 22:16:56.110768 containerd[1475]: time="2025-03-20T22:16:56.109839949Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 20 22:16:56.110768 containerd[1475]: time="2025-03-20T22:16:56.109922224Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 20 22:16:56.110768 containerd[1475]: time="2025-03-20T22:16:56.110126186Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 20 22:16:56.110768 containerd[1475]: time="2025-03-20T22:16:56.110154028Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 20 22:16:56.110768 containerd[1475]: time="2025-03-20T22:16:56.110167804Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 20 22:16:56.110768 containerd[1475]: time="2025-03-20T22:16:56.110198652Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 20 22:16:56.111032 containerd[1475]: time="2025-03-20T22:16:56.110409768Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 20 22:16:56.111032 containerd[1475]: time="2025-03-20T22:16:56.110465172Z" level=info msg="metadata content store policy set" policy=shared
Mar 20 22:16:56.124420 containerd[1475]: time="2025-03-20T22:16:56.124383426Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 20 22:16:56.124474 containerd[1475]: time="2025-03-20T22:16:56.124442086Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 20 22:16:56.124474 containerd[1475]: time="2025-03-20T22:16:56.124460600Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 20 22:16:56.124537 containerd[1475]: time="2025-03-20T22:16:56.124485687Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 20 22:16:56.124537 containerd[1475]: time="2025-03-20T22:16:56.124503110Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 20 22:16:56.124537 containerd[1475]: time="2025-03-20T22:16:56.124523508Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 20 22:16:56.124602 containerd[1475]: time="2025-03-20T22:16:56.124541452Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 20 22:16:56.124602 containerd[1475]: time="2025-03-20T22:16:56.124556530Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 20 22:16:56.124602 containerd[1475]: time="2025-03-20T22:16:56.124568823Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 20 22:16:56.124602 containerd[1475]: time="2025-03-20T22:16:56.124581397Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 20 22:16:56.124602 containerd[1475]: time="2025-03-20T22:16:56.124593119Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 20 22:16:56.124711 containerd[1475]: time="2025-03-20T22:16:56.124606584Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 20 22:16:56.124795 containerd[1475]: time="2025-03-20T22:16:56.124743391Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 20 22:16:56.124829 containerd[1475]: time="2025-03-20T22:16:56.124793415Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 20 22:16:56.124829 containerd[1475]: time="2025-03-20T22:16:56.124809615Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 20 22:16:56.124895 containerd[1475]: time="2025-03-20T22:16:56.124833490Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 20 22:16:56.124895 containerd[1475]: time="2025-03-20T22:16:56.124847486Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 20 22:16:56.124895 containerd[1475]: time="2025-03-20T22:16:56.124859108Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 20 22:16:56.124895 containerd[1475]: time="2025-03-20T22:16:56.124871711Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 20 22:16:56.124895 containerd[1475]: time="2025-03-20T22:16:56.124883664Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 20 22:16:56.125235 containerd[1475]: time="2025-03-20T22:16:56.124896829Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 20 22:16:56.125235 containerd[1475]: time="2025-03-20T22:16:56.124910735Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 20 22:16:56.125235 containerd[1475]: time="2025-03-20T22:16:56.124922276Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Mar 20 22:16:56.125235 containerd[1475]: time="2025-03-20T22:16:56.124979554Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Mar 20 22:16:56.125235 containerd[1475]: time="2025-03-20T22:16:56.124995083Z" level=info msg="Start snapshots syncer"
Mar 20 22:16:56.125235 containerd[1475]: time="2025-03-20T22:16:56.125016834Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Mar 20 22:16:56.125601 containerd[1475]: time="2025-03-20T22:16:56.125267684Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Mar 20 22:16:56.125601 containerd[1475]: time="2025-03-20T22:16:56.125325092Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Mar 20 22:16:56.126069 containerd[1475]: time="2025-03-20T22:16:56.125404000Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Mar 20 22:16:56.126069 containerd[1475]: time="2025-03-20T22:16:56.125496533Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Mar 20 22:16:56.126069 containerd[1475]: time="2025-03-20T22:16:56.125521680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Mar 20 22:16:56.126069 containerd[1475]: time="2025-03-20T22:16:56.125539965Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Mar 20 22:16:56.126069 containerd[1475]: time="2025-03-20T22:16:56.125552969Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Mar 20 22:16:56.126069 containerd[1475]: time="2025-03-20T22:16:56.125565843Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Mar 20 22:16:56.126069 containerd[1475]: time="2025-03-20T22:16:56.125577255Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Mar 20 22:16:56.126069 containerd[1475]: time="2025-03-20T22:16:56.125588746Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Mar 20 22:16:56.126069 containerd[1475]: time="2025-03-20T22:16:56.125612210Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Mar 20 22:16:56.126069 containerd[1475]: time="2025-03-20T22:16:56.125626106Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Mar 20 22:16:56.126069 containerd[1475]: time="2025-03-20T22:16:56.125637207Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Mar 20 22:16:56.126069 containerd[1475]: time="2025-03-20T22:16:56.125672012Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 20 22:16:56.126069 containerd[1475]: time="2025-03-20T22:16:56.125688543Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 20 22:16:56.126069 containerd[1475]: time="2025-03-20T22:16:56.125699875Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 20 22:16:56.127006 containerd[1475]: time="2025-03-20T22:16:56.125713270Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 20 22:16:56.127006 containerd[1475]: time="2025-03-20T22:16:56.125723018Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Mar 20 22:16:56.127006 containerd[1475]: time="2025-03-20T22:16:56.125734019Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Mar 20 22:16:56.127006 containerd[1475]: time="2025-03-20T22:16:56.125746051Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Mar 20 22:16:56.127006 containerd[1475]: time="2025-03-20T22:16:56.125779524Z" level=info msg="runtime interface created"
Mar 20 22:16:56.127006 containerd[1475]: time="2025-03-20T22:16:56.125787950Z" level=info msg="created NRI interface"
Mar 20 22:16:56.127006 containerd[1475]: time="2025-03-20T22:16:56.125797608Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Mar 20 22:16:56.127006 containerd[1475]: time="2025-03-20T22:16:56.125809570Z" level=info msg="Connect containerd service"
Mar 20 22:16:56.127006 containerd[1475]: time="2025-03-20T22:16:56.125835669Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 20 22:16:56.127006 containerd[1475]: time="2025-03-20T22:16:56.126389388Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 20 22:16:56.349725 containerd[1475]: time="2025-03-20T22:16:56.349432495Z" level=info msg="Start subscribing containerd event"
Mar 20 22:16:56.349725 containerd[1475]: time="2025-03-20T22:16:56.349480625Z" level=info msg="Start recovering state"
Mar 20 22:16:56.350035 containerd[1475]: time="2025-03-20T22:16:56.349663799Z" level=info msg="Start event monitor"
Mar 20 22:16:56.350035 containerd[1475]: time="2025-03-20T22:16:56.349891506Z" level=info msg="Start cni network conf syncer for default"
Mar 20 22:16:56.350035 containerd[1475]: time="2025-03-20T22:16:56.349902176Z" level=info msg="Start streaming server"
Mar 20 22:16:56.350035 containerd[1475]: time="2025-03-20T22:16:56.349912095Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Mar 20 22:16:56.350035 containerd[1475]: time="2025-03-20T22:16:56.349920170Z" level=info msg="runtime interface starting up..."
Mar 20 22:16:56.350035 containerd[1475]: time="2025-03-20T22:16:56.349926612Z" level=info msg="starting plugins..."
Mar 20 22:16:56.350035 containerd[1475]: time="2025-03-20T22:16:56.349962138Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Mar 20 22:16:56.353803 containerd[1475]: time="2025-03-20T22:16:56.350354805Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 20 22:16:56.353803 containerd[1475]: time="2025-03-20T22:16:56.350426289Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 20 22:16:56.350618 systemd[1]: Started containerd.service - containerd container runtime.
Mar 20 22:16:56.354439 containerd[1475]: time="2025-03-20T22:16:56.354411784Z" level=info msg="containerd successfully booted in 0.271390s"
Mar 20 22:16:56.421269 tar[1465]: linux-amd64/LICENSE
Mar 20 22:16:56.421269 tar[1465]: linux-amd64/README.md
Mar 20 22:16:56.436989 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 20 22:16:56.470834 sshd_keygen[1486]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 20 22:16:56.491676 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 20 22:16:56.497009 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 20 22:16:56.501157 systemd[1]: Started sshd@0-172.24.4.53:22-172.24.4.1:60630.service - OpenSSH per-connection server daemon (172.24.4.1:60630).
Mar 20 22:16:56.510593 systemd[1]: issuegen.service: Deactivated successfully.
Mar 20 22:16:56.510813 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 20 22:16:56.518785 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 20 22:16:56.552146 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 20 22:16:56.556969 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 20 22:16:56.565093 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Mar 20 22:16:56.567385 systemd[1]: Reached target getty.target - Login Prompts.
Mar 20 22:16:56.875180 systemd-networkd[1384]: eth0: Gained IPv6LL
Mar 20 22:16:56.876666 systemd-timesyncd[1390]: Network configuration changed, trying to establish connection.
Mar 20 22:16:56.879436 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 20 22:16:56.886826 systemd[1]: Reached target network-online.target - Network is Online.
Mar 20 22:16:56.894699 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 22:16:56.904654 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 20 22:16:56.950980 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 20 22:16:57.545387 sshd[1546]: Accepted publickey for core from 172.24.4.1 port 60630 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:16:57.546653 sshd-session[1546]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:16:57.579360 systemd-logind[1460]: New session 1 of user core.
Mar 20 22:16:57.579446 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 20 22:16:57.587104 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 20 22:16:57.616993 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 20 22:16:57.623100 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 20 22:16:57.640485 (systemd)[1570]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 20 22:16:57.645379 systemd-logind[1460]: New session c1 of user core.
Mar 20 22:16:57.808152 systemd[1570]: Queued start job for default target default.target.
Mar 20 22:16:57.812838 systemd[1570]: Created slice app.slice - User Application Slice.
Mar 20 22:16:57.812864 systemd[1570]: Reached target paths.target - Paths.
Mar 20 22:16:57.812901 systemd[1570]: Reached target timers.target - Timers.
Mar 20 22:16:57.820922 systemd[1570]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 20 22:16:57.846045 systemd[1570]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 20 22:16:57.846268 systemd[1570]: Reached target sockets.target - Sockets.
Mar 20 22:16:57.846380 systemd[1570]: Reached target basic.target - Basic System.
Mar 20 22:16:57.846418 systemd[1570]: Reached target default.target - Main User Target.
Mar 20 22:16:57.846443 systemd[1570]: Startup finished in 192ms.
Mar 20 22:16:57.846582 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 20 22:16:57.853996 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 20 22:16:58.196171 systemd[1]: Started sshd@1-172.24.4.53:22-172.24.4.1:43396.service - OpenSSH per-connection server daemon (172.24.4.1:43396).
Mar 20 22:16:58.773160 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 22:16:58.792637 (kubelet)[1589]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 22:16:59.610605 sshd[1581]: Accepted publickey for core from 172.24.4.1 port 43396 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:16:59.613705 sshd-session[1581]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:16:59.627685 systemd-logind[1460]: New session 2 of user core.
Mar 20 22:16:59.635938 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 20 22:17:00.336385 kubelet[1589]: E0320 22:17:00.336265 1589 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 22:17:00.340203 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 22:17:00.340540 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 22:17:00.341233 systemd[1]: kubelet.service: Consumed 2.130s CPU time, 248.4M memory peak.
Mar 20 22:17:00.423403 sshd[1595]: Connection closed by 172.24.4.1 port 43396
Mar 20 22:17:00.423171 sshd-session[1581]: pam_unix(sshd:session): session closed for user core
Mar 20 22:17:00.439710 systemd[1]: sshd@1-172.24.4.53:22-172.24.4.1:43396.service: Deactivated successfully.
Mar 20 22:17:00.443334 systemd[1]: session-2.scope: Deactivated successfully.
Mar 20 22:17:00.447070 systemd-logind[1460]: Session 2 logged out. Waiting for processes to exit.
Mar 20 22:17:00.450830 systemd[1]: Started sshd@2-172.24.4.53:22-172.24.4.1:43404.service - OpenSSH per-connection server daemon (172.24.4.1:43404).
Mar 20 22:17:00.458749 systemd-logind[1460]: Removed session 2.
Mar 20 22:17:01.608402 sshd[1603]: Accepted publickey for core from 172.24.4.1 port 43404 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:17:01.612572 sshd-session[1603]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:17:01.645281 systemd-logind[1460]: New session 3 of user core.
Mar 20 22:17:01.653166 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 20 22:17:01.671515 login[1554]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Mar 20 22:17:01.674191 login[1553]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Mar 20 22:17:01.684886 systemd-logind[1460]: New session 4 of user core.
Mar 20 22:17:01.698201 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 20 22:17:01.703990 systemd-logind[1460]: New session 5 of user core.
Mar 20 22:17:01.711508 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 20 22:17:02.250568 sshd[1610]: Connection closed by 172.24.4.1 port 43404
Mar 20 22:17:02.251639 sshd-session[1603]: pam_unix(sshd:session): session closed for user core
Mar 20 22:17:02.258324 systemd-logind[1460]: Session 3 logged out. Waiting for processes to exit.
Mar 20 22:17:02.260297 systemd[1]: sshd@2-172.24.4.53:22-172.24.4.1:43404.service: Deactivated successfully.
Mar 20 22:17:02.264653 systemd[1]: session-3.scope: Deactivated successfully.
Mar 20 22:17:02.267827 systemd-logind[1460]: Removed session 3.
Mar 20 22:17:02.476423 coreos-metadata[1444]: Mar 20 22:17:02.476 WARN failed to locate config-drive, using the metadata service API instead
Mar 20 22:17:02.571139 coreos-metadata[1444]: Mar 20 22:17:02.570 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1
Mar 20 22:17:02.757142 coreos-metadata[1444]: Mar 20 22:17:02.757 INFO Fetch successful
Mar 20 22:17:02.757142 coreos-metadata[1444]: Mar 20 22:17:02.757 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Mar 20 22:17:02.770693 coreos-metadata[1444]: Mar 20 22:17:02.770 INFO Fetch successful
Mar 20 22:17:02.770693 coreos-metadata[1444]: Mar 20 22:17:02.770 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1
Mar 20 22:17:02.783869 coreos-metadata[1444]: Mar 20 22:17:02.783 INFO Fetch successful
Mar 20 22:17:02.783869 coreos-metadata[1444]: Mar 20 22:17:02.783 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1
Mar 20 22:17:02.798189 coreos-metadata[1444]: Mar 20 22:17:02.798 INFO Fetch successful
Mar 20 22:17:02.798189 coreos-metadata[1444]: Mar 20 22:17:02.798 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1
Mar 20 22:17:02.811443 coreos-metadata[1444]: Mar 20 22:17:02.811 INFO Fetch successful
Mar 20 22:17:02.811443 coreos-metadata[1444]: Mar 20 22:17:02.811 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1
Mar 20 22:17:02.824912 coreos-metadata[1444]: Mar 20 22:17:02.824 INFO Fetch successful
Mar 20 22:17:02.877099 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 20 22:17:02.878908 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 20 22:17:02.912432 coreos-metadata[1511]: Mar 20 22:17:02.912 WARN failed to locate config-drive, using the metadata service API instead
Mar 20 22:17:02.954726 coreos-metadata[1511]: Mar 20 22:17:02.954 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1
Mar 20 22:17:02.971920 coreos-metadata[1511]: Mar 20 22:17:02.971 INFO Fetch successful
Mar 20 22:17:02.971920 coreos-metadata[1511]: Mar 20 22:17:02.971 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1
Mar 20 22:17:02.985556 coreos-metadata[1511]: Mar 20 22:17:02.985 INFO Fetch successful
Mar 20 22:17:02.991617 unknown[1511]: wrote ssh authorized keys file for user: core
Mar 20 22:17:03.033297 update-ssh-keys[1646]: Updated "/home/core/.ssh/authorized_keys"
Mar 20 22:17:03.034478 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Mar 20 22:17:03.038573 systemd[1]: Finished sshkeys.service.
Mar 20 22:17:03.043325 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 20 22:17:03.043644 systemd[1]: Startup finished in 1.218s (kernel) + 15.200s (initrd) + 11.092s (userspace) = 27.511s.
Mar 20 22:17:10.592022 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 20 22:17:10.595235 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 22:17:10.930137 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 22:17:10.948375 (kubelet)[1658]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 22:17:11.021166 kubelet[1658]: E0320 22:17:11.021111 1658 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 22:17:11.024752 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 22:17:11.025002 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 22:17:11.025443 systemd[1]: kubelet.service: Consumed 276ms CPU time, 96.1M memory peak.
Mar 20 22:17:12.275828 systemd[1]: Started sshd@3-172.24.4.53:22-172.24.4.1:40870.service - OpenSSH per-connection server daemon (172.24.4.1:40870).
Mar 20 22:17:13.715408 sshd[1667]: Accepted publickey for core from 172.24.4.1 port 40870 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:17:13.717999 sshd-session[1667]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:17:13.729864 systemd-logind[1460]: New session 6 of user core.
Mar 20 22:17:13.733067 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 20 22:17:14.410749 sshd[1669]: Connection closed by 172.24.4.1 port 40870
Mar 20 22:17:14.411960 sshd-session[1667]: pam_unix(sshd:session): session closed for user core
Mar 20 22:17:14.429579 systemd[1]: sshd@3-172.24.4.53:22-172.24.4.1:40870.service: Deactivated successfully.
Mar 20 22:17:14.432641 systemd[1]: session-6.scope: Deactivated successfully.
Mar 20 22:17:14.436225 systemd-logind[1460]: Session 6 logged out. Waiting for processes to exit.
Mar 20 22:17:14.439332 systemd[1]: Started sshd@4-172.24.4.53:22-172.24.4.1:46706.service - OpenSSH per-connection server daemon (172.24.4.1:46706).
Mar 20 22:17:14.442595 systemd-logind[1460]: Removed session 6.
Mar 20 22:17:15.731936 sshd[1674]: Accepted publickey for core from 172.24.4.1 port 46706 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:17:15.734573 sshd-session[1674]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:17:15.746362 systemd-logind[1460]: New session 7 of user core.
Mar 20 22:17:15.756104 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 20 22:17:16.344415 sshd[1677]: Connection closed by 172.24.4.1 port 46706
Mar 20 22:17:16.344274 sshd-session[1674]: pam_unix(sshd:session): session closed for user core
Mar 20 22:17:16.361651 systemd[1]: sshd@4-172.24.4.53:22-172.24.4.1:46706.service: Deactivated successfully.
Mar 20 22:17:16.364930 systemd[1]: session-7.scope: Deactivated successfully.
Mar 20 22:17:16.366545 systemd-logind[1460]: Session 7 logged out. Waiting for processes to exit.
Mar 20 22:17:16.370510 systemd[1]: Started sshd@5-172.24.4.53:22-172.24.4.1:46716.service - OpenSSH per-connection server daemon (172.24.4.1:46716).
Mar 20 22:17:16.372978 systemd-logind[1460]: Removed session 7.
Mar 20 22:17:17.841913 sshd[1682]: Accepted publickey for core from 172.24.4.1 port 46716 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:17:17.845235 sshd-session[1682]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:17:17.856256 systemd-logind[1460]: New session 8 of user core.
Mar 20 22:17:17.866057 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 20 22:17:18.381048 sshd[1685]: Connection closed by 172.24.4.1 port 46716
Mar 20 22:17:18.382045 sshd-session[1682]: pam_unix(sshd:session): session closed for user core
Mar 20 22:17:18.396527 systemd[1]: sshd@5-172.24.4.53:22-172.24.4.1:46716.service: Deactivated successfully.
Mar 20 22:17:18.399620 systemd[1]: session-8.scope: Deactivated successfully.
Mar 20 22:17:18.403195 systemd-logind[1460]: Session 8 logged out. Waiting for processes to exit.
Mar 20 22:17:18.406452 systemd[1]: Started sshd@6-172.24.4.53:22-172.24.4.1:46724.service - OpenSSH per-connection server daemon (172.24.4.1:46724).
Mar 20 22:17:18.409205 systemd-logind[1460]: Removed session 8.
Mar 20 22:17:19.585582 sshd[1690]: Accepted publickey for core from 172.24.4.1 port 46724 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:17:19.588741 sshd-session[1690]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:17:19.601878 systemd-logind[1460]: New session 9 of user core.
Mar 20 22:17:19.610170 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 20 22:17:20.076045 sudo[1694]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 20 22:17:20.076667 sudo[1694]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 20 22:17:20.097997 sudo[1694]: pam_unix(sudo:session): session closed for user root
Mar 20 22:17:20.334929 sshd[1693]: Connection closed by 172.24.4.1 port 46724
Mar 20 22:17:20.336804 sshd-session[1690]: pam_unix(sshd:session): session closed for user core
Mar 20 22:17:20.357569 systemd[1]: sshd@6-172.24.4.53:22-172.24.4.1:46724.service: Deactivated successfully.
Mar 20 22:17:20.361581 systemd[1]: session-9.scope: Deactivated successfully.
Mar 20 22:17:20.366148 systemd-logind[1460]: Session 9 logged out. Waiting for processes to exit.
Mar 20 22:17:20.369445 systemd[1]: Started sshd@7-172.24.4.53:22-172.24.4.1:46740.service - OpenSSH per-connection server daemon (172.24.4.1:46740).
Mar 20 22:17:20.374020 systemd-logind[1460]: Removed session 9.
Mar 20 22:17:21.246627 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 20 22:17:21.250847 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 22:17:21.515271 sshd[1699]: Accepted publickey for core from 172.24.4.1 port 46740 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:17:21.519850 sshd-session[1699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:17:21.539840 systemd-logind[1460]: New session 10 of user core.
Mar 20 22:17:21.546417 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 20 22:17:21.595021 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 22:17:21.606133 (kubelet)[1710]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 22:17:21.649856 kubelet[1710]: E0320 22:17:21.649818 1710 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 22:17:21.653894 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 22:17:21.654177 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 22:17:21.654949 systemd[1]: kubelet.service: Consumed 238ms CPU time, 95.6M memory peak.
Mar 20 22:17:21.942917 sudo[1720]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 20 22:17:21.943557 sudo[1720]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 20 22:17:21.951632 sudo[1720]: pam_unix(sudo:session): session closed for user root
Mar 20 22:17:21.962845 sudo[1719]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 20 22:17:21.963490 sudo[1719]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 20 22:17:21.984006 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 20 22:17:22.058436 augenrules[1742]: No rules
Mar 20 22:17:22.059503 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 20 22:17:22.059983 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 20 22:17:22.062634 sudo[1719]: pam_unix(sudo:session): session closed for user root
Mar 20 22:17:22.211613 sshd[1705]: Connection closed by 172.24.4.1 port 46740
Mar 20 22:17:22.211200 sshd-session[1699]: pam_unix(sshd:session): session closed for user core
Mar 20 22:17:22.228044 systemd[1]: sshd@7-172.24.4.53:22-172.24.4.1:46740.service: Deactivated successfully.
Mar 20 22:17:22.231336 systemd[1]: session-10.scope: Deactivated successfully.
Mar 20 22:17:22.233145 systemd-logind[1460]: Session 10 logged out. Waiting for processes to exit.
Mar 20 22:17:22.237152 systemd[1]: Started sshd@8-172.24.4.53:22-172.24.4.1:46748.service - OpenSSH per-connection server daemon (172.24.4.1:46748).
Mar 20 22:17:22.239382 systemd-logind[1460]: Removed session 10.
Mar 20 22:17:23.528337 sshd[1750]: Accepted publickey for core from 172.24.4.1 port 46748 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:17:23.530844 sshd-session[1750]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:17:23.542290 systemd-logind[1460]: New session 11 of user core.
Mar 20 22:17:23.553058 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 20 22:17:23.958889 sudo[1754]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 20 22:17:23.959542 sudo[1754]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 20 22:17:25.135646 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 20 22:17:25.154623 (dockerd)[1773]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 20 22:17:25.692711 dockerd[1773]: time="2025-03-20T22:17:25.692391878Z" level=info msg="Starting up"
Mar 20 22:17:25.694805 dockerd[1773]: time="2025-03-20T22:17:25.694694947Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Mar 20 22:17:25.899290 systemd[1]: var-lib-docker-metacopy\x2dcheck3434159133-merged.mount: Deactivated successfully.
Mar 20 22:17:25.945828 dockerd[1773]: time="2025-03-20T22:17:25.945613154Z" level=info msg="Loading containers: start."
Mar 20 22:17:26.228838 kernel: Initializing XFRM netlink socket
Mar 20 22:17:26.234358 systemd-timesyncd[1390]: Network configuration changed, trying to establish connection.
Mar 20 22:17:27.109335 systemd-resolved[1387]: Clock change detected. Flushing caches.
Mar 20 22:17:27.109937 systemd-timesyncd[1390]: Contacted time server 162.159.200.123:123 (2.flatcar.pool.ntp.org).
Mar 20 22:17:27.110021 systemd-timesyncd[1390]: Initial clock synchronization to Thu 2025-03-20 22:17:27.109153 UTC.
Mar 20 22:17:27.178848 systemd-networkd[1384]: docker0: Link UP
Mar 20 22:17:27.244577 dockerd[1773]: time="2025-03-20T22:17:27.244520557Z" level=info msg="Loading containers: done."
Mar 20 22:17:27.263066 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1302145653-merged.mount: Deactivated successfully.
Mar 20 22:17:27.274160 dockerd[1773]: time="2025-03-20T22:17:27.274037055Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 20 22:17:27.274160 dockerd[1773]: time="2025-03-20T22:17:27.274119590Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1
Mar 20 22:17:27.274449 dockerd[1773]: time="2025-03-20T22:17:27.274215129Z" level=info msg="Daemon has completed initialization"
Mar 20 22:17:27.349260 dockerd[1773]: time="2025-03-20T22:17:27.348864013Z" level=info msg="API listen on /run/docker.sock"
Mar 20 22:17:27.350032 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 20 22:17:29.454394 containerd[1475]: time="2025-03-20T22:17:29.454318743Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\""
Mar 20 22:17:30.149626 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2979614871.mount: Deactivated successfully.
Mar 20 22:17:32.072428 containerd[1475]: time="2025-03-20T22:17:32.072369867Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:32.074023 containerd[1475]: time="2025-03-20T22:17:32.073809807Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.11: active requests=0, bytes read=32674581"
Mar 20 22:17:32.076673 containerd[1475]: time="2025-03-20T22:17:32.076642119Z" level=info msg="ImageCreate event name:\"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:32.080837 containerd[1475]: time="2025-03-20T22:17:32.080783767Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:32.081961 containerd[1475]: time="2025-03-20T22:17:32.081802758Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.11\" with image id \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\", size \"32671373\" in 2.627404296s"
Mar 20 22:17:32.081961 containerd[1475]: time="2025-03-20T22:17:32.081841892Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\" returns image reference \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\""
Mar 20 22:17:32.102934 containerd[1475]: time="2025-03-20T22:17:32.102752229Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\""
Mar 20 22:17:32.597142 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 20 22:17:32.600277 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 22:17:32.764511 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 22:17:32.773779 (kubelet)[2046]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 22:17:32.892351 kubelet[2046]: E0320 22:17:32.892251 2046 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 22:17:32.895124 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 22:17:32.895273 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 22:17:32.895833 systemd[1]: kubelet.service: Consumed 187ms CPU time, 95.6M memory peak.
Mar 20 22:17:34.333921 containerd[1475]: time="2025-03-20T22:17:34.333728054Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:34.335329 containerd[1475]: time="2025-03-20T22:17:34.335110005Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.11: active requests=0, bytes read=29619780"
Mar 20 22:17:34.336754 containerd[1475]: time="2025-03-20T22:17:34.336689337Z" level=info msg="ImageCreate event name:\"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:34.339822 containerd[1475]: time="2025-03-20T22:17:34.339773382Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:34.341199 containerd[1475]: time="2025-03-20T22:17:34.340813602Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.11\" with image id \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\", size \"31107380\" in 2.238026508s"
Mar 20 22:17:34.341199 containerd[1475]: time="2025-03-20T22:17:34.340857084Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\" returns image reference \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\""
Mar 20 22:17:34.360185 containerd[1475]: time="2025-03-20T22:17:34.360079025Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\""
Mar 20 22:17:36.428196 containerd[1475]: time="2025-03-20T22:17:36.427141320Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:36.430437 containerd[1475]: time="2025-03-20T22:17:36.430371798Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.11: active requests=0, bytes read=17903317"
Mar 20 22:17:36.432380 containerd[1475]: time="2025-03-20T22:17:36.432350930Z" level=info msg="ImageCreate event name:\"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:36.435257 containerd[1475]: time="2025-03-20T22:17:36.435211916Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:36.436444 containerd[1475]: time="2025-03-20T22:17:36.436406476Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.11\" with image id \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\", size \"19390935\" in 2.076290222s"
Mar 20 22:17:36.436566 containerd[1475]: time="2025-03-20T22:17:36.436547771Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\" returns image reference \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\""
Mar 20 22:17:36.456872 containerd[1475]: time="2025-03-20T22:17:36.456835411Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\""
Mar 20 22:17:38.022125 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3954877286.mount: Deactivated successfully.
Mar 20 22:17:38.797712 containerd[1475]: time="2025-03-20T22:17:38.797542597Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:38.801140 containerd[1475]: time="2025-03-20T22:17:38.801009819Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.11: active requests=0, bytes read=29185380"
Mar 20 22:17:38.815189 containerd[1475]: time="2025-03-20T22:17:38.815012922Z" level=info msg="ImageCreate event name:\"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:38.819816 containerd[1475]: time="2025-03-20T22:17:38.819649728Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:38.822194 containerd[1475]: time="2025-03-20T22:17:38.821355478Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.11\" with image id \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\", repo tag \"registry.k8s.io/kube-proxy:v1.30.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\", size \"29184391\" in 2.364466937s"
Mar 20 22:17:38.822194 containerd[1475]: time="2025-03-20T22:17:38.821428224Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\" returns image reference \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\""
Mar 20 22:17:38.864852 containerd[1475]: time="2025-03-20T22:17:38.864681686Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Mar 20 22:17:41.285060 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount938466857.mount: Deactivated successfully.
Mar 20 22:17:41.398523 update_engine[1461]: I20250320 22:17:41.397070 1461 update_attempter.cc:509] Updating boot flags...
Mar 20 22:17:41.487533 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2111)
Mar 20 22:17:41.641059 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2110)
Mar 20 22:17:41.683494 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2110)
Mar 20 22:17:43.097731 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 20 22:17:43.102956 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 22:17:43.284135 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 22:17:43.293875 (kubelet)[2159]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 22:17:43.345910 kubelet[2159]: E0320 22:17:43.345872 2159 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 22:17:43.349978 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 22:17:43.350287 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 22:17:43.351114 systemd[1]: kubelet.service: Consumed 205ms CPU time, 97.2M memory peak.
Mar 20 22:17:44.267012 containerd[1475]: time="2025-03-20T22:17:44.266027849Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:44.269332 containerd[1475]: time="2025-03-20T22:17:44.269189549Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769"
Mar 20 22:17:44.271769 containerd[1475]: time="2025-03-20T22:17:44.271705818Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:44.278432 containerd[1475]: time="2025-03-20T22:17:44.278291139Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:44.281322 containerd[1475]: time="2025-03-20T22:17:44.281254276Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 5.416490166s"
Mar 20 22:17:44.281763 containerd[1475]: time="2025-03-20T22:17:44.281535233Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
Mar 20 22:17:44.326770 containerd[1475]: time="2025-03-20T22:17:44.326700230Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\""
Mar 20 22:17:44.884670 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount923033355.mount: Deactivated successfully.
Mar 20 22:17:44.897253 containerd[1475]: time="2025-03-20T22:17:44.897190986Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:44.899170 containerd[1475]: time="2025-03-20T22:17:44.899024244Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322298"
Mar 20 22:17:44.900635 containerd[1475]: time="2025-03-20T22:17:44.900583829Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:44.907372 containerd[1475]: time="2025-03-20T22:17:44.907321816Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:44.909980 containerd[1475]: time="2025-03-20T22:17:44.909092697Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 582.086534ms"
Mar 20 22:17:44.909980 containerd[1475]: time="2025-03-20T22:17:44.909164752Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\""
Mar 20 22:17:44.949756 containerd[1475]: time="2025-03-20T22:17:44.949697812Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\""
Mar 20 22:17:45.578572 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1135785493.mount: Deactivated successfully.
Mar 20 22:17:48.627609 containerd[1475]: time="2025-03-20T22:17:48.626894844Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:48.630528 containerd[1475]: time="2025-03-20T22:17:48.630383507Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238579"
Mar 20 22:17:48.632538 containerd[1475]: time="2025-03-20T22:17:48.632382296Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:48.639232 containerd[1475]: time="2025-03-20T22:17:48.639178342Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:17:48.642300 containerd[1475]: time="2025-03-20T22:17:48.642008029Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 3.692267146s"
Mar 20 22:17:48.642300 containerd[1475]: time="2025-03-20T22:17:48.642077309Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\""
Mar 20 22:17:52.603451 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 22:17:52.603765 systemd[1]: kubelet.service: Consumed 205ms CPU time, 97.2M memory peak.
Mar 20 22:17:52.606524 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 22:17:52.632818 systemd[1]: Reload requested from client PID 2317 ('systemctl') (unit session-11.scope)...
Mar 20 22:17:52.632833 systemd[1]: Reloading...
Mar 20 22:17:52.758676 zram_generator::config[2366]: No configuration found.
Mar 20 22:17:53.167803 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 20 22:17:53.307340 systemd[1]: Reloading finished in 674 ms.
Mar 20 22:17:53.371147 systemd[1]: kubelet.service: Deactivated successfully.
Mar 20 22:17:53.371430 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 22:17:53.371470 systemd[1]: kubelet.service: Consumed 105ms CPU time, 83.6M memory peak.
Mar 20 22:17:53.374001 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 22:17:53.516922 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 22:17:53.526724 (kubelet)[2431]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 20 22:17:54.041743 kubelet[2431]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 22:17:54.041743 kubelet[2431]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 20 22:17:54.041743 kubelet[2431]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 22:17:54.041743 kubelet[2431]: I0320 22:17:54.011738 2431 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 20 22:17:54.748460 kubelet[2431]: I0320 22:17:54.748415 2431 server.go:484] "Kubelet version" kubeletVersion="v1.30.1"
Mar 20 22:17:54.748460 kubelet[2431]: I0320 22:17:54.748445 2431 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 22:17:54.749331 kubelet[2431]: I0320 22:17:54.749293 2431 server.go:927] "Client rotation is on, will bootstrap in background"
Mar 20 22:17:54.783524 kubelet[2431]: I0320 22:17:54.782426 2431 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 20 22:17:54.783703 kubelet[2431]: E0320 22:17:54.783667 2431 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://172.24.4.53:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 172.24.4.53:6443: connect: connection refused
Mar 20 22:17:54.802543 kubelet[2431]: I0320 22:17:54.802519 2431 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 20 22:17:54.803379 kubelet[2431]: I0320 22:17:54.802912 2431 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 22:17:54.803379 kubelet[2431]: I0320 22:17:54.802946 2431 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-9999-0-2-b-c50fddf147.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Mar 20 22:17:54.803379 kubelet[2431]: I0320 22:17:54.803139 2431 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 22:17:54.803379 kubelet[2431]: I0320 22:17:54.803150 2431 container_manager_linux.go:301] "Creating device plugin manager"
Mar 20 22:17:54.803596 kubelet[2431]: I0320 22:17:54.803259 2431 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 22:17:54.804739 kubelet[2431]: I0320 22:17:54.804727 2431 kubelet.go:400] "Attempting to sync node with API server"
Mar 20 22:17:54.805151 kubelet[2431]: I0320 22:17:54.805135 2431 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 22:17:54.805267 kubelet[2431]: I0320 22:17:54.805248 2431 kubelet.go:312] "Adding apiserver pod source"
Mar 20 22:17:54.805359 kubelet[2431]: I0320 22:17:54.805345 2431 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 22:17:54.811930 kubelet[2431]: W0320 22:17:54.805208 2431 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.53:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-2-b-c50fddf147.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.53:6443: connect: connection refused
Mar 20 22:17:54.812108 kubelet[2431]: E0320 22:17:54.812090 2431 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.53:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-2-b-c50fddf147.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.53:6443: connect: connection refused
Mar 20 22:17:54.813894 kubelet[2431]: W0320 22:17:54.813820 2431 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.53:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.53:6443: connect: connection refused
Mar 20 22:17:54.813894 kubelet[2431]: E0320 22:17:54.813875 2431 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.24.4.53:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.53:6443: connect: connection refused
Mar 20 22:17:54.814593 kubelet[2431]: I0320 22:17:54.814319 2431 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
Mar 20 22:17:54.817510 kubelet[2431]: I0320 22:17:54.816687 2431 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 22:17:54.817510 kubelet[2431]: W0320 22:17:54.816740 2431 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 20 22:17:54.817993 kubelet[2431]: I0320 22:17:54.817976 2431 server.go:1264] "Started kubelet"
Mar 20 22:17:54.819723 kubelet[2431]: I0320 22:17:54.819703 2431 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 20 22:17:54.825387 kubelet[2431]: I0320 22:17:54.825339 2431 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 22:17:54.825827 kubelet[2431]: E0320 22:17:54.825698 2431 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.24.4.53:6443/api/v1/namespaces/default/events\": dial tcp 172.24.4.53:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-9999-0-2-b-c50fddf147.novalocal.182ea2c5766b2950 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-9999-0-2-b-c50fddf147.novalocal,UID:ci-9999-0-2-b-c50fddf147.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-9999-0-2-b-c50fddf147.novalocal,},FirstTimestamp:2025-03-20 22:17:54.81795208 +0000 UTC m=+1.285673772,LastTimestamp:2025-03-20 22:17:54.81795208 +0000 UTC m=+1.285673772,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-9999-0-2-b-c50fddf147.novalocal,}"
Mar 20 22:17:54.826802 kubelet[2431]: I0320 22:17:54.826775 2431 server.go:455] "Adding debug handlers to kubelet server"
Mar 20 22:17:54.828140 kubelet[2431]: I0320 22:17:54.828026 2431 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 22:17:54.828374 kubelet[2431]: I0320 22:17:54.828275 2431 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 20 22:17:54.831082 kubelet[2431]: I0320 22:17:54.829934 2431 volume_manager.go:291] "Starting Kubelet Volume Manager"
Mar 20 22:17:54.831082 kubelet[2431]: I0320 22:17:54.830089 2431 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
Mar 20 22:17:54.831082 kubelet[2431]: I0320 22:17:54.830150 2431 reconciler.go:26] "Reconciler: start to sync state"
Mar 20 22:17:54.832418 kubelet[2431]: I0320 22:17:54.832399 2431 factory.go:221] Registration of the systemd container factory successfully
Mar 20 22:17:54.832607 kubelet[2431]: I0320 22:17:54.832589 2431 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 20 22:17:54.833098 kubelet[2431]: W0320 22:17:54.833062 2431 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.53:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.53:6443: connect: connection refused
Mar 20 22:17:54.833189 kubelet[2431]: E0320 22:17:54.833178 2431 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.53:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.53:6443: connect: connection refused
Mar 20 22:17:54.833329 kubelet[2431]: E0320 22:17:54.833304 2431 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.53:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-2-b-c50fddf147.novalocal?timeout=10s\": dial tcp 172.24.4.53:6443: connect: connection refused" interval="200ms"
Mar 20 22:17:54.834759 kubelet[2431]: I0320 22:17:54.834739 2431 factory.go:221] Registration of the containerd container factory successfully
Mar 20 22:17:54.841813 kubelet[2431]: I0320 22:17:54.840981 2431 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 20 22:17:54.842205 kubelet[2431]: I0320 22:17:54.842180 2431 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 20 22:17:54.842205 kubelet[2431]: I0320 22:17:54.842208 2431 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 20 22:17:54.842297 kubelet[2431]: I0320 22:17:54.842225 2431 kubelet.go:2337] "Starting kubelet main sync loop"
Mar 20 22:17:54.842297 kubelet[2431]: E0320 22:17:54.842259 2431 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 20 22:17:54.849413 kubelet[2431]: W0320 22:17:54.849348 2431 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.53:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.53:6443: connect: connection refused
Mar 20 22:17:54.849413 kubelet[2431]: E0320 22:17:54.849419 2431 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.53:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.53:6443: connect: connection refused
Mar 20 22:17:54.860685 kubelet[2431]: I0320 22:17:54.860650 2431 cpu_manager.go:214] "Starting CPU manager" policy="none"
Mar 20 22:17:54.860685 kubelet[2431]: I0320 22:17:54.860673 2431 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Mar 20 22:17:54.860685 kubelet[2431]: I0320 22:17:54.860692 2431 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 22:17:54.869715 kubelet[2431]: I0320 22:17:54.869676 2431 policy_none.go:49] "None policy: Start"
Mar 20 22:17:54.870351 kubelet[2431]: I0320 22:17:54.870320 2431 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 20 22:17:54.870351 kubelet[2431]: I0320 22:17:54.870343 2431 state_mem.go:35] "Initializing new in-memory state store"
Mar 20 22:17:54.879576 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 20 22:17:54.897888 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 20 22:17:54.907737 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 20 22:17:54.921849 kubelet[2431]: I0320 22:17:54.920953 2431 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 20 22:17:54.921849 kubelet[2431]: I0320 22:17:54.921272 2431 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 20 22:17:54.921849 kubelet[2431]: I0320 22:17:54.921514 2431 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 20 22:17:54.926284 kubelet[2431]: E0320 22:17:54.926214 2431 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-9999-0-2-b-c50fddf147.novalocal\" not found"
Mar 20 22:17:54.934944 kubelet[2431]: I0320 22:17:54.934469 2431 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:54.936191 kubelet[2431]: E0320 22:17:54.936066 2431 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.53:6443/api/v1/nodes\": dial tcp 172.24.4.53:6443: connect: connection refused" node="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:54.943028 kubelet[2431]: I0320 22:17:54.942911 2431 topology_manager.go:215] "Topology Admit Handler" podUID="aef60593acf6c616e939778c1a5c8520" podNamespace="kube-system" podName="kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:54.947068 kubelet[2431]: I0320 22:17:54.946702 2431 topology_manager.go:215] "Topology Admit Handler" podUID="8998d043404daac6519ecba4abea29d9" podNamespace="kube-system" podName="kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:54.949808 kubelet[2431]: I0320 22:17:54.949760 2431 topology_manager.go:215] "Topology Admit Handler" podUID="d38092c0549ca8268027a516ffa5d736" podNamespace="kube-system" podName="kube-scheduler-ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:54.966916 systemd[1]: Created slice kubepods-burstable-podaef60593acf6c616e939778c1a5c8520.slice - libcontainer container kubepods-burstable-podaef60593acf6c616e939778c1a5c8520.slice.
Mar 20 22:17:54.991437 systemd[1]: Created slice kubepods-burstable-pod8998d043404daac6519ecba4abea29d9.slice - libcontainer container kubepods-burstable-pod8998d043404daac6519ecba4abea29d9.slice.
Mar 20 22:17:55.004388 systemd[1]: Created slice kubepods-burstable-podd38092c0549ca8268027a516ffa5d736.slice - libcontainer container kubepods-burstable-podd38092c0549ca8268027a516ffa5d736.slice.
Mar 20 22:17:55.035073 kubelet[2431]: E0320 22:17:55.035004 2431 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.53:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-2-b-c50fddf147.novalocal?timeout=10s\": dial tcp 172.24.4.53:6443: connect: connection refused" interval="400ms"
Mar 20 22:17:55.131653 kubelet[2431]: I0320 22:17:55.131591 2431 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8998d043404daac6519ecba4abea29d9-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"8998d043404daac6519ecba4abea29d9\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:55.132402 kubelet[2431]: I0320 22:17:55.132275 2431 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d38092c0549ca8268027a516ffa5d736-kubeconfig\") pod \"kube-scheduler-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"d38092c0549ca8268027a516ffa5d736\") " pod="kube-system/kube-scheduler-ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:55.132402 kubelet[2431]: I0320 22:17:55.132358 2431 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/aef60593acf6c616e939778c1a5c8520-ca-certs\") pod \"kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"aef60593acf6c616e939778c1a5c8520\") " pod="kube-system/kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:55.132646 kubelet[2431]: I0320 22:17:55.132404 2431 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/aef60593acf6c616e939778c1a5c8520-k8s-certs\") pod \"kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"aef60593acf6c616e939778c1a5c8520\") " pod="kube-system/kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:55.132646 kubelet[2431]: I0320 22:17:55.132450 2431 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8998d043404daac6519ecba4abea29d9-kubeconfig\") pod \"kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"8998d043404daac6519ecba4abea29d9\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:55.132646 kubelet[2431]: I0320 22:17:55.132567 2431 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/8998d043404daac6519ecba4abea29d9-flexvolume-dir\") pod \"kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"8998d043404daac6519ecba4abea29d9\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:55.132842 kubelet[2431]: I0320 22:17:55.132659 2431 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8998d043404daac6519ecba4abea29d9-k8s-certs\") pod \"kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"8998d043404daac6519ecba4abea29d9\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:55.132842 kubelet[2431]: I0320 22:17:55.132772 2431 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/aef60593acf6c616e939778c1a5c8520-usr-share-ca-certificates\") pod \"kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"aef60593acf6c616e939778c1a5c8520\") " pod="kube-system/kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:55.132963 kubelet[2431]: I0320 22:17:55.132861 2431 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8998d043404daac6519ecba4abea29d9-ca-certs\") pod \"kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"8998d043404daac6519ecba4abea29d9\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:55.139692 kubelet[2431]: I0320 22:17:55.139368 2431 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:55.140369 kubelet[2431]: E0320 22:17:55.140261 2431 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.53:6443/api/v1/nodes\": dial tcp 172.24.4.53:6443: connect: connection refused" node="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:55.287830 containerd[1475]: time="2025-03-20T22:17:55.287559509Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal,Uid:aef60593acf6c616e939778c1a5c8520,Namespace:kube-system,Attempt:0,}"
Mar 20 22:17:55.299113 containerd[1475]: time="2025-03-20T22:17:55.298930716Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal,Uid:8998d043404daac6519ecba4abea29d9,Namespace:kube-system,Attempt:0,}"
Mar 20 22:17:55.316749 containerd[1475]: time="2025-03-20T22:17:55.316684212Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-9999-0-2-b-c50fddf147.novalocal,Uid:d38092c0549ca8268027a516ffa5d736,Namespace:kube-system,Attempt:0,}"
Mar 20 22:17:55.436417 kubelet[2431]: E0320 22:17:55.436078 2431 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.53:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-2-b-c50fddf147.novalocal?timeout=10s\": dial tcp 172.24.4.53:6443: connect: connection refused" interval="800ms"
Mar 20 22:17:55.544531 kubelet[2431]: I0320 22:17:55.543459 2431 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:55.544531 kubelet[2431]: E0320 22:17:55.544185 2431 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.53:6443/api/v1/nodes\": dial tcp 172.24.4.53:6443: connect: connection refused" node="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:17:55.772653 kubelet[2431]: W0320 22:17:55.772540 2431 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.53:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.53:6443: connect: connection refused
Mar 20 22:17:55.772962 kubelet[2431]: E0320 22:17:55.772919 2431 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.53:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.53:6443: connect: connection refused
Mar 20 22:17:55.901175 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1940196207.mount: Deactivated successfully.
Mar 20 22:17:55.913437 containerd[1475]: time="2025-03-20T22:17:55.913195118Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 20 22:17:55.917763 containerd[1475]: time="2025-03-20T22:17:55.917615910Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146"
Mar 20 22:17:55.921322 containerd[1475]: time="2025-03-20T22:17:55.920946536Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 20 22:17:55.923628 containerd[1475]: time="2025-03-20T22:17:55.923552564Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 20 22:17:55.926674 containerd[1475]: time="2025-03-20T22:17:55.926553212Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 20 22:17:55.928422 containerd[1475]: time="2025-03-20T22:17:55.928304005Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Mar 20 22:17:55.931794 containerd[1475]: time="2025-03-20T22:17:55.931692430Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Mar 20 22:17:55.933571 containerd[1475]: time="2025-03-20T22:17:55.933441511Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 20 22:17:55.936361 containerd[1475]: time="2025-03-20T22:17:55.935241877Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 640.979917ms"
Mar 20 22:17:55.944438 containerd[1475]: time="2025-03-20T22:17:55.943877724Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 617.315415ms"
Mar 20 22:17:55.953046 containerd[1475]: time="2025-03-20T22:17:55.952765433Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 650.198838ms"
Mar 20 22:17:55.978071 containerd[1475]: time="2025-03-20T22:17:55.977971947Z" level=info msg="connecting to shim a7ec6eb60bfd89609ba68ebf81100b0890d97dd5c7f1436db7e1b4203df4ae72" address="unix:///run/containerd/s/334d26f8bf7f3b00bda0e4fd85e870aa8d62f01c48231a428ae9faf725d12c35" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:17:56.002030 kubelet[2431]: W0320 22:17:56.001410 2431 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.53:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-2-b-c50fddf147.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.53:6443: connect: connection refused
Mar 20 22:17:56.002030 kubelet[2431]: E0320 22:17:56.001578 2431 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.53:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-2-b-c50fddf147.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.53:6443: connect: connection refused
Mar 20 22:17:56.017540 containerd[1475]: time="2025-03-20T22:17:56.016918051Z" level=info msg="connecting to shim f301eb381fc8f573bde6e03a0e28a74b3925cf4fe056fd607564edfd03115e03" address="unix:///run/containerd/s/f95350b5e8abbc03c996c3aad956846e3cdebc80dd6bc9ace5192de4e2b2d865" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:17:56.022256 kubelet[2431]: W0320 22:17:56.021208 2431 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.53:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.53:6443: connect: connection refused
Mar 20 22:17:56.022256 kubelet[2431]: E0320 22:17:56.021526 2431 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.24.4.53:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.53:6443: connect: connection refused
Mar 20 22:17:56.025957 systemd[1]: Started cri-containerd-a7ec6eb60bfd89609ba68ebf81100b0890d97dd5c7f1436db7e1b4203df4ae72.scope - libcontainer container a7ec6eb60bfd89609ba68ebf81100b0890d97dd5c7f1436db7e1b4203df4ae72.
Mar 20 22:17:56.043256 containerd[1475]: time="2025-03-20T22:17:56.043194112Z" level=info msg="connecting to shim 752f68855ba837eb1cc56bce2aa1688a989fa13cbd89eedcd600f8702cd40034" address="unix:///run/containerd/s/f996ba48ca627afcc8ea446efb3ec68b5f88edc8b8ec2ceab1062ba836c305ab" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:17:56.059728 systemd[1]: Started cri-containerd-f301eb381fc8f573bde6e03a0e28a74b3925cf4fe056fd607564edfd03115e03.scope - libcontainer container f301eb381fc8f573bde6e03a0e28a74b3925cf4fe056fd607564edfd03115e03.
Mar 20 22:17:56.075807 systemd[1]: Started cri-containerd-752f68855ba837eb1cc56bce2aa1688a989fa13cbd89eedcd600f8702cd40034.scope - libcontainer container 752f68855ba837eb1cc56bce2aa1688a989fa13cbd89eedcd600f8702cd40034. Mar 20 22:17:56.134412 containerd[1475]: time="2025-03-20T22:17:56.134358009Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal,Uid:aef60593acf6c616e939778c1a5c8520,Namespace:kube-system,Attempt:0,} returns sandbox id \"a7ec6eb60bfd89609ba68ebf81100b0890d97dd5c7f1436db7e1b4203df4ae72\"" Mar 20 22:17:56.140600 containerd[1475]: time="2025-03-20T22:17:56.139934789Z" level=info msg="CreateContainer within sandbox \"a7ec6eb60bfd89609ba68ebf81100b0890d97dd5c7f1436db7e1b4203df4ae72\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 20 22:17:56.143129 kubelet[2431]: W0320 22:17:56.143070 2431 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.53:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.53:6443: connect: connection refused Mar 20 22:17:56.143129 kubelet[2431]: E0320 22:17:56.143133 2431 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.53:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.53:6443: connect: connection refused Mar 20 22:17:56.166875 containerd[1475]: time="2025-03-20T22:17:56.165325339Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-9999-0-2-b-c50fddf147.novalocal,Uid:d38092c0549ca8268027a516ffa5d736,Namespace:kube-system,Attempt:0,} returns sandbox id \"f301eb381fc8f573bde6e03a0e28a74b3925cf4fe056fd607564edfd03115e03\"" Mar 20 22:17:56.168676 containerd[1475]: time="2025-03-20T22:17:56.168635537Z" level=info msg="CreateContainer within sandbox 
\"f301eb381fc8f573bde6e03a0e28a74b3925cf4fe056fd607564edfd03115e03\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 20 22:17:56.237827 kubelet[2431]: E0320 22:17:56.237709 2431 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.53:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-2-b-c50fddf147.novalocal?timeout=10s\": dial tcp 172.24.4.53:6443: connect: connection refused" interval="1.6s" Mar 20 22:17:56.275569 containerd[1475]: time="2025-03-20T22:17:56.275466679Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal,Uid:8998d043404daac6519ecba4abea29d9,Namespace:kube-system,Attempt:0,} returns sandbox id \"752f68855ba837eb1cc56bce2aa1688a989fa13cbd89eedcd600f8702cd40034\"" Mar 20 22:17:56.285548 containerd[1475]: time="2025-03-20T22:17:56.285427130Z" level=info msg="CreateContainer within sandbox \"752f68855ba837eb1cc56bce2aa1688a989fa13cbd89eedcd600f8702cd40034\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 20 22:17:56.347555 kubelet[2431]: I0320 22:17:56.347439 2431 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:17:56.348214 kubelet[2431]: E0320 22:17:56.348163 2431 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.53:6443/api/v1/nodes\": dial tcp 172.24.4.53:6443: connect: connection refused" node="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:17:56.419174 containerd[1475]: time="2025-03-20T22:17:56.418871785Z" level=info msg="Container d3c6e08b311d5e8dbd22677f37d534abacb4a878b6e2efd2ecb8edb9c37e05a5: CDI devices from CRI Config.CDIDevices: []" Mar 20 22:17:56.426536 containerd[1475]: time="2025-03-20T22:17:56.426398682Z" level=info msg="Container ec85359ee40556d3bf5f89b5f663fa62deb52332aabf201c91d30bd149b0ca5f: CDI devices from CRI Config.CDIDevices: []" Mar 20 
22:17:56.432978 containerd[1475]: time="2025-03-20T22:17:56.432892791Z" level=info msg="Container 152748c20ade808bb8aedd9d5c5c38b06678186adac5431727e421c124ae0574: CDI devices from CRI Config.CDIDevices: []" Mar 20 22:17:56.446326 containerd[1475]: time="2025-03-20T22:17:56.446039799Z" level=info msg="CreateContainer within sandbox \"f301eb381fc8f573bde6e03a0e28a74b3925cf4fe056fd607564edfd03115e03\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"d3c6e08b311d5e8dbd22677f37d534abacb4a878b6e2efd2ecb8edb9c37e05a5\"" Mar 20 22:17:56.447790 containerd[1475]: time="2025-03-20T22:17:56.447322053Z" level=info msg="StartContainer for \"d3c6e08b311d5e8dbd22677f37d534abacb4a878b6e2efd2ecb8edb9c37e05a5\"" Mar 20 22:17:56.452311 containerd[1475]: time="2025-03-20T22:17:56.451867629Z" level=info msg="connecting to shim d3c6e08b311d5e8dbd22677f37d534abacb4a878b6e2efd2ecb8edb9c37e05a5" address="unix:///run/containerd/s/f95350b5e8abbc03c996c3aad956846e3cdebc80dd6bc9ace5192de4e2b2d865" protocol=ttrpc version=3 Mar 20 22:17:56.474566 containerd[1475]: time="2025-03-20T22:17:56.470761915Z" level=info msg="CreateContainer within sandbox \"a7ec6eb60bfd89609ba68ebf81100b0890d97dd5c7f1436db7e1b4203df4ae72\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"ec85359ee40556d3bf5f89b5f663fa62deb52332aabf201c91d30bd149b0ca5f\"" Mar 20 22:17:56.476395 containerd[1475]: time="2025-03-20T22:17:56.476294561Z" level=info msg="StartContainer for \"ec85359ee40556d3bf5f89b5f663fa62deb52332aabf201c91d30bd149b0ca5f\"" Mar 20 22:17:56.495855 containerd[1475]: time="2025-03-20T22:17:56.495802048Z" level=info msg="connecting to shim ec85359ee40556d3bf5f89b5f663fa62deb52332aabf201c91d30bd149b0ca5f" address="unix:///run/containerd/s/334d26f8bf7f3b00bda0e4fd85e870aa8d62f01c48231a428ae9faf725d12c35" protocol=ttrpc version=3 Mar 20 22:17:56.500515 containerd[1475]: time="2025-03-20T22:17:56.500445787Z" level=info msg="CreateContainer within sandbox 
\"752f68855ba837eb1cc56bce2aa1688a989fa13cbd89eedcd600f8702cd40034\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"152748c20ade808bb8aedd9d5c5c38b06678186adac5431727e421c124ae0574\"" Mar 20 22:17:56.501159 containerd[1475]: time="2025-03-20T22:17:56.501132655Z" level=info msg="StartContainer for \"152748c20ade808bb8aedd9d5c5c38b06678186adac5431727e421c124ae0574\"" Mar 20 22:17:56.501707 systemd[1]: Started cri-containerd-d3c6e08b311d5e8dbd22677f37d534abacb4a878b6e2efd2ecb8edb9c37e05a5.scope - libcontainer container d3c6e08b311d5e8dbd22677f37d534abacb4a878b6e2efd2ecb8edb9c37e05a5. Mar 20 22:17:56.504326 containerd[1475]: time="2025-03-20T22:17:56.504251655Z" level=info msg="connecting to shim 152748c20ade808bb8aedd9d5c5c38b06678186adac5431727e421c124ae0574" address="unix:///run/containerd/s/f996ba48ca627afcc8ea446efb3ec68b5f88edc8b8ec2ceab1062ba836c305ab" protocol=ttrpc version=3 Mar 20 22:17:56.522861 systemd[1]: Started cri-containerd-ec85359ee40556d3bf5f89b5f663fa62deb52332aabf201c91d30bd149b0ca5f.scope - libcontainer container ec85359ee40556d3bf5f89b5f663fa62deb52332aabf201c91d30bd149b0ca5f. Mar 20 22:17:56.535630 systemd[1]: Started cri-containerd-152748c20ade808bb8aedd9d5c5c38b06678186adac5431727e421c124ae0574.scope - libcontainer container 152748c20ade808bb8aedd9d5c5c38b06678186adac5431727e421c124ae0574. 
Mar 20 22:17:56.600051 containerd[1475]: time="2025-03-20T22:17:56.599281503Z" level=info msg="StartContainer for \"d3c6e08b311d5e8dbd22677f37d534abacb4a878b6e2efd2ecb8edb9c37e05a5\" returns successfully" Mar 20 22:17:56.625753 containerd[1475]: time="2025-03-20T22:17:56.623642011Z" level=info msg="StartContainer for \"ec85359ee40556d3bf5f89b5f663fa62deb52332aabf201c91d30bd149b0ca5f\" returns successfully" Mar 20 22:17:56.639502 containerd[1475]: time="2025-03-20T22:17:56.638724939Z" level=info msg="StartContainer for \"152748c20ade808bb8aedd9d5c5c38b06678186adac5431727e421c124ae0574\" returns successfully" Mar 20 22:17:57.951504 kubelet[2431]: I0320 22:17:57.950528 2431 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:17:58.937500 kubelet[2431]: I0320 22:17:58.937282 2431 kubelet_node_status.go:76] "Successfully registered node" node="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:17:59.208800 kubelet[2431]: E0320 22:17:59.208575 2431 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:17:59.815922 kubelet[2431]: I0320 22:17:59.815870 2431 apiserver.go:52] "Watching apiserver" Mar 20 22:17:59.830865 kubelet[2431]: I0320 22:17:59.830743 2431 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 20 22:18:02.111339 systemd[1]: Reload requested from client PID 2702 ('systemctl') (unit session-11.scope)... Mar 20 22:18:02.111375 systemd[1]: Reloading... Mar 20 22:18:02.237550 zram_generator::config[2747]: No configuration found. Mar 20 22:18:02.403219 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
Mar 20 22:18:02.559709 systemd[1]: Reloading finished in 447 ms. Mar 20 22:18:02.587412 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 20 22:18:02.590696 kubelet[2431]: E0320 22:18:02.587198 2431 event.go:319] "Unable to write event (broadcaster is shut down)" event="&Event{ObjectMeta:{ci-9999-0-2-b-c50fddf147.novalocal.182ea2c5766b2950 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-9999-0-2-b-c50fddf147.novalocal,UID:ci-9999-0-2-b-c50fddf147.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-9999-0-2-b-c50fddf147.novalocal,},FirstTimestamp:2025-03-20 22:17:54.81795208 +0000 UTC m=+1.285673772,LastTimestamp:2025-03-20 22:17:54.81795208 +0000 UTC m=+1.285673772,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-9999-0-2-b-c50fddf147.novalocal,}" Mar 20 22:18:02.592890 kubelet[2431]: I0320 22:18:02.591506 2431 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 20 22:18:02.595942 systemd[1]: kubelet.service: Deactivated successfully. Mar 20 22:18:02.596125 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 20 22:18:02.596167 systemd[1]: kubelet.service: Consumed 1.383s CPU time, 115.6M memory peak. Mar 20 22:18:02.601660 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 20 22:18:02.912777 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 20 22:18:02.930281 (kubelet)[2812]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 20 22:18:02.999542 kubelet[2812]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 22:18:03.000421 kubelet[2812]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 22:18:03.000421 kubelet[2812]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 22:18:03.000421 kubelet[2812]: I0320 22:18:02.999940 2812 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 22:18:03.004926 kubelet[2812]: I0320 22:18:03.004865 2812 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 20 22:18:03.004926 kubelet[2812]: I0320 22:18:03.004890 2812 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 20 22:18:03.005132 kubelet[2812]: I0320 22:18:03.005065 2812 server.go:927] "Client rotation is on, will bootstrap in background" Mar 20 22:18:03.006944 kubelet[2812]: I0320 22:18:03.006895 2812 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 20 22:18:03.008397 kubelet[2812]: I0320 22:18:03.008083 2812 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 20 22:18:03.016193 kubelet[2812]: I0320 22:18:03.016145 2812 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 20 22:18:03.016383 kubelet[2812]: I0320 22:18:03.016321 2812 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 22:18:03.019570 kubelet[2812]: I0320 22:18:03.016352 2812 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-9999-0-2-b-c50fddf147.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 20 22:18:03.019570 kubelet[2812]: I0320 22:18:03.019366 2812 topology_manager.go:138] "Creating topology manager with none 
policy" Mar 20 22:18:03.019570 kubelet[2812]: I0320 22:18:03.019377 2812 container_manager_linux.go:301] "Creating device plugin manager" Mar 20 22:18:03.019570 kubelet[2812]: I0320 22:18:03.019421 2812 state_mem.go:36] "Initialized new in-memory state store" Mar 20 22:18:03.020258 kubelet[2812]: I0320 22:18:03.019609 2812 kubelet.go:400] "Attempting to sync node with API server" Mar 20 22:18:03.022033 kubelet[2812]: I0320 22:18:03.020617 2812 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 20 22:18:03.022033 kubelet[2812]: I0320 22:18:03.020643 2812 kubelet.go:312] "Adding apiserver pod source" Mar 20 22:18:03.022033 kubelet[2812]: I0320 22:18:03.020657 2812 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 20 22:18:03.023544 kubelet[2812]: I0320 22:18:03.022864 2812 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 20 22:18:03.023544 kubelet[2812]: I0320 22:18:03.023019 2812 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 22:18:03.023544 kubelet[2812]: I0320 22:18:03.023374 2812 server.go:1264] "Started kubelet" Mar 20 22:18:03.025239 kubelet[2812]: I0320 22:18:03.025201 2812 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 22:18:03.034348 kubelet[2812]: I0320 22:18:03.033254 2812 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 22:18:03.035096 kubelet[2812]: I0320 22:18:03.035061 2812 server.go:455] "Adding debug handlers to kubelet server" Mar 20 22:18:03.035916 kubelet[2812]: I0320 22:18:03.035862 2812 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 22:18:03.036220 kubelet[2812]: I0320 22:18:03.036035 2812 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 22:18:03.042534 kubelet[2812]: I0320 
22:18:03.036427 2812 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 20 22:18:03.042534 kubelet[2812]: I0320 22:18:03.037420 2812 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 20 22:18:03.042534 kubelet[2812]: I0320 22:18:03.037554 2812 reconciler.go:26] "Reconciler: start to sync state" Mar 20 22:18:03.058753 kubelet[2812]: I0320 22:18:03.058709 2812 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 22:18:03.061620 kubelet[2812]: I0320 22:18:03.061381 2812 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 20 22:18:03.061620 kubelet[2812]: I0320 22:18:03.061410 2812 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 22:18:03.061620 kubelet[2812]: I0320 22:18:03.061429 2812 kubelet.go:2337] "Starting kubelet main sync loop" Mar 20 22:18:03.061620 kubelet[2812]: E0320 22:18:03.061470 2812 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 22:18:03.066128 kubelet[2812]: I0320 22:18:03.065568 2812 factory.go:221] Registration of the systemd container factory successfully Mar 20 22:18:03.066128 kubelet[2812]: I0320 22:18:03.065696 2812 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 20 22:18:03.066309 kubelet[2812]: E0320 22:18:03.066286 2812 kubelet.go:1467] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 20 22:18:03.078899 kubelet[2812]: I0320 22:18:03.078858 2812 factory.go:221] Registration of the containerd container factory successfully Mar 20 22:18:03.132724 kubelet[2812]: I0320 22:18:03.132457 2812 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 20 22:18:03.132724 kubelet[2812]: I0320 22:18:03.132490 2812 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 20 22:18:03.132724 kubelet[2812]: I0320 22:18:03.132506 2812 state_mem.go:36] "Initialized new in-memory state store" Mar 20 22:18:03.132724 kubelet[2812]: I0320 22:18:03.132637 2812 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 20 22:18:03.132724 kubelet[2812]: I0320 22:18:03.132648 2812 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 20 22:18:03.132724 kubelet[2812]: I0320 22:18:03.132665 2812 policy_none.go:49] "None policy: Start" Mar 20 22:18:03.133561 kubelet[2812]: I0320 22:18:03.133549 2812 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 22:18:03.133671 kubelet[2812]: I0320 22:18:03.133661 2812 state_mem.go:35] "Initializing new in-memory state store" Mar 20 22:18:03.134375 kubelet[2812]: I0320 22:18:03.133820 2812 state_mem.go:75] "Updated machine memory state" Mar 20 22:18:03.139078 kubelet[2812]: I0320 22:18:03.139054 2812 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 22:18:03.139234 kubelet[2812]: I0320 22:18:03.139197 2812 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 22:18:03.139307 kubelet[2812]: I0320 22:18:03.139288 2812 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 22:18:03.139361 kubelet[2812]: I0320 22:18:03.139292 2812 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:03.162122 kubelet[2812]: 
I0320 22:18:03.162064 2812 topology_manager.go:215] "Topology Admit Handler" podUID="aef60593acf6c616e939778c1a5c8520" podNamespace="kube-system" podName="kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:03.162232 kubelet[2812]: I0320 22:18:03.162181 2812 topology_manager.go:215] "Topology Admit Handler" podUID="8998d043404daac6519ecba4abea29d9" podNamespace="kube-system" podName="kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:03.162232 kubelet[2812]: I0320 22:18:03.162216 2812 topology_manager.go:215] "Topology Admit Handler" podUID="d38092c0549ca8268027a516ffa5d736" podNamespace="kube-system" podName="kube-scheduler-ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:03.238372 kubelet[2812]: I0320 22:18:03.238095 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/aef60593acf6c616e939778c1a5c8520-usr-share-ca-certificates\") pod \"kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"aef60593acf6c616e939778c1a5c8520\") " pod="kube-system/kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:03.238372 kubelet[2812]: I0320 22:18:03.238180 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8998d043404daac6519ecba4abea29d9-ca-certs\") pod \"kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"8998d043404daac6519ecba4abea29d9\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:03.238372 kubelet[2812]: I0320 22:18:03.238243 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/8998d043404daac6519ecba4abea29d9-flexvolume-dir\") pod \"kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"8998d043404daac6519ecba4abea29d9\") " 
pod="kube-system/kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:03.238372 kubelet[2812]: I0320 22:18:03.238330 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8998d043404daac6519ecba4abea29d9-k8s-certs\") pod \"kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"8998d043404daac6519ecba4abea29d9\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:03.239299 kubelet[2812]: I0320 22:18:03.238972 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d38092c0549ca8268027a516ffa5d736-kubeconfig\") pod \"kube-scheduler-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"d38092c0549ca8268027a516ffa5d736\") " pod="kube-system/kube-scheduler-ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:03.239299 kubelet[2812]: I0320 22:18:03.239045 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/aef60593acf6c616e939778c1a5c8520-ca-certs\") pod \"kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"aef60593acf6c616e939778c1a5c8520\") " pod="kube-system/kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:03.239299 kubelet[2812]: I0320 22:18:03.239068 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/aef60593acf6c616e939778c1a5c8520-k8s-certs\") pod \"kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"aef60593acf6c616e939778c1a5c8520\") " pod="kube-system/kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:03.239299 kubelet[2812]: I0320 22:18:03.239088 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" 
(UniqueName: \"kubernetes.io/host-path/8998d043404daac6519ecba4abea29d9-kubeconfig\") pod \"kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"8998d043404daac6519ecba4abea29d9\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:03.239299 kubelet[2812]: I0320 22:18:03.239117 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8998d043404daac6519ecba4abea29d9-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal\" (UID: \"8998d043404daac6519ecba4abea29d9\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:03.512468 kubelet[2812]: W0320 22:18:03.512230 2812 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 20 22:18:03.515940 kubelet[2812]: W0320 22:18:03.515868 2812 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 20 22:18:03.518303 kubelet[2812]: W0320 22:18:03.518236 2812 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 20 22:18:03.531535 kubelet[2812]: I0320 22:18:03.529829 2812 kubelet_node_status.go:112] "Node was previously registered" node="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:03.531535 kubelet[2812]: I0320 22:18:03.530290 2812 kubelet_node_status.go:76] "Successfully registered node" node="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:04.031094 kubelet[2812]: I0320 22:18:04.030114 2812 apiserver.go:52] "Watching apiserver" Mar 20 22:18:04.137874 kubelet[2812]: I0320 22:18:04.137811 2812 desired_state_of_world_populator.go:157] "Finished populating 
initial desired state of world"
Mar 20 22:18:04.145578 kubelet[2812]: I0320 22:18:04.144912 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-9999-0-2-b-c50fddf147.novalocal" podStartSLOduration=1.144886193 podStartE2EDuration="1.144886193s" podCreationTimestamp="2025-03-20 22:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 22:18:04.131106466 +0000 UTC m=+1.188288543" watchObservedRunningTime="2025-03-20 22:18:04.144886193 +0000 UTC m=+1.202068310"
Mar 20 22:18:04.161423 kubelet[2812]: I0320 22:18:04.161242 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-9999-0-2-b-c50fddf147.novalocal" podStartSLOduration=1.161214015 podStartE2EDuration="1.161214015s" podCreationTimestamp="2025-03-20 22:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 22:18:04.145374898 +0000 UTC m=+1.202557025" watchObservedRunningTime="2025-03-20 22:18:04.161214015 +0000 UTC m=+1.218396142"
Mar 20 22:18:04.182061 kubelet[2812]: I0320 22:18:04.181247 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-9999-0-2-b-c50fddf147.novalocal" podStartSLOduration=1.181216607 podStartE2EDuration="1.181216607s" podCreationTimestamp="2025-03-20 22:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 22:18:04.161866801 +0000 UTC m=+1.219048938" watchObservedRunningTime="2025-03-20 22:18:04.181216607 +0000 UTC m=+1.238398724"
Mar 20 22:18:09.531807 sudo[1754]: pam_unix(sudo:session): session closed for user root
Mar 20 22:18:09.720241 sshd[1753]: Connection closed by 172.24.4.1 port 46748
Mar 20 22:18:09.720084 sshd-session[1750]: pam_unix(sshd:session): session closed for user core
Mar 20 22:18:09.732132 systemd[1]: sshd@8-172.24.4.53:22-172.24.4.1:46748.service: Deactivated successfully.
Mar 20 22:18:09.739524 systemd[1]: session-11.scope: Deactivated successfully.
Mar 20 22:18:09.740069 systemd[1]: session-11.scope: Consumed 7.365s CPU time, 243.6M memory peak.
Mar 20 22:18:09.744369 systemd-logind[1460]: Session 11 logged out. Waiting for processes to exit.
Mar 20 22:18:09.746982 systemd-logind[1460]: Removed session 11.
Mar 20 22:18:16.680753 kubelet[2812]: I0320 22:18:16.680655 2812 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 20 22:18:16.681889 containerd[1475]: time="2025-03-20T22:18:16.681854915Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 20 22:18:16.682575 kubelet[2812]: I0320 22:18:16.682553 2812 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 20 22:18:17.514558 kubelet[2812]: I0320 22:18:17.512345 2812 topology_manager.go:215] "Topology Admit Handler" podUID="3df1b936-70cc-4919-98b4-e7c9f0229256" podNamespace="kube-system" podName="kube-proxy-dd6lr"
Mar 20 22:18:17.537260 kubelet[2812]: I0320 22:18:17.537054 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3df1b936-70cc-4919-98b4-e7c9f0229256-xtables-lock\") pod \"kube-proxy-dd6lr\" (UID: \"3df1b936-70cc-4919-98b4-e7c9f0229256\") " pod="kube-system/kube-proxy-dd6lr"
Mar 20 22:18:17.537260 kubelet[2812]: I0320 22:18:17.537135 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxpc6\" (UniqueName: \"kubernetes.io/projected/3df1b936-70cc-4919-98b4-e7c9f0229256-kube-api-access-rxpc6\") pod \"kube-proxy-dd6lr\" (UID: \"3df1b936-70cc-4919-98b4-e7c9f0229256\") " pod="kube-system/kube-proxy-dd6lr"
Mar 20 22:18:17.537260 kubelet[2812]: I0320 22:18:17.537191 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/3df1b936-70cc-4919-98b4-e7c9f0229256-kube-proxy\") pod \"kube-proxy-dd6lr\" (UID: \"3df1b936-70cc-4919-98b4-e7c9f0229256\") " pod="kube-system/kube-proxy-dd6lr"
Mar 20 22:18:17.537260 kubelet[2812]: I0320 22:18:17.537251 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3df1b936-70cc-4919-98b4-e7c9f0229256-lib-modules\") pod \"kube-proxy-dd6lr\" (UID: \"3df1b936-70cc-4919-98b4-e7c9f0229256\") " pod="kube-system/kube-proxy-dd6lr"
Mar 20 22:18:17.543376 systemd[1]: Created slice kubepods-besteffort-pod3df1b936_70cc_4919_98b4_e7c9f0229256.slice - libcontainer container kubepods-besteffort-pod3df1b936_70cc_4919_98b4_e7c9f0229256.slice.
Mar 20 22:18:17.853709 containerd[1475]: time="2025-03-20T22:18:17.853665540Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-dd6lr,Uid:3df1b936-70cc-4919-98b4-e7c9f0229256,Namespace:kube-system,Attempt:0,}"
Mar 20 22:18:17.886602 kubelet[2812]: I0320 22:18:17.884356 2812 topology_manager.go:215] "Topology Admit Handler" podUID="4304ff1c-de87-4be9-819b-d3a5325d3b79" podNamespace="tigera-operator" podName="tigera-operator-6479d6dc54-zw7pz"
Mar 20 22:18:17.896787 systemd[1]: Created slice kubepods-besteffort-pod4304ff1c_de87_4be9_819b_d3a5325d3b79.slice - libcontainer container kubepods-besteffort-pod4304ff1c_de87_4be9_819b_d3a5325d3b79.slice.
Mar 20 22:18:17.901745 containerd[1475]: time="2025-03-20T22:18:17.901713164Z" level=info msg="connecting to shim 1b24a6892d8b0f888710ccd7a056a968faaf557f242e0ba3d5e40fadc508644c" address="unix:///run/containerd/s/b71ddca84699903cd4b659f5fbbc00fcb57c007d5e7af9816f25436ae4df7932" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:18:17.939056 kubelet[2812]: I0320 22:18:17.939022 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdtcm\" (UniqueName: \"kubernetes.io/projected/4304ff1c-de87-4be9-819b-d3a5325d3b79-kube-api-access-xdtcm\") pod \"tigera-operator-6479d6dc54-zw7pz\" (UID: \"4304ff1c-de87-4be9-819b-d3a5325d3b79\") " pod="tigera-operator/tigera-operator-6479d6dc54-zw7pz"
Mar 20 22:18:17.939168 kubelet[2812]: I0320 22:18:17.939087 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4304ff1c-de87-4be9-819b-d3a5325d3b79-var-lib-calico\") pod \"tigera-operator-6479d6dc54-zw7pz\" (UID: \"4304ff1c-de87-4be9-819b-d3a5325d3b79\") " pod="tigera-operator/tigera-operator-6479d6dc54-zw7pz"
Mar 20 22:18:17.940670 systemd[1]: Started cri-containerd-1b24a6892d8b0f888710ccd7a056a968faaf557f242e0ba3d5e40fadc508644c.scope - libcontainer container 1b24a6892d8b0f888710ccd7a056a968faaf557f242e0ba3d5e40fadc508644c.
Mar 20 22:18:18.142061 containerd[1475]: time="2025-03-20T22:18:18.141349291Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-dd6lr,Uid:3df1b936-70cc-4919-98b4-e7c9f0229256,Namespace:kube-system,Attempt:0,} returns sandbox id \"1b24a6892d8b0f888710ccd7a056a968faaf557f242e0ba3d5e40fadc508644c\""
Mar 20 22:18:18.152705 containerd[1475]: time="2025-03-20T22:18:18.151842218Z" level=info msg="CreateContainer within sandbox \"1b24a6892d8b0f888710ccd7a056a968faaf557f242e0ba3d5e40fadc508644c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Mar 20 22:18:18.206689 containerd[1475]: time="2025-03-20T22:18:18.206607347Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-zw7pz,Uid:4304ff1c-de87-4be9-819b-d3a5325d3b79,Namespace:tigera-operator,Attempt:0,}"
Mar 20 22:18:18.235926 containerd[1475]: time="2025-03-20T22:18:18.234170046Z" level=info msg="Container 90de38026c3a9d206c607fc9029b6ed33a4c796edef346ea72b2318d30d137f6: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:18:18.274850 containerd[1475]: time="2025-03-20T22:18:18.273991897Z" level=info msg="CreateContainer within sandbox \"1b24a6892d8b0f888710ccd7a056a968faaf557f242e0ba3d5e40fadc508644c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"90de38026c3a9d206c607fc9029b6ed33a4c796edef346ea72b2318d30d137f6\""
Mar 20 22:18:18.275678 containerd[1475]: time="2025-03-20T22:18:18.275619461Z" level=info msg="StartContainer for \"90de38026c3a9d206c607fc9029b6ed33a4c796edef346ea72b2318d30d137f6\""
Mar 20 22:18:18.285331 containerd[1475]: time="2025-03-20T22:18:18.285252560Z" level=info msg="connecting to shim 90de38026c3a9d206c607fc9029b6ed33a4c796edef346ea72b2318d30d137f6" address="unix:///run/containerd/s/b71ddca84699903cd4b659f5fbbc00fcb57c007d5e7af9816f25436ae4df7932" protocol=ttrpc version=3
Mar 20 22:18:18.299448 containerd[1475]: time="2025-03-20T22:18:18.299192027Z" level=info msg="connecting to shim 63064524cd230d3770af2cf33c462188d012bbaaf98c81c6d4f3f4c9717811fd" address="unix:///run/containerd/s/26f9b7d376a629671d8e346fc0ca2c34689487fcbaf426eef790d1dc3070f5fa" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:18:18.326622 systemd[1]: Started cri-containerd-90de38026c3a9d206c607fc9029b6ed33a4c796edef346ea72b2318d30d137f6.scope - libcontainer container 90de38026c3a9d206c607fc9029b6ed33a4c796edef346ea72b2318d30d137f6.
Mar 20 22:18:18.330779 systemd[1]: Started cri-containerd-63064524cd230d3770af2cf33c462188d012bbaaf98c81c6d4f3f4c9717811fd.scope - libcontainer container 63064524cd230d3770af2cf33c462188d012bbaaf98c81c6d4f3f4c9717811fd.
Mar 20 22:18:18.388924 containerd[1475]: time="2025-03-20T22:18:18.388271057Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-zw7pz,Uid:4304ff1c-de87-4be9-819b-d3a5325d3b79,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"63064524cd230d3770af2cf33c462188d012bbaaf98c81c6d4f3f4c9717811fd\""
Mar 20 22:18:18.390990 containerd[1475]: time="2025-03-20T22:18:18.390877725Z" level=info msg="StartContainer for \"90de38026c3a9d206c607fc9029b6ed33a4c796edef346ea72b2318d30d137f6\" returns successfully"
Mar 20 22:18:18.392555 containerd[1475]: time="2025-03-20T22:18:18.392170930Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\""
Mar 20 22:18:23.533281 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount216594634.mount: Deactivated successfully.
Mar 20 22:18:24.739322 containerd[1475]: time="2025-03-20T22:18:24.739218284Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:18:24.740653 containerd[1475]: time="2025-03-20T22:18:24.740600703Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=21945008"
Mar 20 22:18:24.742049 containerd[1475]: time="2025-03-20T22:18:24.742005254Z" level=info msg="ImageCreate event name:\"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:18:24.744886 containerd[1475]: time="2025-03-20T22:18:24.744838943Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:18:24.746016 containerd[1475]: time="2025-03-20T22:18:24.745724709Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"21941003\" in 6.353527961s"
Mar 20 22:18:24.746016 containerd[1475]: time="2025-03-20T22:18:24.745777398Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\""
Mar 20 22:18:24.748891 containerd[1475]: time="2025-03-20T22:18:24.748262873Z" level=info msg="CreateContainer within sandbox \"63064524cd230d3770af2cf33c462188d012bbaaf98c81c6d4f3f4c9717811fd\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 20 22:18:24.764303 containerd[1475]: time="2025-03-20T22:18:24.763705112Z" level=info msg="Container baff6cf52e8679ac60e5e1f781caf907ef107231586380da08f3dd8b26420ea0: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:18:24.767194 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4187955464.mount: Deactivated successfully.
Mar 20 22:18:24.774461 containerd[1475]: time="2025-03-20T22:18:24.774414271Z" level=info msg="CreateContainer within sandbox \"63064524cd230d3770af2cf33c462188d012bbaaf98c81c6d4f3f4c9717811fd\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"baff6cf52e8679ac60e5e1f781caf907ef107231586380da08f3dd8b26420ea0\""
Mar 20 22:18:24.775052 containerd[1475]: time="2025-03-20T22:18:24.774991286Z" level=info msg="StartContainer for \"baff6cf52e8679ac60e5e1f781caf907ef107231586380da08f3dd8b26420ea0\""
Mar 20 22:18:24.776042 containerd[1475]: time="2025-03-20T22:18:24.775930974Z" level=info msg="connecting to shim baff6cf52e8679ac60e5e1f781caf907ef107231586380da08f3dd8b26420ea0" address="unix:///run/containerd/s/26f9b7d376a629671d8e346fc0ca2c34689487fcbaf426eef790d1dc3070f5fa" protocol=ttrpc version=3
Mar 20 22:18:24.803611 systemd[1]: Started cri-containerd-baff6cf52e8679ac60e5e1f781caf907ef107231586380da08f3dd8b26420ea0.scope - libcontainer container baff6cf52e8679ac60e5e1f781caf907ef107231586380da08f3dd8b26420ea0.
Mar 20 22:18:24.835870 containerd[1475]: time="2025-03-20T22:18:24.835830637Z" level=info msg="StartContainer for \"baff6cf52e8679ac60e5e1f781caf907ef107231586380da08f3dd8b26420ea0\" returns successfully"
Mar 20 22:18:25.187883 kubelet[2812]: I0320 22:18:25.187746 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-dd6lr" podStartSLOduration=8.187711686 podStartE2EDuration="8.187711686s" podCreationTimestamp="2025-03-20 22:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 22:18:19.168211191 +0000 UTC m=+16.225393318" watchObservedRunningTime="2025-03-20 22:18:25.187711686 +0000 UTC m=+22.244893823"
Mar 20 22:18:28.150231 kubelet[2812]: I0320 22:18:28.150168 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6479d6dc54-zw7pz" podStartSLOduration=4.794600909 podStartE2EDuration="11.149918767s" podCreationTimestamp="2025-03-20 22:18:17 +0000 UTC" firstStartedPulling="2025-03-20 22:18:18.39150167 +0000 UTC m=+15.448683757" lastFinishedPulling="2025-03-20 22:18:24.746819538 +0000 UTC m=+21.804001615" observedRunningTime="2025-03-20 22:18:25.188364603 +0000 UTC m=+22.245546730" watchObservedRunningTime="2025-03-20 22:18:28.149918767 +0000 UTC m=+25.207100854"
Mar 20 22:18:28.150689 kubelet[2812]: I0320 22:18:28.150536 2812 topology_manager.go:215] "Topology Admit Handler" podUID="88ff7e75-e8c8-4ce9-b22e-d97a76a56977" podNamespace="calico-system" podName="calico-typha-65dddd98d7-qqj6h"
Mar 20 22:18:28.160486 systemd[1]: Created slice kubepods-besteffort-pod88ff7e75_e8c8_4ce9_b22e_d97a76a56977.slice - libcontainer container kubepods-besteffort-pod88ff7e75_e8c8_4ce9_b22e_d97a76a56977.slice.
Mar 20 22:18:28.220428 kubelet[2812]: I0320 22:18:28.220384 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88ff7e75-e8c8-4ce9-b22e-d97a76a56977-tigera-ca-bundle\") pod \"calico-typha-65dddd98d7-qqj6h\" (UID: \"88ff7e75-e8c8-4ce9-b22e-d97a76a56977\") " pod="calico-system/calico-typha-65dddd98d7-qqj6h"
Mar 20 22:18:28.220746 kubelet[2812]: I0320 22:18:28.220599 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd742\" (UniqueName: \"kubernetes.io/projected/88ff7e75-e8c8-4ce9-b22e-d97a76a56977-kube-api-access-gd742\") pod \"calico-typha-65dddd98d7-qqj6h\" (UID: \"88ff7e75-e8c8-4ce9-b22e-d97a76a56977\") " pod="calico-system/calico-typha-65dddd98d7-qqj6h"
Mar 20 22:18:28.220746 kubelet[2812]: I0320 22:18:28.220634 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/88ff7e75-e8c8-4ce9-b22e-d97a76a56977-typha-certs\") pod \"calico-typha-65dddd98d7-qqj6h\" (UID: \"88ff7e75-e8c8-4ce9-b22e-d97a76a56977\") " pod="calico-system/calico-typha-65dddd98d7-qqj6h"
Mar 20 22:18:28.503746 kubelet[2812]: I0320 22:18:28.502309 2812 topology_manager.go:215] "Topology Admit Handler" podUID="6642f59a-b8ee-4d5e-99d5-00c40add0c6d" podNamespace="calico-system" podName="calico-node-zgtw4"
Mar 20 22:18:28.520933 systemd[1]: Created slice kubepods-besteffort-pod6642f59a_b8ee_4d5e_99d5_00c40add0c6d.slice - libcontainer container kubepods-besteffort-pod6642f59a_b8ee_4d5e_99d5_00c40add0c6d.slice.
Mar 20 22:18:28.522385 kubelet[2812]: I0320 22:18:28.522342 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpw22\" (UniqueName: \"kubernetes.io/projected/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-kube-api-access-xpw22\") pod \"calico-node-zgtw4\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " pod="calico-system/calico-node-zgtw4"
Mar 20 22:18:28.522690 kubelet[2812]: I0320 22:18:28.522673 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-node-certs\") pod \"calico-node-zgtw4\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " pod="calico-system/calico-node-zgtw4"
Mar 20 22:18:28.522838 kubelet[2812]: I0320 22:18:28.522785 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-cni-net-dir\") pod \"calico-node-zgtw4\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " pod="calico-system/calico-node-zgtw4"
Mar 20 22:18:28.523108 kubelet[2812]: I0320 22:18:28.523024 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-cni-log-dir\") pod \"calico-node-zgtw4\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " pod="calico-system/calico-node-zgtw4"
Mar 20 22:18:28.523153 kubelet[2812]: I0320 22:18:28.523085 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-flexvol-driver-host\") pod \"calico-node-zgtw4\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " pod="calico-system/calico-node-zgtw4"
Mar 20 22:18:28.523389 kubelet[2812]: I0320 22:18:28.523148 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-var-lib-calico\") pod \"calico-node-zgtw4\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " pod="calico-system/calico-node-zgtw4"
Mar 20 22:18:28.523389 kubelet[2812]: I0320 22:18:28.523189 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-xtables-lock\") pod \"calico-node-zgtw4\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " pod="calico-system/calico-node-zgtw4"
Mar 20 22:18:28.523389 kubelet[2812]: I0320 22:18:28.523229 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-lib-modules\") pod \"calico-node-zgtw4\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " pod="calico-system/calico-node-zgtw4"
Mar 20 22:18:28.523389 kubelet[2812]: I0320 22:18:28.523294 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-tigera-ca-bundle\") pod \"calico-node-zgtw4\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " pod="calico-system/calico-node-zgtw4"
Mar 20 22:18:28.523389 kubelet[2812]: I0320 22:18:28.523329 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-var-run-calico\") pod \"calico-node-zgtw4\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " pod="calico-system/calico-node-zgtw4"
Mar 20 22:18:28.523815 kubelet[2812]: I0320 22:18:28.523355 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-cni-bin-dir\") pod \"calico-node-zgtw4\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " pod="calico-system/calico-node-zgtw4"
Mar 20 22:18:28.523815 kubelet[2812]: I0320 22:18:28.523749 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-policysync\") pod \"calico-node-zgtw4\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " pod="calico-system/calico-node-zgtw4"
Mar 20 22:18:28.632599 kubelet[2812]: E0320 22:18:28.632527 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.632599 kubelet[2812]: W0320 22:18:28.632549 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.632599 kubelet[2812]: E0320 22:18:28.632567 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.725947 kubelet[2812]: E0320 22:18:28.725884 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.725947 kubelet[2812]: W0320 22:18:28.725902 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.725947 kubelet[2812]: E0320 22:18:28.725917 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.766651 containerd[1475]: time="2025-03-20T22:18:28.766290333Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-65dddd98d7-qqj6h,Uid:88ff7e75-e8c8-4ce9-b22e-d97a76a56977,Namespace:calico-system,Attempt:0,}"
Mar 20 22:18:28.829529 kubelet[2812]: E0320 22:18:28.828083 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.829529 kubelet[2812]: W0320 22:18:28.828122 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.829529 kubelet[2812]: E0320 22:18:28.828154 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.834710 kubelet[2812]: E0320 22:18:28.834653 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.834710 kubelet[2812]: W0320 22:18:28.834674 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.834710 kubelet[2812]: E0320 22:18:28.834692 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.850305 kubelet[2812]: I0320 22:18:28.850262 2812 topology_manager.go:215] "Topology Admit Handler" podUID="3068cdad-5d4c-43c1-adba-247b734e4e53" podNamespace="calico-system" podName="csi-node-driver-47bzn"
Mar 20 22:18:28.850596 kubelet[2812]: E0320 22:18:28.850559 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-47bzn" podUID="3068cdad-5d4c-43c1-adba-247b734e4e53"
Mar 20 22:18:28.868399 containerd[1475]: time="2025-03-20T22:18:28.866354223Z" level=info msg="connecting to shim 870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1" address="unix:///run/containerd/s/2b92e40e6212986fad4f8e278997d74915996599426f5ac0a103d7da6e76975a" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:18:28.904715 systemd[1]: Started cri-containerd-870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1.scope - libcontainer container 870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1.
Mar 20 22:18:28.916753 kubelet[2812]: E0320 22:18:28.916707 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.916753 kubelet[2812]: W0320 22:18:28.916743 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.917027 kubelet[2812]: E0320 22:18:28.916769 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.917058 kubelet[2812]: E0320 22:18:28.917028 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.917058 kubelet[2812]: W0320 22:18:28.917041 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.917117 kubelet[2812]: E0320 22:18:28.917054 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.917571 kubelet[2812]: E0320 22:18:28.917549 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.917571 kubelet[2812]: W0320 22:18:28.917566 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.918357 kubelet[2812]: E0320 22:18:28.917580 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.918713 kubelet[2812]: E0320 22:18:28.918676 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.918713 kubelet[2812]: W0320 22:18:28.918693 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.918713 kubelet[2812]: E0320 22:18:28.918708 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.919153 kubelet[2812]: E0320 22:18:28.919130 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.919223 kubelet[2812]: W0320 22:18:28.919179 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.919223 kubelet[2812]: E0320 22:18:28.919199 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.919532 kubelet[2812]: E0320 22:18:28.919514 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.919532 kubelet[2812]: W0320 22:18:28.919527 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.919630 kubelet[2812]: E0320 22:18:28.919538 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.919981 kubelet[2812]: E0320 22:18:28.919829 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.919981 kubelet[2812]: W0320 22:18:28.919844 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.919981 kubelet[2812]: E0320 22:18:28.919882 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.920945 kubelet[2812]: E0320 22:18:28.920273 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.920945 kubelet[2812]: W0320 22:18:28.920285 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.920945 kubelet[2812]: E0320 22:18:28.920297 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.921253 kubelet[2812]: E0320 22:18:28.921201 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.921253 kubelet[2812]: W0320 22:18:28.921215 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.921437 kubelet[2812]: E0320 22:18:28.921230 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.921627 kubelet[2812]: E0320 22:18:28.921606 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.921940 kubelet[2812]: W0320 22:18:28.921748 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.921940 kubelet[2812]: E0320 22:18:28.921765 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.922238 kubelet[2812]: E0320 22:18:28.922188 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.922511 kubelet[2812]: W0320 22:18:28.922419 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.922511 kubelet[2812]: E0320 22:18:28.922435 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.923032 kubelet[2812]: E0320 22:18:28.922958 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.923323 kubelet[2812]: W0320 22:18:28.922971 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.923323 kubelet[2812]: E0320 22:18:28.923108 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.923705 kubelet[2812]: E0320 22:18:28.923535 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.923705 kubelet[2812]: W0320 22:18:28.923547 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.923705 kubelet[2812]: E0320 22:18:28.923557 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.924226 kubelet[2812]: E0320 22:18:28.924037 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.924226 kubelet[2812]: W0320 22:18:28.924050 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.924226 kubelet[2812]: E0320 22:18:28.924060 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.924777 kubelet[2812]: E0320 22:18:28.924570 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.924777 kubelet[2812]: W0320 22:18:28.924583 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.924777 kubelet[2812]: E0320 22:18:28.924594 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.926039 kubelet[2812]: E0320 22:18:28.925902 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.926039 kubelet[2812]: W0320 22:18:28.925917 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.926039 kubelet[2812]: E0320 22:18:28.925931 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.926231 kubelet[2812]: E0320 22:18:28.926219 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.926303 kubelet[2812]: W0320 22:18:28.926290 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.926380 kubelet[2812]: E0320 22:18:28.926367 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.926793 kubelet[2812]: E0320 22:18:28.926634 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.926793 kubelet[2812]: W0320 22:18:28.926646 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.926793 kubelet[2812]: E0320 22:18:28.926656 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:28.927244 kubelet[2812]: E0320 22:18:28.927170 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:28.927437 kubelet[2812]: W0320 22:18:28.927345 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:28.927437 kubelet[2812]: E0320 22:18:28.927363 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 20 22:18:28.928176 kubelet[2812]: E0320 22:18:28.927721 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:28.928176 kubelet[2812]: W0320 22:18:28.927732 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:28.928176 kubelet[2812]: E0320 22:18:28.928109 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:28.928817 kubelet[2812]: E0320 22:18:28.928800 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:28.928817 kubelet[2812]: W0320 22:18:28.928814 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:28.928895 kubelet[2812]: E0320 22:18:28.928827 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:28.928895 kubelet[2812]: I0320 22:18:28.928858 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz4wc\" (UniqueName: \"kubernetes.io/projected/3068cdad-5d4c-43c1-adba-247b734e4e53-kube-api-access-dz4wc\") pod \"csi-node-driver-47bzn\" (UID: \"3068cdad-5d4c-43c1-adba-247b734e4e53\") " pod="calico-system/csi-node-driver-47bzn" Mar 20 22:18:28.929518 kubelet[2812]: E0320 22:18:28.929491 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:28.929518 kubelet[2812]: W0320 22:18:28.929509 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:28.929600 kubelet[2812]: E0320 22:18:28.929526 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:28.929600 kubelet[2812]: I0320 22:18:28.929543 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3068cdad-5d4c-43c1-adba-247b734e4e53-kubelet-dir\") pod \"csi-node-driver-47bzn\" (UID: \"3068cdad-5d4c-43c1-adba-247b734e4e53\") " pod="calico-system/csi-node-driver-47bzn" Mar 20 22:18:28.929706 kubelet[2812]: E0320 22:18:28.929690 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:28.929706 kubelet[2812]: W0320 22:18:28.929703 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:28.929781 kubelet[2812]: E0320 22:18:28.929714 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:28.929781 kubelet[2812]: I0320 22:18:28.929731 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/3068cdad-5d4c-43c1-adba-247b734e4e53-varrun\") pod \"csi-node-driver-47bzn\" (UID: \"3068cdad-5d4c-43c1-adba-247b734e4e53\") " pod="calico-system/csi-node-driver-47bzn" Mar 20 22:18:28.929909 kubelet[2812]: E0320 22:18:28.929891 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:28.929909 kubelet[2812]: W0320 22:18:28.929906 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:28.929986 kubelet[2812]: E0320 22:18:28.929917 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:28.929986 kubelet[2812]: I0320 22:18:28.929934 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3068cdad-5d4c-43c1-adba-247b734e4e53-registration-dir\") pod \"csi-node-driver-47bzn\" (UID: \"3068cdad-5d4c-43c1-adba-247b734e4e53\") " pod="calico-system/csi-node-driver-47bzn" Mar 20 22:18:28.930094 kubelet[2812]: E0320 22:18:28.930078 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:28.930094 kubelet[2812]: W0320 22:18:28.930092 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:28.930176 kubelet[2812]: E0320 22:18:28.930103 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:28.930176 kubelet[2812]: I0320 22:18:28.930119 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3068cdad-5d4c-43c1-adba-247b734e4e53-socket-dir\") pod \"csi-node-driver-47bzn\" (UID: \"3068cdad-5d4c-43c1-adba-247b734e4e53\") " pod="calico-system/csi-node-driver-47bzn" Mar 20 22:18:28.930449 kubelet[2812]: E0320 22:18:28.930287 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:28.930449 kubelet[2812]: W0320 22:18:28.930301 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:28.930449 kubelet[2812]: E0320 22:18:28.930310 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:28.930449 kubelet[2812]: E0320 22:18:28.930447 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:28.931041 kubelet[2812]: W0320 22:18:28.930456 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:28.931041 kubelet[2812]: E0320 22:18:28.930609 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:28.931041 kubelet[2812]: E0320 22:18:28.930740 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:28.931041 kubelet[2812]: W0320 22:18:28.930751 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:28.931041 kubelet[2812]: E0320 22:18:28.930848 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:28.931041 kubelet[2812]: E0320 22:18:28.931012 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:28.931041 kubelet[2812]: W0320 22:18:28.931024 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:28.931041 kubelet[2812]: E0320 22:18:28.931041 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:28.931582 kubelet[2812]: E0320 22:18:28.931235 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:28.931582 kubelet[2812]: W0320 22:18:28.931246 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:28.931582 kubelet[2812]: E0320 22:18:28.931284 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:28.931582 kubelet[2812]: E0320 22:18:28.931500 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:28.931582 kubelet[2812]: W0320 22:18:28.931512 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:28.931582 kubelet[2812]: E0320 22:18:28.931535 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:28.932202 kubelet[2812]: E0320 22:18:28.931727 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:28.932202 kubelet[2812]: W0320 22:18:28.931738 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:28.932202 kubelet[2812]: E0320 22:18:28.931749 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:28.932202 kubelet[2812]: E0320 22:18:28.931969 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:28.932202 kubelet[2812]: W0320 22:18:28.931978 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:28.932202 kubelet[2812]: E0320 22:18:28.931990 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:28.932202 kubelet[2812]: E0320 22:18:28.932140 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:28.932202 kubelet[2812]: W0320 22:18:28.932151 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:28.932202 kubelet[2812]: E0320 22:18:28.932160 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:28.932730 kubelet[2812]: E0320 22:18:28.932342 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:28.932730 kubelet[2812]: W0320 22:18:28.932351 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:28.932730 kubelet[2812]: E0320 22:18:28.932360 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:29.018306 containerd[1475]: time="2025-03-20T22:18:29.017704737Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-65dddd98d7-qqj6h,Uid:88ff7e75-e8c8-4ce9-b22e-d97a76a56977,Namespace:calico-system,Attempt:0,} returns sandbox id \"870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1\"" Mar 20 22:18:29.022329 containerd[1475]: time="2025-03-20T22:18:29.022053659Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 20 22:18:29.031071 kubelet[2812]: E0320 22:18:29.031039 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.031071 kubelet[2812]: W0320 22:18:29.031058 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.031304 kubelet[2812]: E0320 22:18:29.031078 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:29.031304 kubelet[2812]: E0320 22:18:29.031296 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.031386 kubelet[2812]: W0320 22:18:29.031306 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.031386 kubelet[2812]: E0320 22:18:29.031321 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:29.031610 kubelet[2812]: E0320 22:18:29.031594 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.031610 kubelet[2812]: W0320 22:18:29.031609 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.031719 kubelet[2812]: E0320 22:18:29.031619 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:29.032448 kubelet[2812]: E0320 22:18:29.032426 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.032448 kubelet[2812]: W0320 22:18:29.032440 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.032697 kubelet[2812]: E0320 22:18:29.032462 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:29.032697 kubelet[2812]: E0320 22:18:29.032689 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.032781 kubelet[2812]: W0320 22:18:29.032698 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.032781 kubelet[2812]: E0320 22:18:29.032715 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:29.032985 kubelet[2812]: E0320 22:18:29.032963 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.032985 kubelet[2812]: W0320 22:18:29.032977 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.033189 kubelet[2812]: E0320 22:18:29.033164 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:29.033646 kubelet[2812]: E0320 22:18:29.033405 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.033646 kubelet[2812]: W0320 22:18:29.033420 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.033646 kubelet[2812]: E0320 22:18:29.033458 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:29.034126 kubelet[2812]: E0320 22:18:29.034108 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.034126 kubelet[2812]: W0320 22:18:29.034122 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.034258 kubelet[2812]: E0320 22:18:29.034229 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:29.034506 kubelet[2812]: E0320 22:18:29.034409 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.034506 kubelet[2812]: W0320 22:18:29.034421 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.034703 kubelet[2812]: E0320 22:18:29.034603 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:29.034750 kubelet[2812]: E0320 22:18:29.034703 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.034750 kubelet[2812]: W0320 22:18:29.034713 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.034807 kubelet[2812]: E0320 22:18:29.034762 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:29.035023 kubelet[2812]: E0320 22:18:29.034963 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.035023 kubelet[2812]: W0320 22:18:29.034974 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.035146 kubelet[2812]: E0320 22:18:29.035082 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:29.035253 kubelet[2812]: E0320 22:18:29.035206 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.035253 kubelet[2812]: W0320 22:18:29.035248 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.035572 kubelet[2812]: E0320 22:18:29.035459 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.035572 kubelet[2812]: W0320 22:18:29.035522 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.035572 kubelet[2812]: E0320 22:18:29.035534 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:29.035572 kubelet[2812]: E0320 22:18:29.035555 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:29.035729 kubelet[2812]: E0320 22:18:29.035707 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.035729 kubelet[2812]: W0320 22:18:29.035717 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.035787 kubelet[2812]: E0320 22:18:29.035751 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:29.035976 kubelet[2812]: E0320 22:18:29.035958 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.036022 kubelet[2812]: W0320 22:18:29.035988 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.036151 kubelet[2812]: E0320 22:18:29.036099 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:29.036151 kubelet[2812]: E0320 22:18:29.036250 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.036151 kubelet[2812]: W0320 22:18:29.036260 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.036151 kubelet[2812]: E0320 22:18:29.036349 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:29.036683 kubelet[2812]: E0320 22:18:29.036557 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.036683 kubelet[2812]: W0320 22:18:29.036568 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.037030 kubelet[2812]: E0320 22:18:29.036825 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:29.037234 kubelet[2812]: E0320 22:18:29.037218 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.037234 kubelet[2812]: W0320 22:18:29.037232 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.037391 kubelet[2812]: E0320 22:18:29.037314 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:29.037431 kubelet[2812]: E0320 22:18:29.037396 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.037431 kubelet[2812]: W0320 22:18:29.037404 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.037575 kubelet[2812]: E0320 22:18:29.037511 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:29.037738 kubelet[2812]: E0320 22:18:29.037722 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.037738 kubelet[2812]: W0320 22:18:29.037735 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.037861 kubelet[2812]: E0320 22:18:29.037848 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:29.038009 kubelet[2812]: E0320 22:18:29.037990 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.038055 kubelet[2812]: W0320 22:18:29.038028 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.038249 kubelet[2812]: E0320 22:18:29.038094 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:29.038371 kubelet[2812]: E0320 22:18:29.038354 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.038371 kubelet[2812]: W0320 22:18:29.038368 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.038440 kubelet[2812]: E0320 22:18:29.038382 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:29.038651 kubelet[2812]: E0320 22:18:29.038628 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.038651 kubelet[2812]: W0320 22:18:29.038642 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.038651 kubelet[2812]: E0320 22:18:29.038656 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:29.039059 kubelet[2812]: E0320 22:18:29.038823 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.039059 kubelet[2812]: W0320 22:18:29.038836 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.039059 kubelet[2812]: E0320 22:18:29.038850 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:29.039493 kubelet[2812]: E0320 22:18:29.039210 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.039635 kubelet[2812]: W0320 22:18:29.039575 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.039635 kubelet[2812]: E0320 22:18:29.039609 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:29.052737 kubelet[2812]: E0320 22:18:29.052662 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:29.052737 kubelet[2812]: W0320 22:18:29.052683 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:29.052737 kubelet[2812]: E0320 22:18:29.052701 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:29.126992 containerd[1475]: time="2025-03-20T22:18:29.126933514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-zgtw4,Uid:6642f59a-b8ee-4d5e-99d5-00c40add0c6d,Namespace:calico-system,Attempt:0,}" Mar 20 22:18:29.160627 containerd[1475]: time="2025-03-20T22:18:29.160583488Z" level=info msg="connecting to shim 86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080" address="unix:///run/containerd/s/3e2209f4b12f79abe6965f00946e6dd89d1eb5e9f2b304235f25613b7f681bde" namespace=k8s.io protocol=ttrpc version=3 Mar 20 22:18:29.186649 systemd[1]: Started cri-containerd-86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080.scope - libcontainer container 86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080. 
Mar 20 22:18:29.220961 containerd[1475]: time="2025-03-20T22:18:29.220828848Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-zgtw4,Uid:6642f59a-b8ee-4d5e-99d5-00c40add0c6d,Namespace:calico-system,Attempt:0,} returns sandbox id \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\"" Mar 20 22:18:31.062181 kubelet[2812]: E0320 22:18:31.062126 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-47bzn" podUID="3068cdad-5d4c-43c1-adba-247b734e4e53" Mar 20 22:18:31.848500 containerd[1475]: time="2025-03-20T22:18:31.848446328Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:18:31.850097 containerd[1475]: time="2025-03-20T22:18:31.850027969Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=30414075" Mar 20 22:18:31.851786 containerd[1475]: time="2025-03-20T22:18:31.851732151Z" level=info msg="ImageCreate event name:\"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:18:31.854009 containerd[1475]: time="2025-03-20T22:18:31.853987288Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:18:31.854817 containerd[1475]: time="2025-03-20T22:18:31.854662536Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest 
\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"31907171\" in 2.832563712s" Mar 20 22:18:31.854817 containerd[1475]: time="2025-03-20T22:18:31.854701209Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\"" Mar 20 22:18:31.857504 containerd[1475]: time="2025-03-20T22:18:31.855846160Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 20 22:18:31.868504 containerd[1475]: time="2025-03-20T22:18:31.868380228Z" level=info msg="CreateContainer within sandbox \"870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 20 22:18:31.884546 containerd[1475]: time="2025-03-20T22:18:31.880421520Z" level=info msg="Container 573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b: CDI devices from CRI Config.CDIDevices: []" Mar 20 22:18:31.901317 containerd[1475]: time="2025-03-20T22:18:31.901273124Z" level=info msg="CreateContainer within sandbox \"870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\"" Mar 20 22:18:31.902357 containerd[1475]: time="2025-03-20T22:18:31.901847143Z" level=info msg="StartContainer for \"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\"" Mar 20 22:18:31.903007 containerd[1475]: time="2025-03-20T22:18:31.902965894Z" level=info msg="connecting to shim 573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b" address="unix:///run/containerd/s/2b92e40e6212986fad4f8e278997d74915996599426f5ac0a103d7da6e76975a" protocol=ttrpc version=3 Mar 20 22:18:31.933701 systemd[1]: Started cri-containerd-573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b.scope - libcontainer 
container 573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b. Mar 20 22:18:31.995610 containerd[1475]: time="2025-03-20T22:18:31.995449685Z" level=info msg="StartContainer for \"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\" returns successfully" Mar 20 22:18:32.256179 kubelet[2812]: E0320 22:18:32.255945 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.256179 kubelet[2812]: W0320 22:18:32.255969 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.256179 kubelet[2812]: E0320 22:18:32.255987 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.257012 kubelet[2812]: E0320 22:18:32.256220 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.257012 kubelet[2812]: W0320 22:18:32.256231 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.257012 kubelet[2812]: E0320 22:18:32.256242 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:32.257247 kubelet[2812]: E0320 22:18:32.257232 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.257247 kubelet[2812]: W0320 22:18:32.257245 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.257456 kubelet[2812]: E0320 22:18:32.257257 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.257538 kubelet[2812]: E0320 22:18:32.257450 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.257538 kubelet[2812]: W0320 22:18:32.257466 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.257538 kubelet[2812]: E0320 22:18:32.257494 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:32.257724 kubelet[2812]: E0320 22:18:32.257709 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.257724 kubelet[2812]: W0320 22:18:32.257722 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.257987 kubelet[2812]: E0320 22:18:32.257733 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.257987 kubelet[2812]: E0320 22:18:32.257966 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.257987 kubelet[2812]: W0320 22:18:32.257975 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.257987 kubelet[2812]: E0320 22:18:32.257984 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:32.258154 kubelet[2812]: E0320 22:18:32.258140 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.258154 kubelet[2812]: W0320 22:18:32.258152 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.258263 kubelet[2812]: E0320 22:18:32.258163 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.258354 kubelet[2812]: E0320 22:18:32.258339 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.258354 kubelet[2812]: W0320 22:18:32.258353 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.258554 kubelet[2812]: E0320 22:18:32.258363 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:32.258696 kubelet[2812]: E0320 22:18:32.258670 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.258696 kubelet[2812]: W0320 22:18:32.258683 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.258696 kubelet[2812]: E0320 22:18:32.258692 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.258912 kubelet[2812]: E0320 22:18:32.258874 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.258912 kubelet[2812]: W0320 22:18:32.258883 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.258912 kubelet[2812]: E0320 22:18:32.258891 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:32.259069 kubelet[2812]: E0320 22:18:32.259049 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.259069 kubelet[2812]: W0320 22:18:32.259062 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.259131 kubelet[2812]: E0320 22:18:32.259071 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.259271 kubelet[2812]: E0320 22:18:32.259252 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.259271 kubelet[2812]: W0320 22:18:32.259265 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.259381 kubelet[2812]: E0320 22:18:32.259275 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:32.259491 kubelet[2812]: E0320 22:18:32.259459 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.259537 kubelet[2812]: W0320 22:18:32.259494 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.259537 kubelet[2812]: E0320 22:18:32.259506 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.259724 kubelet[2812]: E0320 22:18:32.259708 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.259724 kubelet[2812]: W0320 22:18:32.259717 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.259781 kubelet[2812]: E0320 22:18:32.259726 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:32.259895 kubelet[2812]: E0320 22:18:32.259880 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.259895 kubelet[2812]: W0320 22:18:32.259892 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.259957 kubelet[2812]: E0320 22:18:32.259903 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.357091 kubelet[2812]: E0320 22:18:32.357040 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.357091 kubelet[2812]: W0320 22:18:32.357086 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.357398 kubelet[2812]: E0320 22:18:32.357124 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:32.357707 kubelet[2812]: E0320 22:18:32.357680 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.357786 kubelet[2812]: W0320 22:18:32.357712 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.357786 kubelet[2812]: E0320 22:18:32.357749 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.358222 kubelet[2812]: E0320 22:18:32.358195 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.358277 kubelet[2812]: W0320 22:18:32.358223 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.358277 kubelet[2812]: E0320 22:18:32.358257 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:32.358690 kubelet[2812]: E0320 22:18:32.358656 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.358835 kubelet[2812]: W0320 22:18:32.358761 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.358835 kubelet[2812]: E0320 22:18:32.358794 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.359199 kubelet[2812]: E0320 22:18:32.359102 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.359199 kubelet[2812]: W0320 22:18:32.359114 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.359199 kubelet[2812]: E0320 22:18:32.359129 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:32.359566 kubelet[2812]: E0320 22:18:32.359416 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.359566 kubelet[2812]: W0320 22:18:32.359428 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.359566 kubelet[2812]: E0320 22:18:32.359440 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.359751 kubelet[2812]: E0320 22:18:32.359739 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.359875 kubelet[2812]: W0320 22:18:32.359825 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.360018 kubelet[2812]: E0320 22:18:32.359898 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:32.360255 kubelet[2812]: E0320 22:18:32.360187 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.360255 kubelet[2812]: W0320 22:18:32.360199 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.360579 kubelet[2812]: E0320 22:18:32.360461 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.360579 kubelet[2812]: W0320 22:18:32.360501 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.360579 kubelet[2812]: E0320 22:18:32.360514 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.360878 kubelet[2812]: E0320 22:18:32.360741 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:32.360878 kubelet[2812]: E0320 22:18:32.360831 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.360878 kubelet[2812]: W0320 22:18:32.360842 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.361095 kubelet[2812]: E0320 22:18:32.361071 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.361275 kubelet[2812]: E0320 22:18:32.361263 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.361410 kubelet[2812]: W0320 22:18:32.361336 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.361410 kubelet[2812]: E0320 22:18:32.361358 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:32.361902 kubelet[2812]: E0320 22:18:32.361772 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.361902 kubelet[2812]: W0320 22:18:32.361795 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.361902 kubelet[2812]: E0320 22:18:32.361811 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.362220 kubelet[2812]: E0320 22:18:32.362085 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.362220 kubelet[2812]: W0320 22:18:32.362097 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.362220 kubelet[2812]: E0320 22:18:32.362111 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:32.362426 kubelet[2812]: E0320 22:18:32.362390 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.362593 kubelet[2812]: W0320 22:18:32.362510 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.362643 kubelet[2812]: E0320 22:18:32.362584 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.362901 kubelet[2812]: E0320 22:18:32.362797 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.362901 kubelet[2812]: W0320 22:18:32.362808 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.362901 kubelet[2812]: E0320 22:18:32.362818 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:32.363246 kubelet[2812]: E0320 22:18:32.363149 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.363246 kubelet[2812]: W0320 22:18:32.363161 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.363246 kubelet[2812]: E0320 22:18:32.363186 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.363776 kubelet[2812]: E0320 22:18:32.363696 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.363776 kubelet[2812]: W0320 22:18:32.363710 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.363776 kubelet[2812]: E0320 22:18:32.363729 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:18:32.364293 kubelet[2812]: E0320 22:18:32.364281 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:32.364379 kubelet[2812]: W0320 22:18:32.364345 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:18:32.364379 kubelet[2812]: E0320 22:18:32.364360 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:18:32.914789 kubelet[2812]: I0320 22:18:32.913089 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-65dddd98d7-qqj6h" podStartSLOduration=2.077176175 podStartE2EDuration="4.913053985s" podCreationTimestamp="2025-03-20 22:18:28 +0000 UTC" firstStartedPulling="2025-03-20 22:18:29.019756582 +0000 UTC m=+26.076938659" lastFinishedPulling="2025-03-20 22:18:31.855634382 +0000 UTC m=+28.912816469" observedRunningTime="2025-03-20 22:18:32.217073007 +0000 UTC m=+29.274255094" watchObservedRunningTime="2025-03-20 22:18:32.913053985 +0000 UTC m=+29.970236112" Mar 20 22:18:33.063026 kubelet[2812]: E0320 22:18:33.062895 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-47bzn" podUID="3068cdad-5d4c-43c1-adba-247b734e4e53" Mar 20 22:18:33.267335 kubelet[2812]: E0320 22:18:33.267185 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:18:33.268896 kubelet[2812]: W0320 22:18:33.268185 2812 driver-call.go:149] FlexVolume: 
driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.268896 kubelet[2812]: E0320 22:18:33.268263 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.269042 kubelet[2812]: E0320 22:18:33.268962 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.269042 kubelet[2812]: W0320 22:18:33.268997 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.269239 kubelet[2812]: E0320 22:18:33.269090 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.269944 kubelet[2812]: E0320 22:18:33.269864 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.270939 kubelet[2812]: W0320 22:18:33.270608 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.270939 kubelet[2812]: E0320 22:18:33.270648 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.271614 kubelet[2812]: E0320 22:18:33.271331 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.271614 kubelet[2812]: W0320 22:18:33.271358 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.271614 kubelet[2812]: E0320 22:18:33.271382 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.272315 kubelet[2812]: E0320 22:18:33.272060 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.272315 kubelet[2812]: W0320 22:18:33.272090 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.272315 kubelet[2812]: E0320 22:18:33.272113 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.273018 kubelet[2812]: E0320 22:18:33.272646 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.273018 kubelet[2812]: W0320 22:18:33.272668 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.273018 kubelet[2812]: E0320 22:18:33.272691 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.274065 kubelet[2812]: E0320 22:18:33.273626 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.274065 kubelet[2812]: W0320 22:18:33.273653 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.274065 kubelet[2812]: E0320 22:18:33.273681 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.275698 kubelet[2812]: E0320 22:18:33.275413 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.275698 kubelet[2812]: W0320 22:18:33.275443 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.275698 kubelet[2812]: E0320 22:18:33.275467 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.276334 kubelet[2812]: E0320 22:18:33.276135 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.276334 kubelet[2812]: W0320 22:18:33.276162 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.276334 kubelet[2812]: E0320 22:18:33.276186 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.277057 kubelet[2812]: E0320 22:18:33.276855 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.277057 kubelet[2812]: W0320 22:18:33.276883 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.277057 kubelet[2812]: E0320 22:18:33.276906 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.277576 kubelet[2812]: E0320 22:18:33.277548 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.277905 kubelet[2812]: W0320 22:18:33.277873 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.278064 kubelet[2812]: E0320 22:18:33.278039 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.278701 kubelet[2812]: E0320 22:18:33.278541 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.278701 kubelet[2812]: W0320 22:18:33.278568 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.278701 kubelet[2812]: E0320 22:18:33.278587 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.281081 kubelet[2812]: E0320 22:18:33.278943 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.281081 kubelet[2812]: W0320 22:18:33.278961 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.281081 kubelet[2812]: E0320 22:18:33.278978 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.281081 kubelet[2812]: E0320 22:18:33.279217 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.281081 kubelet[2812]: W0320 22:18:33.279234 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.281081 kubelet[2812]: E0320 22:18:33.279250 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.281081 kubelet[2812]: E0320 22:18:33.279579 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.281081 kubelet[2812]: W0320 22:18:33.279595 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.281081 kubelet[2812]: E0320 22:18:33.279611 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.367122 kubelet[2812]: E0320 22:18:33.367026 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.367122 kubelet[2812]: W0320 22:18:33.367086 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.367887 kubelet[2812]: E0320 22:18:33.367133 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.368843 kubelet[2812]: E0320 22:18:33.368334 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.368843 kubelet[2812]: W0320 22:18:33.368378 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.368843 kubelet[2812]: E0320 22:18:33.368411 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.369259 kubelet[2812]: E0320 22:18:33.369056 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.369259 kubelet[2812]: W0320 22:18:33.369086 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.369259 kubelet[2812]: E0320 22:18:33.369117 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.370672 kubelet[2812]: E0320 22:18:33.369728 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.370672 kubelet[2812]: W0320 22:18:33.369773 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.370672 kubelet[2812]: E0320 22:18:33.369811 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.370672 kubelet[2812]: E0320 22:18:33.370367 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.370672 kubelet[2812]: W0320 22:18:33.370396 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.370672 kubelet[2812]: E0320 22:18:33.370427 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.371455 kubelet[2812]: E0320 22:18:33.371062 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.371455 kubelet[2812]: W0320 22:18:33.371091 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.371455 kubelet[2812]: E0320 22:18:33.371121 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.373656 kubelet[2812]: E0320 22:18:33.371895 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.373656 kubelet[2812]: W0320 22:18:33.371925 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.373656 kubelet[2812]: E0320 22:18:33.371956 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.373656 kubelet[2812]: E0320 22:18:33.373119 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.373656 kubelet[2812]: W0320 22:18:33.373145 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.373656 kubelet[2812]: E0320 22:18:33.373174 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.374442 kubelet[2812]: E0320 22:18:33.373667 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.374442 kubelet[2812]: W0320 22:18:33.373695 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.374442 kubelet[2812]: E0320 22:18:33.373727 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.374442 kubelet[2812]: E0320 22:18:33.374181 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.374442 kubelet[2812]: W0320 22:18:33.374207 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.374442 kubelet[2812]: E0320 22:18:33.374237 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.378174 kubelet[2812]: E0320 22:18:33.374719 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.378174 kubelet[2812]: W0320 22:18:33.374748 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.378174 kubelet[2812]: E0320 22:18:33.374778 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.378174 kubelet[2812]: E0320 22:18:33.375312 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.378174 kubelet[2812]: W0320 22:18:33.375340 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.378174 kubelet[2812]: E0320 22:18:33.375373 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.378174 kubelet[2812]: E0320 22:18:33.377194 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.378174 kubelet[2812]: W0320 22:18:33.377228 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.378174 kubelet[2812]: E0320 22:18:33.377281 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.382602 kubelet[2812]: E0320 22:18:33.378643 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.382602 kubelet[2812]: W0320 22:18:33.378673 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.382602 kubelet[2812]: E0320 22:18:33.378705 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.382602 kubelet[2812]: E0320 22:18:33.380083 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.382602 kubelet[2812]: W0320 22:18:33.380145 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.382602 kubelet[2812]: E0320 22:18:33.380176 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.382602 kubelet[2812]: E0320 22:18:33.381070 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.382602 kubelet[2812]: W0320 22:18:33.381099 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.382602 kubelet[2812]: E0320 22:18:33.381132 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.382602 kubelet[2812]: E0320 22:18:33.382310 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.384283 kubelet[2812]: W0320 22:18:33.382339 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.384283 kubelet[2812]: E0320 22:18:33.382372 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.384283 kubelet[2812]: E0320 22:18:33.383205 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:18:33.384283 kubelet[2812]: W0320 22:18:33.383235 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:18:33.384283 kubelet[2812]: E0320 22:18:33.383267 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:18:33.735023 containerd[1475]: time="2025-03-20T22:18:33.734984993Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:18:33.736213 containerd[1475]: time="2025-03-20T22:18:33.736150943Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5364011"
Mar 20 22:18:33.737318 containerd[1475]: time="2025-03-20T22:18:33.737283430Z" level=info msg="ImageCreate event name:\"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:18:33.741234 containerd[1475]: time="2025-03-20T22:18:33.740241947Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:18:33.741234 containerd[1475]: time="2025-03-20T22:18:33.740959345Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6857075\" in 1.884327449s"
Mar 20 22:18:33.741234 containerd[1475]: time="2025-03-20T22:18:33.740987598Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\""
Mar 20 22:18:33.744204 containerd[1475]: time="2025-03-20T22:18:33.744173452Z" level=info msg="CreateContainer within sandbox \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Mar 20 22:18:33.756731 containerd[1475]: time="2025-03-20T22:18:33.755680065Z" level=info msg="Container a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:18:33.759983 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2921328074.mount: Deactivated successfully.
Mar 20 22:18:33.773199 containerd[1475]: time="2025-03-20T22:18:33.773136734Z" level=info msg="CreateContainer within sandbox \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1\""
Mar 20 22:18:33.775530 containerd[1475]: time="2025-03-20T22:18:33.774097869Z" level=info msg="StartContainer for \"a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1\""
Mar 20 22:18:33.776083 containerd[1475]: time="2025-03-20T22:18:33.776057320Z" level=info msg="connecting to shim a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1" address="unix:///run/containerd/s/3e2209f4b12f79abe6965f00946e6dd89d1eb5e9f2b304235f25613b7f681bde" protocol=ttrpc version=3
Mar 20 22:18:33.807748 systemd[1]: Started cri-containerd-a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1.scope - libcontainer container a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1.
Mar 20 22:18:33.868528 containerd[1475]: time="2025-03-20T22:18:33.868495097Z" level=info msg="StartContainer for \"a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1\" returns successfully"
Mar 20 22:18:33.871561 systemd[1]: cri-containerd-a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1.scope: Deactivated successfully.
Mar 20 22:18:33.874678 containerd[1475]: time="2025-03-20T22:18:33.874315661Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1\" id:\"a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1\" pid:3498 exited_at:{seconds:1742509113 nanos:873746952}"
Mar 20 22:18:33.874678 containerd[1475]: time="2025-03-20T22:18:33.874376545Z" level=info msg="received exit event container_id:\"a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1\" id:\"a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1\" pid:3498 exited_at:{seconds:1742509113 nanos:873746952}"
Mar 20 22:18:33.903798 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1-rootfs.mount: Deactivated successfully.
Mar 20 22:18:35.062983 kubelet[2812]: E0320 22:18:35.062327 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-47bzn" podUID="3068cdad-5d4c-43c1-adba-247b734e4e53"
Mar 20 22:18:35.233644 containerd[1475]: time="2025-03-20T22:18:35.232811039Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\""
Mar 20 22:18:37.064306 kubelet[2812]: E0320 22:18:37.063260 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-47bzn" podUID="3068cdad-5d4c-43c1-adba-247b734e4e53"
Mar 20 22:18:39.062282 kubelet[2812]: E0320 22:18:39.062209 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-47bzn" podUID="3068cdad-5d4c-43c1-adba-247b734e4e53"
Mar 20 22:18:41.031337 containerd[1475]: time="2025-03-20T22:18:41.031073004Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:18:41.033062 containerd[1475]: time="2025-03-20T22:18:41.032795628Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=97781477"
Mar 20 22:18:41.033062 containerd[1475]: time="2025-03-20T22:18:41.032951981Z" level=info msg="ImageCreate event name:\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:18:41.036446 containerd[1475]: time="2025-03-20T22:18:41.036379876Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:18:41.038644 containerd[1475]: time="2025-03-20T22:18:41.038323634Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"99274581\" in 5.805427696s"
Mar 20 22:18:41.038644 containerd[1475]: time="2025-03-20T22:18:41.038402453Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\""
Mar 20 22:18:41.042575 containerd[1475]: time="2025-03-20T22:18:41.042461021Z" level=info msg="CreateContainer within sandbox \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Mar 20 22:18:41.059086 containerd[1475]: time="2025-03-20T22:18:41.059025275Z" level=info msg="Container cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:18:41.062642 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1261872761.mount: Deactivated successfully.
Mar 20 22:18:41.068107 kubelet[2812]: E0320 22:18:41.067928 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-47bzn" podUID="3068cdad-5d4c-43c1-adba-247b734e4e53"
Mar 20 22:18:41.091562 containerd[1475]: time="2025-03-20T22:18:41.091455974Z" level=info msg="CreateContainer within sandbox \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927\""
Mar 20 22:18:41.091992 containerd[1475]: time="2025-03-20T22:18:41.091922360Z" level=info msg="StartContainer for \"cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927\""
Mar 20 22:18:41.095149 containerd[1475]: time="2025-03-20T22:18:41.095106727Z" level=info msg="connecting to shim cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927" address="unix:///run/containerd/s/3e2209f4b12f79abe6965f00946e6dd89d1eb5e9f2b304235f25613b7f681bde" protocol=ttrpc version=3
Mar 20 22:18:41.132622 systemd[1]: Started cri-containerd-cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927.scope - libcontainer container cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927.
Mar 20 22:18:41.184505 containerd[1475]: time="2025-03-20T22:18:41.184387764Z" level=info msg="StartContainer for \"cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927\" returns successfully"
Mar 20 22:18:42.484704 containerd[1475]: time="2025-03-20T22:18:42.484600200Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 20 22:18:42.489981 systemd[1]: cri-containerd-cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927.scope: Deactivated successfully.
Mar 20 22:18:42.490922 systemd[1]: cri-containerd-cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927.scope: Consumed 696ms CPU time, 174.8M memory peak, 154M written to disk.
Mar 20 22:18:42.498627 containerd[1475]: time="2025-03-20T22:18:42.498325855Z" level=info msg="received exit event container_id:\"cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927\" id:\"cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927\" pid:3557 exited_at:{seconds:1742509122 nanos:497355282}"
Mar 20 22:18:42.498903 containerd[1475]: time="2025-03-20T22:18:42.498798232Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927\" id:\"cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927\" pid:3557 exited_at:{seconds:1742509122 nanos:497355282}"
Mar 20 22:18:42.546606 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927-rootfs.mount: Deactivated successfully.
Mar 20 22:18:42.920895 kubelet[2812]: I0320 22:18:42.554271 2812 kubelet_node_status.go:497] "Fast updating node status as it just became ready"
Mar 20 22:18:42.959869 kubelet[2812]: I0320 22:18:42.959770 2812 topology_manager.go:215] "Topology Admit Handler" podUID="8258f556-4ff6-4fd9-a747-02759833a112" podNamespace="kube-system" podName="coredns-7db6d8ff4d-vp28l"
Mar 20 22:18:42.964413 kubelet[2812]: I0320 22:18:42.964183 2812 topology_manager.go:215] "Topology Admit Handler" podUID="fb0fe2e3-f8bd-4402-b2fe-cec1a2ae3ae7" podNamespace="kube-system" podName="coredns-7db6d8ff4d-sg9pv"
Mar 20 22:18:42.986374 kubelet[2812]: I0320 22:18:42.986286 2812 topology_manager.go:215] "Topology Admit Handler" podUID="fee3832f-4ccf-429d-9c96-987d3f4ad55c" podNamespace="calico-system" podName="calico-kube-controllers-59b6b788d9-v7kpt"
Mar 20 22:18:42.987867 systemd[1]: Created slice kubepods-burstable-pod8258f556_4ff6_4fd9_a747_02759833a112.slice - libcontainer container kubepods-burstable-pod8258f556_4ff6_4fd9_a747_02759833a112.slice.
Mar 20 22:18:42.995732 kubelet[2812]: I0320 22:18:42.993677 2812 topology_manager.go:215] "Topology Admit Handler" podUID="27f9c3fe-80a0-4438-93fd-b56cba5ba080" podNamespace="calico-apiserver" podName="calico-apiserver-cd595b98d-7cqjt"
Mar 20 22:18:42.999531 kubelet[2812]: I0320 22:18:42.998838 2812 topology_manager.go:215] "Topology Admit Handler" podUID="1ceb24d7-7f94-4c55-9df4-2c4e23376391" podNamespace="calico-apiserver" podName="calico-apiserver-cd595b98d-spv2p"
Mar 20 22:18:43.013097 systemd[1]: Created slice kubepods-burstable-podfb0fe2e3_f8bd_4402_b2fe_cec1a2ae3ae7.slice - libcontainer container kubepods-burstable-podfb0fe2e3_f8bd_4402_b2fe_cec1a2ae3ae7.slice.
Mar 20 22:18:43.021168 systemd[1]: Created slice kubepods-besteffort-podfee3832f_4ccf_429d_9c96_987d3f4ad55c.slice - libcontainer container kubepods-besteffort-podfee3832f_4ccf_429d_9c96_987d3f4ad55c.slice.
Mar 20 22:18:43.032852 systemd[1]: Created slice kubepods-besteffort-pod27f9c3fe_80a0_4438_93fd_b56cba5ba080.slice - libcontainer container kubepods-besteffort-pod27f9c3fe_80a0_4438_93fd_b56cba5ba080.slice. Mar 20 22:18:43.037925 systemd[1]: Created slice kubepods-besteffort-pod1ceb24d7_7f94_4c55_9df4_2c4e23376391.slice - libcontainer container kubepods-besteffort-pod1ceb24d7_7f94_4c55_9df4_2c4e23376391.slice. Mar 20 22:18:43.068384 systemd[1]: Created slice kubepods-besteffort-pod3068cdad_5d4c_43c1_adba_247b734e4e53.slice - libcontainer container kubepods-besteffort-pod3068cdad_5d4c_43c1_adba_247b734e4e53.slice. Mar 20 22:18:43.070601 containerd[1475]: time="2025-03-20T22:18:43.070572099Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-47bzn,Uid:3068cdad-5d4c-43c1-adba-247b734e4e53,Namespace:calico-system,Attempt:0,}" Mar 20 22:18:43.137993 kubelet[2812]: I0320 22:18:43.137869 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86t8w\" (UniqueName: \"kubernetes.io/projected/27f9c3fe-80a0-4438-93fd-b56cba5ba080-kube-api-access-86t8w\") pod \"calico-apiserver-cd595b98d-7cqjt\" (UID: \"27f9c3fe-80a0-4438-93fd-b56cba5ba080\") " pod="calico-apiserver/calico-apiserver-cd595b98d-7cqjt" Mar 20 22:18:43.138237 kubelet[2812]: I0320 22:18:43.138046 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bslb\" (UniqueName: \"kubernetes.io/projected/8258f556-4ff6-4fd9-a747-02759833a112-kube-api-access-5bslb\") pod \"coredns-7db6d8ff4d-vp28l\" (UID: \"8258f556-4ff6-4fd9-a747-02759833a112\") " pod="kube-system/coredns-7db6d8ff4d-vp28l" Mar 20 22:18:43.138237 kubelet[2812]: I0320 22:18:43.138151 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fee3832f-4ccf-429d-9c96-987d3f4ad55c-tigera-ca-bundle\") pod 
\"calico-kube-controllers-59b6b788d9-v7kpt\" (UID: \"fee3832f-4ccf-429d-9c96-987d3f4ad55c\") " pod="calico-system/calico-kube-controllers-59b6b788d9-v7kpt" Mar 20 22:18:43.138429 kubelet[2812]: I0320 22:18:43.138209 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8258f556-4ff6-4fd9-a747-02759833a112-config-volume\") pod \"coredns-7db6d8ff4d-vp28l\" (UID: \"8258f556-4ff6-4fd9-a747-02759833a112\") " pod="kube-system/coredns-7db6d8ff4d-vp28l" Mar 20 22:18:43.138429 kubelet[2812]: I0320 22:18:43.138316 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/27f9c3fe-80a0-4438-93fd-b56cba5ba080-calico-apiserver-certs\") pod \"calico-apiserver-cd595b98d-7cqjt\" (UID: \"27f9c3fe-80a0-4438-93fd-b56cba5ba080\") " pod="calico-apiserver/calico-apiserver-cd595b98d-7cqjt" Mar 20 22:18:43.138429 kubelet[2812]: I0320 22:18:43.138425 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85fcb\" (UniqueName: \"kubernetes.io/projected/fb0fe2e3-f8bd-4402-b2fe-cec1a2ae3ae7-kube-api-access-85fcb\") pod \"coredns-7db6d8ff4d-sg9pv\" (UID: \"fb0fe2e3-f8bd-4402-b2fe-cec1a2ae3ae7\") " pod="kube-system/coredns-7db6d8ff4d-sg9pv" Mar 20 22:18:43.139122 kubelet[2812]: I0320 22:18:43.138541 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1ceb24d7-7f94-4c55-9df4-2c4e23376391-calico-apiserver-certs\") pod \"calico-apiserver-cd595b98d-spv2p\" (UID: \"1ceb24d7-7f94-4c55-9df4-2c4e23376391\") " pod="calico-apiserver/calico-apiserver-cd595b98d-spv2p" Mar 20 22:18:43.139122 kubelet[2812]: I0320 22:18:43.138800 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sn9jw\" (UniqueName: \"kubernetes.io/projected/1ceb24d7-7f94-4c55-9df4-2c4e23376391-kube-api-access-sn9jw\") pod \"calico-apiserver-cd595b98d-spv2p\" (UID: \"1ceb24d7-7f94-4c55-9df4-2c4e23376391\") " pod="calico-apiserver/calico-apiserver-cd595b98d-spv2p" Mar 20 22:18:43.139122 kubelet[2812]: I0320 22:18:43.138901 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb0fe2e3-f8bd-4402-b2fe-cec1a2ae3ae7-config-volume\") pod \"coredns-7db6d8ff4d-sg9pv\" (UID: \"fb0fe2e3-f8bd-4402-b2fe-cec1a2ae3ae7\") " pod="kube-system/coredns-7db6d8ff4d-sg9pv" Mar 20 22:18:43.139122 kubelet[2812]: I0320 22:18:43.139014 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9sjv\" (UniqueName: \"kubernetes.io/projected/fee3832f-4ccf-429d-9c96-987d3f4ad55c-kube-api-access-m9sjv\") pod \"calico-kube-controllers-59b6b788d9-v7kpt\" (UID: \"fee3832f-4ccf-429d-9c96-987d3f4ad55c\") " pod="calico-system/calico-kube-controllers-59b6b788d9-v7kpt" Mar 20 22:18:43.602377 containerd[1475]: time="2025-03-20T22:18:43.601754908Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-vp28l,Uid:8258f556-4ff6-4fd9-a747-02759833a112,Namespace:kube-system,Attempt:0,}" Mar 20 22:18:43.618924 containerd[1475]: time="2025-03-20T22:18:43.618740109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-sg9pv,Uid:fb0fe2e3-f8bd-4402-b2fe-cec1a2ae3ae7,Namespace:kube-system,Attempt:0,}" Mar 20 22:18:43.630366 containerd[1475]: time="2025-03-20T22:18:43.630279779Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59b6b788d9-v7kpt,Uid:fee3832f-4ccf-429d-9c96-987d3f4ad55c,Namespace:calico-system,Attempt:0,}" Mar 20 22:18:43.636898 containerd[1475]: time="2025-03-20T22:18:43.636710358Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-cd595b98d-7cqjt,Uid:27f9c3fe-80a0-4438-93fd-b56cba5ba080,Namespace:calico-apiserver,Attempt:0,}" Mar 20 22:18:43.642367 containerd[1475]: time="2025-03-20T22:18:43.641984818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd595b98d-spv2p,Uid:1ceb24d7-7f94-4c55-9df4-2c4e23376391,Namespace:calico-apiserver,Attempt:0,}" Mar 20 22:18:44.040188 containerd[1475]: time="2025-03-20T22:18:44.039705419Z" level=error msg="Failed to destroy network for sandbox \"127b55fc7b049b53b78bb37e42d60dbdede770d61667ed71f8be3b7d1a192f47\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.043615 containerd[1475]: time="2025-03-20T22:18:44.043548582Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-sg9pv,Uid:fb0fe2e3-f8bd-4402-b2fe-cec1a2ae3ae7,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"127b55fc7b049b53b78bb37e42d60dbdede770d61667ed71f8be3b7d1a192f47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.043993 kubelet[2812]: E0320 22:18:44.043787 2812 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"127b55fc7b049b53b78bb37e42d60dbdede770d61667ed71f8be3b7d1a192f47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.043993 kubelet[2812]: E0320 22:18:44.043897 2812 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"127b55fc7b049b53b78bb37e42d60dbdede770d61667ed71f8be3b7d1a192f47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-sg9pv" Mar 20 22:18:44.043993 kubelet[2812]: E0320 22:18:44.043922 2812 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"127b55fc7b049b53b78bb37e42d60dbdede770d61667ed71f8be3b7d1a192f47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-sg9pv" Mar 20 22:18:44.044854 kubelet[2812]: E0320 22:18:44.043972 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-sg9pv_kube-system(fb0fe2e3-f8bd-4402-b2fe-cec1a2ae3ae7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-sg9pv_kube-system(fb0fe2e3-f8bd-4402-b2fe-cec1a2ae3ae7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"127b55fc7b049b53b78bb37e42d60dbdede770d61667ed71f8be3b7d1a192f47\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-sg9pv" podUID="fb0fe2e3-f8bd-4402-b2fe-cec1a2ae3ae7" Mar 20 22:18:44.081061 containerd[1475]: time="2025-03-20T22:18:44.080938569Z" level=error msg="Failed to destroy network for sandbox \"5f4c2c8d18795b388c80aac7e867ad7d9da019ae4f8815e367d1b8921200e2da\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 
22:18:44.083033 containerd[1475]: time="2025-03-20T22:18:44.082931229Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-47bzn,Uid:3068cdad-5d4c-43c1-adba-247b734e4e53,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f4c2c8d18795b388c80aac7e867ad7d9da019ae4f8815e367d1b8921200e2da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.083389 kubelet[2812]: E0320 22:18:44.083156 2812 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f4c2c8d18795b388c80aac7e867ad7d9da019ae4f8815e367d1b8921200e2da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.083389 kubelet[2812]: E0320 22:18:44.083216 2812 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f4c2c8d18795b388c80aac7e867ad7d9da019ae4f8815e367d1b8921200e2da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-47bzn" Mar 20 22:18:44.083389 kubelet[2812]: E0320 22:18:44.083239 2812 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f4c2c8d18795b388c80aac7e867ad7d9da019ae4f8815e367d1b8921200e2da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/csi-node-driver-47bzn" Mar 20 22:18:44.083726 kubelet[2812]: E0320 22:18:44.083286 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-47bzn_calico-system(3068cdad-5d4c-43c1-adba-247b734e4e53)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-47bzn_calico-system(3068cdad-5d4c-43c1-adba-247b734e4e53)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5f4c2c8d18795b388c80aac7e867ad7d9da019ae4f8815e367d1b8921200e2da\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-47bzn" podUID="3068cdad-5d4c-43c1-adba-247b734e4e53" Mar 20 22:18:44.087831 containerd[1475]: time="2025-03-20T22:18:44.087577259Z" level=error msg="Failed to destroy network for sandbox \"c5dbf46d0d570f99cfaa9fdd2a38b032f1732a2025bfca2d59bdeed2c3bc57d3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.090027 containerd[1475]: time="2025-03-20T22:18:44.089895851Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd595b98d-7cqjt,Uid:27f9c3fe-80a0-4438-93fd-b56cba5ba080,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5dbf46d0d570f99cfaa9fdd2a38b032f1732a2025bfca2d59bdeed2c3bc57d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.090522 kubelet[2812]: E0320 22:18:44.090142 2812 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"c5dbf46d0d570f99cfaa9fdd2a38b032f1732a2025bfca2d59bdeed2c3bc57d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.090522 kubelet[2812]: E0320 22:18:44.090204 2812 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5dbf46d0d570f99cfaa9fdd2a38b032f1732a2025bfca2d59bdeed2c3bc57d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cd595b98d-7cqjt" Mar 20 22:18:44.090522 kubelet[2812]: E0320 22:18:44.090229 2812 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5dbf46d0d570f99cfaa9fdd2a38b032f1732a2025bfca2d59bdeed2c3bc57d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cd595b98d-7cqjt" Mar 20 22:18:44.090708 kubelet[2812]: E0320 22:18:44.090276 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-cd595b98d-7cqjt_calico-apiserver(27f9c3fe-80a0-4438-93fd-b56cba5ba080)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-cd595b98d-7cqjt_calico-apiserver(27f9c3fe-80a0-4438-93fd-b56cba5ba080)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c5dbf46d0d570f99cfaa9fdd2a38b032f1732a2025bfca2d59bdeed2c3bc57d3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-cd595b98d-7cqjt" podUID="27f9c3fe-80a0-4438-93fd-b56cba5ba080" Mar 20 22:18:44.098251 containerd[1475]: time="2025-03-20T22:18:44.098003258Z" level=error msg="Failed to destroy network for sandbox \"481fd9a1ebd6a65cfeb0218e1d541a881cbaf12e50b8e54febbd333f340f5489\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.100056 containerd[1475]: time="2025-03-20T22:18:44.100019883Z" level=error msg="Failed to destroy network for sandbox \"a18142a3d397d44994eae7859448439d7846faa0bc07cae6b8ff0dd214ea6419\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.100460 containerd[1475]: time="2025-03-20T22:18:44.100422999Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59b6b788d9-v7kpt,Uid:fee3832f-4ccf-429d-9c96-987d3f4ad55c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"481fd9a1ebd6a65cfeb0218e1d541a881cbaf12e50b8e54febbd333f340f5489\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.100843 kubelet[2812]: E0320 22:18:44.100674 2812 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"481fd9a1ebd6a65cfeb0218e1d541a881cbaf12e50b8e54febbd333f340f5489\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.100843 kubelet[2812]: E0320 
22:18:44.100730 2812 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"481fd9a1ebd6a65cfeb0218e1d541a881cbaf12e50b8e54febbd333f340f5489\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-59b6b788d9-v7kpt" Mar 20 22:18:44.100843 kubelet[2812]: E0320 22:18:44.100756 2812 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"481fd9a1ebd6a65cfeb0218e1d541a881cbaf12e50b8e54febbd333f340f5489\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-59b6b788d9-v7kpt" Mar 20 22:18:44.101410 kubelet[2812]: E0320 22:18:44.100801 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-59b6b788d9-v7kpt_calico-system(fee3832f-4ccf-429d-9c96-987d3f4ad55c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-59b6b788d9-v7kpt_calico-system(fee3832f-4ccf-429d-9c96-987d3f4ad55c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"481fd9a1ebd6a65cfeb0218e1d541a881cbaf12e50b8e54febbd333f340f5489\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-59b6b788d9-v7kpt" podUID="fee3832f-4ccf-429d-9c96-987d3f4ad55c" Mar 20 22:18:44.102996 containerd[1475]: time="2025-03-20T22:18:44.102545142Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7db6d8ff4d-vp28l,Uid:8258f556-4ff6-4fd9-a747-02759833a112,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a18142a3d397d44994eae7859448439d7846faa0bc07cae6b8ff0dd214ea6419\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.103347 kubelet[2812]: E0320 22:18:44.103202 2812 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a18142a3d397d44994eae7859448439d7846faa0bc07cae6b8ff0dd214ea6419\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.103347 kubelet[2812]: E0320 22:18:44.103273 2812 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a18142a3d397d44994eae7859448439d7846faa0bc07cae6b8ff0dd214ea6419\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-vp28l" Mar 20 22:18:44.103347 kubelet[2812]: E0320 22:18:44.103297 2812 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a18142a3d397d44994eae7859448439d7846faa0bc07cae6b8ff0dd214ea6419\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-vp28l" Mar 20 22:18:44.103567 kubelet[2812]: E0320 22:18:44.103447 2812 pod_workers.go:1298] "Error syncing 
pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-vp28l_kube-system(8258f556-4ff6-4fd9-a747-02759833a112)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-vp28l_kube-system(8258f556-4ff6-4fd9-a747-02759833a112)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a18142a3d397d44994eae7859448439d7846faa0bc07cae6b8ff0dd214ea6419\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-vp28l" podUID="8258f556-4ff6-4fd9-a747-02759833a112" Mar 20 22:18:44.114289 containerd[1475]: time="2025-03-20T22:18:44.114237166Z" level=error msg="Failed to destroy network for sandbox \"704b9aba16c143f4bd4d6d815dd70f7b4432ff35aac6567c75f8f8dde158bda4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.115809 containerd[1475]: time="2025-03-20T22:18:44.115763411Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd595b98d-spv2p,Uid:1ceb24d7-7f94-4c55-9df4-2c4e23376391,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"704b9aba16c143f4bd4d6d815dd70f7b4432ff35aac6567c75f8f8dde158bda4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.116381 kubelet[2812]: E0320 22:18:44.115978 2812 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"704b9aba16c143f4bd4d6d815dd70f7b4432ff35aac6567c75f8f8dde158bda4\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:18:44.116381 kubelet[2812]: E0320 22:18:44.116029 2812 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"704b9aba16c143f4bd4d6d815dd70f7b4432ff35aac6567c75f8f8dde158bda4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cd595b98d-spv2p" Mar 20 22:18:44.116381 kubelet[2812]: E0320 22:18:44.116051 2812 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"704b9aba16c143f4bd4d6d815dd70f7b4432ff35aac6567c75f8f8dde158bda4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cd595b98d-spv2p" Mar 20 22:18:44.116573 kubelet[2812]: E0320 22:18:44.116102 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-cd595b98d-spv2p_calico-apiserver(1ceb24d7-7f94-4c55-9df4-2c4e23376391)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-cd595b98d-spv2p_calico-apiserver(1ceb24d7-7f94-4c55-9df4-2c4e23376391)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"704b9aba16c143f4bd4d6d815dd70f7b4432ff35aac6567c75f8f8dde158bda4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-cd595b98d-spv2p" podUID="1ceb24d7-7f94-4c55-9df4-2c4e23376391" Mar 20 
22:18:44.267122 containerd[1475]: time="2025-03-20T22:18:44.266674815Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 20 22:18:44.543659 systemd[1]: run-netns-cni\x2d189e5ae3\x2d9b82\x2d3830\x2dbc42\x2d9842166715ed.mount: Deactivated successfully. Mar 20 22:18:44.543959 systemd[1]: run-netns-cni\x2d7f0d8563\x2d74d2\x2d7f11\x2dca7a\x2db243d088aedb.mount: Deactivated successfully. Mar 20 22:18:44.544145 systemd[1]: run-netns-cni\x2d44bf9ec9\x2d4fb8\x2d7950\x2dd66a\x2dab770a4fde8f.mount: Deactivated successfully. Mar 20 22:18:44.544304 systemd[1]: run-netns-cni\x2db35d5668\x2d373a\x2d9a05\x2d9dd1\x2dbb35347952cb.mount: Deactivated successfully. Mar 20 22:18:52.701691 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount77325908.mount: Deactivated successfully. Mar 20 22:18:52.940891 containerd[1475]: time="2025-03-20T22:18:52.940792452Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:18:52.943658 containerd[1475]: time="2025-03-20T22:18:52.943521724Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=142241445" Mar 20 22:18:52.945702 containerd[1475]: time="2025-03-20T22:18:52.945562423Z" level=info msg="ImageCreate event name:\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:18:52.954660 containerd[1475]: time="2025-03-20T22:18:52.953259876Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:18:52.956209 containerd[1475]: time="2025-03-20T22:18:52.955570812Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\", repo tag 
\"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"142241307\" in 8.688813704s" Mar 20 22:18:52.956209 containerd[1475]: time="2025-03-20T22:18:52.955646424Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\"" Mar 20 22:18:52.993034 containerd[1475]: time="2025-03-20T22:18:52.992588459Z" level=info msg="CreateContainer within sandbox \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 20 22:18:53.025775 containerd[1475]: time="2025-03-20T22:18:53.025705536Z" level=info msg="Container 47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21: CDI devices from CRI Config.CDIDevices: []" Mar 20 22:18:53.058155 containerd[1475]: time="2025-03-20T22:18:53.058078367Z" level=info msg="CreateContainer within sandbox \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\"" Mar 20 22:18:53.059775 containerd[1475]: time="2025-03-20T22:18:53.058810460Z" level=info msg="StartContainer for \"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\"" Mar 20 22:18:53.063404 containerd[1475]: time="2025-03-20T22:18:53.063351371Z" level=info msg="connecting to shim 47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21" address="unix:///run/containerd/s/3e2209f4b12f79abe6965f00946e6dd89d1eb5e9f2b304235f25613b7f681bde" protocol=ttrpc version=3 Mar 20 22:18:53.097161 systemd[1]: Started cri-containerd-47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21.scope - libcontainer container 47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21. 
Mar 20 22:18:53.149018 containerd[1475]: time="2025-03-20T22:18:53.148604459Z" level=info msg="StartContainer for \"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\" returns successfully"
Mar 20 22:18:53.229241 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information.
Mar 20 22:18:53.229354 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved.
Mar 20 22:18:53.334114 kubelet[2812]: I0320 22:18:53.334043 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-zgtw4" podStartSLOduration=1.598944063 podStartE2EDuration="25.334028485s" podCreationTimestamp="2025-03-20 22:18:28 +0000 UTC" firstStartedPulling="2025-03-20 22:18:29.222545263 +0000 UTC m=+26.279727340" lastFinishedPulling="2025-03-20 22:18:52.957629635 +0000 UTC m=+50.014811762" observedRunningTime="2025-03-20 22:18:53.333027067 +0000 UTC m=+50.390209144" watchObservedRunningTime="2025-03-20 22:18:53.334028485 +0000 UTC m=+50.391210562"
Mar 20 22:18:53.428461 containerd[1475]: time="2025-03-20T22:18:53.428408569Z" level=info msg="TaskExit event in podsandbox handler container_id:\"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\" id:\"f5cba981e8c0d4e5be347c699c9fb973e38581f30813a99102954181f2ce467c\" pid:3840 exit_status:1 exited_at:{seconds:1742509133 nanos:427650917}"
Mar 20 22:18:54.432902 containerd[1475]: time="2025-03-20T22:18:54.432805092Z" level=info msg="TaskExit event in podsandbox handler container_id:\"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\" id:\"3b16eb1b909711865967fa67e4fd4544b2bb5fcdce408517775cb8fa3646ecb3\" pid:3877 exit_status:1 exited_at:{seconds:1742509134 nanos:432463440}"
Mar 20 22:18:54.861544 kernel: bpftool[3993]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set
Mar 20 22:18:55.066198 containerd[1475]: time="2025-03-20T22:18:55.065615745Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-47bzn,Uid:3068cdad-5d4c-43c1-adba-247b734e4e53,Namespace:calico-system,Attempt:0,}"
Mar 20 22:18:55.066982 containerd[1475]: time="2025-03-20T22:18:55.066701903Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd595b98d-7cqjt,Uid:27f9c3fe-80a0-4438-93fd-b56cba5ba080,Namespace:calico-apiserver,Attempt:0,}"
Mar 20 22:18:55.069282 containerd[1475]: time="2025-03-20T22:18:55.068756918Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd595b98d-spv2p,Uid:1ceb24d7-7f94-4c55-9df4-2c4e23376391,Namespace:calico-apiserver,Attempt:0,}"
Mar 20 22:18:55.251634 systemd-networkd[1384]: vxlan.calico: Link UP
Mar 20 22:18:55.251644 systemd-networkd[1384]: vxlan.calico: Gained carrier
Mar 20 22:18:55.403010 systemd-networkd[1384]: cali93408e77350: Link UP
Mar 20 22:18:55.403427 systemd-networkd[1384]: cali93408e77350: Gained carrier
Mar 20 22:18:55.434711 containerd[1475]: 2025-03-20 22:18:55.218 [INFO][4013] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--7cqjt-eth0 calico-apiserver-cd595b98d- calico-apiserver 27f9c3fe-80a0-4438-93fd-b56cba5ba080 742 0 2025-03-20 22:18:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:cd595b98d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-9999-0-2-b-c50fddf147.novalocal calico-apiserver-cd595b98d-7cqjt eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali93408e77350 [] []}} ContainerID="b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" Namespace="calico-apiserver" Pod="calico-apiserver-cd595b98d-7cqjt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--7cqjt-"
Mar 20 22:18:55.434711 containerd[1475]: 2025-03-20 22:18:55.219 [INFO][4013] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" Namespace="calico-apiserver" Pod="calico-apiserver-cd595b98d-7cqjt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--7cqjt-eth0"
Mar 20 22:18:55.434711 containerd[1475]: 2025-03-20 22:18:55.324 [INFO][4071] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" HandleID="k8s-pod-network.b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--7cqjt-eth0"
Mar 20 22:18:55.436703 containerd[1475]: 2025-03-20 22:18:55.350 [INFO][4071] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" HandleID="k8s-pod-network.b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--7cqjt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0005e21e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-9999-0-2-b-c50fddf147.novalocal", "pod":"calico-apiserver-cd595b98d-7cqjt", "timestamp":"2025-03-20 22:18:55.324884872 +0000 UTC"}, Hostname:"ci-9999-0-2-b-c50fddf147.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 20 22:18:55.436703 containerd[1475]: 2025-03-20 22:18:55.350 [INFO][4071] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 20 22:18:55.436703 containerd[1475]: 2025-03-20 22:18:55.350 [INFO][4071] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 20 22:18:55.436703 containerd[1475]: 2025-03-20 22:18:55.350 [INFO][4071] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-2-b-c50fddf147.novalocal'
Mar 20 22:18:55.436703 containerd[1475]: 2025-03-20 22:18:55.358 [INFO][4071] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.436703 containerd[1475]: 2025-03-20 22:18:55.364 [INFO][4071] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.436703 containerd[1475]: 2025-03-20 22:18:55.369 [INFO][4071] ipam/ipam.go 489: Trying affinity for 192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.436703 containerd[1475]: 2025-03-20 22:18:55.371 [INFO][4071] ipam/ipam.go 155: Attempting to load block cidr=192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.436703 containerd[1475]: 2025-03-20 22:18:55.374 [INFO][4071] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.436962 containerd[1475]: 2025-03-20 22:18:55.374 [INFO][4071] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.50.64/26 handle="k8s-pod-network.b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.436962 containerd[1475]: 2025-03-20 22:18:55.376 [INFO][4071] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038
Mar 20 22:18:55.436962 containerd[1475]: 2025-03-20 22:18:55.380 [INFO][4071] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.50.64/26 handle="k8s-pod-network.b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.436962 containerd[1475]: 2025-03-20 22:18:55.388 [INFO][4071] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.50.65/26] block=192.168.50.64/26 handle="k8s-pod-network.b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.436962 containerd[1475]: 2025-03-20 22:18:55.388 [INFO][4071] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.50.65/26] handle="k8s-pod-network.b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.436962 containerd[1475]: 2025-03-20 22:18:55.388 [INFO][4071] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 20 22:18:55.436962 containerd[1475]: 2025-03-20 22:18:55.388 [INFO][4071] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.65/26] IPv6=[] ContainerID="b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" HandleID="k8s-pod-network.b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--7cqjt-eth0"
Mar 20 22:18:55.437231 containerd[1475]: 2025-03-20 22:18:55.394 [INFO][4013] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" Namespace="calico-apiserver" Pod="calico-apiserver-cd595b98d-7cqjt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--7cqjt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--7cqjt-eth0", GenerateName:"calico-apiserver-cd595b98d-", Namespace:"calico-apiserver", SelfLink:"", UID:"27f9c3fe-80a0-4438-93fd-b56cba5ba080", ResourceVersion:"742", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 18, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cd595b98d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-b-c50fddf147.novalocal", ContainerID:"", Pod:"calico-apiserver-cd595b98d-7cqjt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.50.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali93408e77350", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:18:55.437301 containerd[1475]: 2025-03-20 22:18:55.394 [INFO][4013] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.50.65/32] ContainerID="b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" Namespace="calico-apiserver" Pod="calico-apiserver-cd595b98d-7cqjt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--7cqjt-eth0"
Mar 20 22:18:55.437301 containerd[1475]: 2025-03-20 22:18:55.394 [INFO][4013] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali93408e77350 ContainerID="b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" Namespace="calico-apiserver" Pod="calico-apiserver-cd595b98d-7cqjt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--7cqjt-eth0"
Mar 20 22:18:55.437301 containerd[1475]: 2025-03-20 22:18:55.403 [INFO][4013] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" Namespace="calico-apiserver" Pod="calico-apiserver-cd595b98d-7cqjt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--7cqjt-eth0"
Mar 20 22:18:55.437386 containerd[1475]: 2025-03-20 22:18:55.405 [INFO][4013] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" Namespace="calico-apiserver" Pod="calico-apiserver-cd595b98d-7cqjt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--7cqjt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--7cqjt-eth0", GenerateName:"calico-apiserver-cd595b98d-", Namespace:"calico-apiserver", SelfLink:"", UID:"27f9c3fe-80a0-4438-93fd-b56cba5ba080", ResourceVersion:"742", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 18, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cd595b98d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-b-c50fddf147.novalocal", ContainerID:"b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038", Pod:"calico-apiserver-cd595b98d-7cqjt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.50.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali93408e77350", MAC:"66:58:22:bd:d9:06", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:18:55.437454 containerd[1475]: 2025-03-20 22:18:55.430 [INFO][4013] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" Namespace="calico-apiserver" Pod="calico-apiserver-cd595b98d-7cqjt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--7cqjt-eth0"
Mar 20 22:18:55.465616 systemd-networkd[1384]: cali43df655fe46: Link UP
Mar 20 22:18:55.467765 systemd-networkd[1384]: cali43df655fe46: Gained carrier
Mar 20 22:18:55.500444 containerd[1475]: 2025-03-20 22:18:55.219 [INFO][4019] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--2--b--c50fddf147.novalocal-k8s-csi--node--driver--47bzn-eth0 csi-node-driver- calico-system 3068cdad-5d4c-43c1-adba-247b734e4e53 599 0 2025-03-20 22:18:28 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:69ddf5d45d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-9999-0-2-b-c50fddf147.novalocal csi-node-driver-47bzn eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali43df655fe46 [] []}} ContainerID="605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" Namespace="calico-system" Pod="csi-node-driver-47bzn" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-csi--node--driver--47bzn-"
Mar 20 22:18:55.500444 containerd[1475]: 2025-03-20 22:18:55.219 [INFO][4019] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" Namespace="calico-system" Pod="csi-node-driver-47bzn" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-csi--node--driver--47bzn-eth0"
Mar 20 22:18:55.500444 containerd[1475]: 2025-03-20 22:18:55.317 [INFO][4064] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" HandleID="k8s-pod-network.605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-csi--node--driver--47bzn-eth0"
Mar 20 22:18:55.500783 containerd[1475]: 2025-03-20 22:18:55.360 [INFO][4064] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" HandleID="k8s-pod-network.605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-csi--node--driver--47bzn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000264ff0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-0-2-b-c50fddf147.novalocal", "pod":"csi-node-driver-47bzn", "timestamp":"2025-03-20 22:18:55.317327833 +0000 UTC"}, Hostname:"ci-9999-0-2-b-c50fddf147.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 20 22:18:55.500783 containerd[1475]: 2025-03-20 22:18:55.360 [INFO][4064] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 20 22:18:55.500783 containerd[1475]: 2025-03-20 22:18:55.388 [INFO][4064] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 20 22:18:55.500783 containerd[1475]: 2025-03-20 22:18:55.390 [INFO][4064] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-2-b-c50fddf147.novalocal'
Mar 20 22:18:55.500783 containerd[1475]: 2025-03-20 22:18:55.392 [INFO][4064] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.500783 containerd[1475]: 2025-03-20 22:18:55.398 [INFO][4064] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.500783 containerd[1475]: 2025-03-20 22:18:55.410 [INFO][4064] ipam/ipam.go 489: Trying affinity for 192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.500783 containerd[1475]: 2025-03-20 22:18:55.418 [INFO][4064] ipam/ipam.go 155: Attempting to load block cidr=192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.500783 containerd[1475]: 2025-03-20 22:18:55.422 [INFO][4064] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.501027 containerd[1475]: 2025-03-20 22:18:55.422 [INFO][4064] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.50.64/26 handle="k8s-pod-network.605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.501027 containerd[1475]: 2025-03-20 22:18:55.425 [INFO][4064] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a
Mar 20 22:18:55.501027 containerd[1475]: 2025-03-20 22:18:55.440 [INFO][4064] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.50.64/26 handle="k8s-pod-network.605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.501027 containerd[1475]: 2025-03-20 22:18:55.455 [INFO][4064] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.50.66/26] block=192.168.50.64/26 handle="k8s-pod-network.605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.501027 containerd[1475]: 2025-03-20 22:18:55.456 [INFO][4064] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.50.66/26] handle="k8s-pod-network.605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.501027 containerd[1475]: 2025-03-20 22:18:55.457 [INFO][4064] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 20 22:18:55.501027 containerd[1475]: 2025-03-20 22:18:55.457 [INFO][4064] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.66/26] IPv6=[] ContainerID="605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" HandleID="k8s-pod-network.605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-csi--node--driver--47bzn-eth0"
Mar 20 22:18:55.501194 containerd[1475]: 2025-03-20 22:18:55.461 [INFO][4019] cni-plugin/k8s.go 386: Populated endpoint ContainerID="605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" Namespace="calico-system" Pod="csi-node-driver-47bzn" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-csi--node--driver--47bzn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--b--c50fddf147.novalocal-k8s-csi--node--driver--47bzn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3068cdad-5d4c-43c1-adba-247b734e4e53", ResourceVersion:"599", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 18, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-b-c50fddf147.novalocal", ContainerID:"", Pod:"csi-node-driver-47bzn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.50.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali43df655fe46", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:18:55.501266 containerd[1475]: 2025-03-20 22:18:55.461 [INFO][4019] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.50.66/32] ContainerID="605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" Namespace="calico-system" Pod="csi-node-driver-47bzn" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-csi--node--driver--47bzn-eth0"
Mar 20 22:18:55.501266 containerd[1475]: 2025-03-20 22:18:55.461 [INFO][4019] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali43df655fe46 ContainerID="605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" Namespace="calico-system" Pod="csi-node-driver-47bzn" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-csi--node--driver--47bzn-eth0"
Mar 20 22:18:55.501266 containerd[1475]: 2025-03-20 22:18:55.470 [INFO][4019] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" Namespace="calico-system" Pod="csi-node-driver-47bzn" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-csi--node--driver--47bzn-eth0"
Mar 20 22:18:55.503025 containerd[1475]: 2025-03-20 22:18:55.471 [INFO][4019] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" Namespace="calico-system" Pod="csi-node-driver-47bzn" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-csi--node--driver--47bzn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--b--c50fddf147.novalocal-k8s-csi--node--driver--47bzn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3068cdad-5d4c-43c1-adba-247b734e4e53", ResourceVersion:"599", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 18, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-b-c50fddf147.novalocal", ContainerID:"605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a", Pod:"csi-node-driver-47bzn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.50.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali43df655fe46", MAC:"72:59:8c:1a:a9:82", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:18:55.503151 containerd[1475]: 2025-03-20 22:18:55.496 [INFO][4019] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" Namespace="calico-system" Pod="csi-node-driver-47bzn" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-csi--node--driver--47bzn-eth0"
Mar 20 22:18:55.534248 systemd-networkd[1384]: cali2672045a641: Link UP
Mar 20 22:18:55.535638 systemd-networkd[1384]: cali2672045a641: Gained carrier
Mar 20 22:18:55.573233 containerd[1475]: 2025-03-20 22:18:55.219 [INFO][4027] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--spv2p-eth0 calico-apiserver-cd595b98d- calico-apiserver 1ceb24d7-7f94-4c55-9df4-2c4e23376391 743 0 2025-03-20 22:18:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:cd595b98d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-9999-0-2-b-c50fddf147.novalocal calico-apiserver-cd595b98d-spv2p eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2672045a641 [] []}} ContainerID="52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" Namespace="calico-apiserver" Pod="calico-apiserver-cd595b98d-spv2p" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--spv2p-"
Mar 20 22:18:55.573233 containerd[1475]: 2025-03-20 22:18:55.219 [INFO][4027] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" Namespace="calico-apiserver" Pod="calico-apiserver-cd595b98d-spv2p" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--spv2p-eth0"
Mar 20 22:18:55.573233 containerd[1475]: 2025-03-20 22:18:55.346 [INFO][4069] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" HandleID="k8s-pod-network.52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--spv2p-eth0"
Mar 20 22:18:55.573496 containerd[1475]: 2025-03-20 22:18:55.364 [INFO][4069] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" HandleID="k8s-pod-network.52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--spv2p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003bd2b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-9999-0-2-b-c50fddf147.novalocal", "pod":"calico-apiserver-cd595b98d-spv2p", "timestamp":"2025-03-20 22:18:55.346532356 +0000 UTC"}, Hostname:"ci-9999-0-2-b-c50fddf147.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 20 22:18:55.573496 containerd[1475]: 2025-03-20 22:18:55.365 [INFO][4069] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 20 22:18:55.573496 containerd[1475]: 2025-03-20 22:18:55.457 [INFO][4069] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 20 22:18:55.573496 containerd[1475]: 2025-03-20 22:18:55.457 [INFO][4069] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-2-b-c50fddf147.novalocal'
Mar 20 22:18:55.573496 containerd[1475]: 2025-03-20 22:18:55.461 [INFO][4069] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.573496 containerd[1475]: 2025-03-20 22:18:55.470 [INFO][4069] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.573496 containerd[1475]: 2025-03-20 22:18:55.483 [INFO][4069] ipam/ipam.go 489: Trying affinity for 192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.573496 containerd[1475]: 2025-03-20 22:18:55.492 [INFO][4069] ipam/ipam.go 155: Attempting to load block cidr=192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.573496 containerd[1475]: 2025-03-20 22:18:55.502 [INFO][4069] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.573775 containerd[1475]: 2025-03-20 22:18:55.504 [INFO][4069] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.50.64/26 handle="k8s-pod-network.52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.573775 containerd[1475]: 2025-03-20 22:18:55.508 [INFO][4069] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360
Mar 20 22:18:55.573775 containerd[1475]: 2025-03-20 22:18:55.516 [INFO][4069] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.50.64/26 handle="k8s-pod-network.52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.573775 containerd[1475]: 2025-03-20 22:18:55.526 [INFO][4069] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.50.67/26] block=192.168.50.64/26 handle="k8s-pod-network.52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.573775 containerd[1475]: 2025-03-20 22:18:55.526 [INFO][4069] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.50.67/26] handle="k8s-pod-network.52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:55.573775 containerd[1475]: 2025-03-20 22:18:55.526 [INFO][4069] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 20 22:18:55.573775 containerd[1475]: 2025-03-20 22:18:55.526 [INFO][4069] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.67/26] IPv6=[] ContainerID="52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" HandleID="k8s-pod-network.52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--spv2p-eth0"
Mar 20 22:18:55.574015 containerd[1475]: 2025-03-20 22:18:55.530 [INFO][4027] cni-plugin/k8s.go 386: Populated endpoint ContainerID="52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" Namespace="calico-apiserver" Pod="calico-apiserver-cd595b98d-spv2p" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--spv2p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--spv2p-eth0", GenerateName:"calico-apiserver-cd595b98d-", Namespace:"calico-apiserver", SelfLink:"", UID:"1ceb24d7-7f94-4c55-9df4-2c4e23376391", ResourceVersion:"743", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 18, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cd595b98d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-b-c50fddf147.novalocal", ContainerID:"", Pod:"calico-apiserver-cd595b98d-spv2p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.50.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2672045a641", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:18:55.574092 containerd[1475]: 2025-03-20 22:18:55.530 [INFO][4027] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.50.67/32] ContainerID="52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" Namespace="calico-apiserver" Pod="calico-apiserver-cd595b98d-spv2p" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--spv2p-eth0"
Mar 20 22:18:55.574092 containerd[1475]: 2025-03-20 22:18:55.530 [INFO][4027] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2672045a641 ContainerID="52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" Namespace="calico-apiserver" Pod="calico-apiserver-cd595b98d-spv2p" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--spv2p-eth0"
Mar 20 22:18:55.574092 containerd[1475]: 2025-03-20 22:18:55.532 [INFO][4027] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" Namespace="calico-apiserver" Pod="calico-apiserver-cd595b98d-spv2p" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--spv2p-eth0"
Mar 20 22:18:55.574181 containerd[1475]: 2025-03-20 22:18:55.533 [INFO][4027] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" Namespace="calico-apiserver" Pod="calico-apiserver-cd595b98d-spv2p" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--spv2p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--spv2p-eth0", GenerateName:"calico-apiserver-cd595b98d-", Namespace:"calico-apiserver", SelfLink:"", UID:"1ceb24d7-7f94-4c55-9df4-2c4e23376391", ResourceVersion:"743", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 18, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cd595b98d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-b-c50fddf147.novalocal", ContainerID:"52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360", Pod:"calico-apiserver-cd595b98d-spv2p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.50.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2672045a641", MAC:"22:90:fe:17:23:5a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:18:55.574258 containerd[1475]: 2025-03-20 22:18:55.562 [INFO][4027] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" Namespace="calico-apiserver" Pod="calico-apiserver-cd595b98d-spv2p" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--apiserver--cd595b98d--spv2p-eth0"
Mar 20 22:18:55.581366 containerd[1475]: time="2025-03-20T22:18:55.579160719Z" level=info msg="connecting to shim 605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a" address="unix:///run/containerd/s/191e1846d104b4f226d7971ea37440b3bac5d3053758eb03f7600ec1ef07da65" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:18:55.581366 containerd[1475]: time="2025-03-20T22:18:55.579467204Z" level=info msg="connecting to shim b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038" address="unix:///run/containerd/s/6b51f6692ac6b7c0ad0d44e25ef559618e57adca98937c5f09c5987a55909202" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:18:55.621499 containerd[1475]: time="2025-03-20T22:18:55.620349576Z" level=info msg="connecting to shim 52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360" address="unix:///run/containerd/s/efde4775582442544b2ece4cbc4a819b88406f32b39e8f619731f213dd78225a" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:18:55.660200 systemd[1]: Started cri-containerd-b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038.scope - libcontainer container b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038.
Mar 20 22:18:55.669716 systemd[1]: Started cri-containerd-605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a.scope - libcontainer container 605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a. Mar 20 22:18:55.677394 systemd[1]: Started cri-containerd-52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360.scope - libcontainer container 52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360. Mar 20 22:18:55.735719 containerd[1475]: time="2025-03-20T22:18:55.735651544Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-47bzn,Uid:3068cdad-5d4c-43c1-adba-247b734e4e53,Namespace:calico-system,Attempt:0,} returns sandbox id \"605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a\"" Mar 20 22:18:55.770229 containerd[1475]: time="2025-03-20T22:18:55.770014957Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 20 22:18:55.805592 containerd[1475]: time="2025-03-20T22:18:55.805148524Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd595b98d-7cqjt,Uid:27f9c3fe-80a0-4438-93fd-b56cba5ba080,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038\"" Mar 20 22:18:55.817644 containerd[1475]: time="2025-03-20T22:18:55.817460112Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd595b98d-spv2p,Uid:1ceb24d7-7f94-4c55-9df4-2c4e23376391,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360\"" Mar 20 22:18:56.063191 containerd[1475]: time="2025-03-20T22:18:56.062931648Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-sg9pv,Uid:fb0fe2e3-f8bd-4402-b2fe-cec1a2ae3ae7,Namespace:kube-system,Attempt:0,}" Mar 20 22:18:56.260701 systemd-networkd[1384]: calib92d99c7931: Link UP Mar 20 22:18:56.261864 systemd-networkd[1384]: calib92d99c7931: Gained carrier Mar 20 
22:18:56.282506 containerd[1475]: 2025-03-20 22:18:56.128 [INFO][4309] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--sg9pv-eth0 coredns-7db6d8ff4d- kube-system fb0fe2e3-f8bd-4402-b2fe-cec1a2ae3ae7 744 0 2025-03-20 22:18:17 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-9999-0-2-b-c50fddf147.novalocal coredns-7db6d8ff4d-sg9pv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib92d99c7931 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" Namespace="kube-system" Pod="coredns-7db6d8ff4d-sg9pv" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--sg9pv-" Mar 20 22:18:56.282506 containerd[1475]: 2025-03-20 22:18:56.129 [INFO][4309] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" Namespace="kube-system" Pod="coredns-7db6d8ff4d-sg9pv" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--sg9pv-eth0" Mar 20 22:18:56.282506 containerd[1475]: 2025-03-20 22:18:56.168 [INFO][4321] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" HandleID="k8s-pod-network.6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--sg9pv-eth0" Mar 20 22:18:56.282761 containerd[1475]: 2025-03-20 22:18:56.184 [INFO][4321] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" 
HandleID="k8s-pod-network.6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--sg9pv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031b590), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-9999-0-2-b-c50fddf147.novalocal", "pod":"coredns-7db6d8ff4d-sg9pv", "timestamp":"2025-03-20 22:18:56.16871847 +0000 UTC"}, Hostname:"ci-9999-0-2-b-c50fddf147.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 22:18:56.282761 containerd[1475]: 2025-03-20 22:18:56.184 [INFO][4321] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 22:18:56.282761 containerd[1475]: 2025-03-20 22:18:56.184 [INFO][4321] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 20 22:18:56.282761 containerd[1475]: 2025-03-20 22:18:56.184 [INFO][4321] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-2-b-c50fddf147.novalocal' Mar 20 22:18:56.282761 containerd[1475]: 2025-03-20 22:18:56.187 [INFO][4321] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:56.282761 containerd[1475]: 2025-03-20 22:18:56.194 [INFO][4321] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:56.282761 containerd[1475]: 2025-03-20 22:18:56.204 [INFO][4321] ipam/ipam.go 489: Trying affinity for 192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:56.282761 containerd[1475]: 2025-03-20 22:18:56.208 [INFO][4321] ipam/ipam.go 155: Attempting to load block cidr=192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:56.282761 containerd[1475]: 2025-03-20 
22:18:56.213 [INFO][4321] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:56.282999 containerd[1475]: 2025-03-20 22:18:56.213 [INFO][4321] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.50.64/26 handle="k8s-pod-network.6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:56.282999 containerd[1475]: 2025-03-20 22:18:56.216 [INFO][4321] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6 Mar 20 22:18:56.282999 containerd[1475]: 2025-03-20 22:18:56.224 [INFO][4321] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.50.64/26 handle="k8s-pod-network.6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:56.282999 containerd[1475]: 2025-03-20 22:18:56.237 [INFO][4321] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.50.68/26] block=192.168.50.64/26 handle="k8s-pod-network.6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:56.282999 containerd[1475]: 2025-03-20 22:18:56.237 [INFO][4321] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.50.68/26] handle="k8s-pod-network.6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:56.282999 containerd[1475]: 2025-03-20 22:18:56.238 [INFO][4321] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 20 22:18:56.282999 containerd[1475]: 2025-03-20 22:18:56.238 [INFO][4321] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.68/26] IPv6=[] ContainerID="6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" HandleID="k8s-pod-network.6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--sg9pv-eth0" Mar 20 22:18:56.283166 containerd[1475]: 2025-03-20 22:18:56.245 [INFO][4309] cni-plugin/k8s.go 386: Populated endpoint ContainerID="6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" Namespace="kube-system" Pod="coredns-7db6d8ff4d-sg9pv" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--sg9pv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--sg9pv-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"fb0fe2e3-f8bd-4402-b2fe-cec1a2ae3ae7", ResourceVersion:"744", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 18, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-b-c50fddf147.novalocal", ContainerID:"", Pod:"coredns-7db6d8ff4d-sg9pv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.50.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", 
"ksa.kube-system.coredns"}, InterfaceName:"calib92d99c7931", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 22:18:56.283166 containerd[1475]: 2025-03-20 22:18:56.245 [INFO][4309] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.50.68/32] ContainerID="6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" Namespace="kube-system" Pod="coredns-7db6d8ff4d-sg9pv" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--sg9pv-eth0" Mar 20 22:18:56.283166 containerd[1475]: 2025-03-20 22:18:56.245 [INFO][4309] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib92d99c7931 ContainerID="6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" Namespace="kube-system" Pod="coredns-7db6d8ff4d-sg9pv" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--sg9pv-eth0" Mar 20 22:18:56.283166 containerd[1475]: 2025-03-20 22:18:56.262 [INFO][4309] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" Namespace="kube-system" Pod="coredns-7db6d8ff4d-sg9pv" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--sg9pv-eth0" Mar 20 22:18:56.283166 containerd[1475]: 2025-03-20 22:18:56.263 [INFO][4309] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" Namespace="kube-system" Pod="coredns-7db6d8ff4d-sg9pv" 
WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--sg9pv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--sg9pv-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"fb0fe2e3-f8bd-4402-b2fe-cec1a2ae3ae7", ResourceVersion:"744", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 18, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-b-c50fddf147.novalocal", ContainerID:"6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6", Pod:"coredns-7db6d8ff4d-sg9pv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.50.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib92d99c7931", MAC:"ea:a0:94:cd:bf:56", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 22:18:56.283166 containerd[1475]: 
2025-03-20 22:18:56.280 [INFO][4309] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" Namespace="kube-system" Pod="coredns-7db6d8ff4d-sg9pv" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--sg9pv-eth0" Mar 20 22:18:56.334926 containerd[1475]: time="2025-03-20T22:18:56.334789272Z" level=info msg="connecting to shim 6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6" address="unix:///run/containerd/s/27f575125016590733611f0c3b84c93ff6334c3eaedbf6a0346fdbc32a4ace5f" namespace=k8s.io protocol=ttrpc version=3 Mar 20 22:18:56.378690 systemd[1]: Started cri-containerd-6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6.scope - libcontainer container 6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6. Mar 20 22:18:56.380722 systemd-networkd[1384]: vxlan.calico: Gained IPv6LL Mar 20 22:18:56.444613 containerd[1475]: time="2025-03-20T22:18:56.444574444Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-sg9pv,Uid:fb0fe2e3-f8bd-4402-b2fe-cec1a2ae3ae7,Namespace:kube-system,Attempt:0,} returns sandbox id \"6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6\"" Mar 20 22:18:56.461567 containerd[1475]: time="2025-03-20T22:18:56.461522922Z" level=info msg="CreateContainer within sandbox \"6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 20 22:18:56.481168 containerd[1475]: time="2025-03-20T22:18:56.481026583Z" level=info msg="Container 8b89f233022436ebdb4f9115f5077396fec4d8b9e8cd4834340d70bf61c99c66: CDI devices from CRI Config.CDIDevices: []" Mar 20 22:18:56.484376 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount43823156.mount: Deactivated successfully. 
Mar 20 22:18:56.494344 containerd[1475]: time="2025-03-20T22:18:56.494220287Z" level=info msg="CreateContainer within sandbox \"6fe9d8ea8ec3dc9ed24ab485afe9e0eb7c5bf998e5b07bde8a5bf7e9748feea6\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8b89f233022436ebdb4f9115f5077396fec4d8b9e8cd4834340d70bf61c99c66\"" Mar 20 22:18:56.495062 containerd[1475]: time="2025-03-20T22:18:56.494966376Z" level=info msg="StartContainer for \"8b89f233022436ebdb4f9115f5077396fec4d8b9e8cd4834340d70bf61c99c66\"" Mar 20 22:18:56.497091 containerd[1475]: time="2025-03-20T22:18:56.496888602Z" level=info msg="connecting to shim 8b89f233022436ebdb4f9115f5077396fec4d8b9e8cd4834340d70bf61c99c66" address="unix:///run/containerd/s/27f575125016590733611f0c3b84c93ff6334c3eaedbf6a0346fdbc32a4ace5f" protocol=ttrpc version=3 Mar 20 22:18:56.525633 systemd[1]: Started cri-containerd-8b89f233022436ebdb4f9115f5077396fec4d8b9e8cd4834340d70bf61c99c66.scope - libcontainer container 8b89f233022436ebdb4f9115f5077396fec4d8b9e8cd4834340d70bf61c99c66. 
Mar 20 22:18:56.560058 containerd[1475]: time="2025-03-20T22:18:56.560002884Z" level=info msg="StartContainer for \"8b89f233022436ebdb4f9115f5077396fec4d8b9e8cd4834340d70bf61c99c66\" returns successfully" Mar 20 22:18:56.956942 systemd-networkd[1384]: cali2672045a641: Gained IPv6LL Mar 20 22:18:57.020833 systemd-networkd[1384]: cali43df655fe46: Gained IPv6LL Mar 20 22:18:57.065534 containerd[1475]: time="2025-03-20T22:18:57.063857342Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-vp28l,Uid:8258f556-4ff6-4fd9-a747-02759833a112,Namespace:kube-system,Attempt:0,}" Mar 20 22:18:57.071297 containerd[1475]: time="2025-03-20T22:18:57.071225296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59b6b788d9-v7kpt,Uid:fee3832f-4ccf-429d-9c96-987d3f4ad55c,Namespace:calico-system,Attempt:0,}" Mar 20 22:18:57.290555 systemd-networkd[1384]: calia5bb964c134: Link UP Mar 20 22:18:57.291747 systemd-networkd[1384]: calia5bb964c134: Gained carrier Mar 20 22:18:57.404611 systemd-networkd[1384]: cali93408e77350: Gained IPv6LL Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.169 [INFO][4423] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0 calico-kube-controllers-59b6b788d9- calico-system fee3832f-4ccf-429d-9c96-987d3f4ad55c 741 0 2025-03-20 22:18:28 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:59b6b788d9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-9999-0-2-b-c50fddf147.novalocal calico-kube-controllers-59b6b788d9-v7kpt eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calia5bb964c134 [] []}} 
ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Namespace="calico-system" Pod="calico-kube-controllers-59b6b788d9-v7kpt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-" Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.169 [INFO][4423] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Namespace="calico-system" Pod="calico-kube-controllers-59b6b788d9-v7kpt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0" Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.221 [INFO][4448] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" HandleID="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0" Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.235 [INFO][4448] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" HandleID="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000291110), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-0-2-b-c50fddf147.novalocal", "pod":"calico-kube-controllers-59b6b788d9-v7kpt", "timestamp":"2025-03-20 22:18:57.221034142 +0000 UTC"}, Hostname:"ci-9999-0-2-b-c50fddf147.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.235 [INFO][4448] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.235 [INFO][4448] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.235 [INFO][4448] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-2-b-c50fddf147.novalocal' Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.238 [INFO][4448] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.243 [INFO][4448] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.248 [INFO][4448] ipam/ipam.go 489: Trying affinity for 192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.250 [INFO][4448] ipam/ipam.go 155: Attempting to load block cidr=192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.253 [INFO][4448] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.253 [INFO][4448] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.50.64/26 handle="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.255 [INFO][4448] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596 Mar 20 
22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.265 [INFO][4448] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.50.64/26 handle="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.282 [INFO][4448] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.50.69/26] block=192.168.50.64/26 handle="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.282 [INFO][4448] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.50.69/26] handle="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.283 [INFO][4448] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 22:18:57.448343 containerd[1475]: 2025-03-20 22:18:57.283 [INFO][4448] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.69/26] IPv6=[] ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" HandleID="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0" Mar 20 22:18:57.450889 containerd[1475]: 2025-03-20 22:18:57.286 [INFO][4423] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Namespace="calico-system" Pod="calico-kube-controllers-59b6b788d9-v7kpt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0", GenerateName:"calico-kube-controllers-59b6b788d9-", Namespace:"calico-system", SelfLink:"", UID:"fee3832f-4ccf-429d-9c96-987d3f4ad55c", ResourceVersion:"741", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 18, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"59b6b788d9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-b-c50fddf147.novalocal", ContainerID:"", Pod:"calico-kube-controllers-59b6b788d9-v7kpt", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.50.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calia5bb964c134", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 22:18:57.450889 containerd[1475]: 2025-03-20 22:18:57.286 [INFO][4423] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.50.69/32] ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Namespace="calico-system" Pod="calico-kube-controllers-59b6b788d9-v7kpt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0" Mar 20 22:18:57.450889 containerd[1475]: 2025-03-20 22:18:57.286 [INFO][4423] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia5bb964c134 
ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Namespace="calico-system" Pod="calico-kube-controllers-59b6b788d9-v7kpt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0"
Mar 20 22:18:57.450889 containerd[1475]: 2025-03-20 22:18:57.293 [INFO][4423] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Namespace="calico-system" Pod="calico-kube-controllers-59b6b788d9-v7kpt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0"
Mar 20 22:18:57.450889 containerd[1475]: 2025-03-20 22:18:57.293 [INFO][4423] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Namespace="calico-system" Pod="calico-kube-controllers-59b6b788d9-v7kpt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0", GenerateName:"calico-kube-controllers-59b6b788d9-", Namespace:"calico-system", SelfLink:"", UID:"fee3832f-4ccf-429d-9c96-987d3f4ad55c", ResourceVersion:"741", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 18, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"59b6b788d9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-b-c50fddf147.novalocal", ContainerID:"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596", Pod:"calico-kube-controllers-59b6b788d9-v7kpt", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.50.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calia5bb964c134", MAC:"d2:84:4a:41:92:d6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:18:57.450889 containerd[1475]: 2025-03-20 22:18:57.445 [INFO][4423] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Namespace="calico-system" Pod="calico-kube-controllers-59b6b788d9-v7kpt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0"
Mar 20 22:18:57.523534 systemd-networkd[1384]: cali4338c116557: Link UP
Mar 20 22:18:57.524418 systemd-networkd[1384]: cali4338c116557: Gained carrier
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.181 [INFO][4421] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--vp28l-eth0 coredns-7db6d8ff4d- kube-system 8258f556-4ff6-4fd9-a747-02759833a112 735 0 2025-03-20 22:18:17 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-9999-0-2-b-c50fddf147.novalocal coredns-7db6d8ff4d-vp28l eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4338c116557 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vp28l" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--vp28l-"
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.182 [INFO][4421] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vp28l" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--vp28l-eth0"
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.224 [INFO][4453] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" HandleID="k8s-pod-network.5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--vp28l-eth0"
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.237 [INFO][4453] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" HandleID="k8s-pod-network.5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--vp28l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000290b20), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-9999-0-2-b-c50fddf147.novalocal", "pod":"coredns-7db6d8ff4d-vp28l", "timestamp":"2025-03-20 22:18:57.224430253 +0000 UTC"}, Hostname:"ci-9999-0-2-b-c50fddf147.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.237 [INFO][4453] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.283 [INFO][4453] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.283 [INFO][4453] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-2-b-c50fddf147.novalocal'
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.286 [INFO][4453] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.296 [INFO][4453] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.443 [INFO][4453] ipam/ipam.go 489: Trying affinity for 192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.450 [INFO][4453] ipam/ipam.go 155: Attempting to load block cidr=192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.456 [INFO][4453] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.456 [INFO][4453] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.50.64/26 handle="k8s-pod-network.5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.459 [INFO][4453] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.480 [INFO][4453] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.50.64/26 handle="k8s-pod-network.5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.517 [INFO][4453] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.50.70/26] block=192.168.50.64/26 handle="k8s-pod-network.5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.517 [INFO][4453] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.50.70/26] handle="k8s-pod-network.5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" host="ci-9999-0-2-b-c50fddf147.novalocal"
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.517 [INFO][4453] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 20 22:18:57.620556 containerd[1475]: 2025-03-20 22:18:57.517 [INFO][4453] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.70/26] IPv6=[] ContainerID="5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" HandleID="k8s-pod-network.5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--vp28l-eth0"
Mar 20 22:18:57.634389 containerd[1475]: 2025-03-20 22:18:57.519 [INFO][4421] cni-plugin/k8s.go 386: Populated endpoint ContainerID="5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vp28l" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--vp28l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--vp28l-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"8258f556-4ff6-4fd9-a747-02759833a112", ResourceVersion:"735", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 18, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-b-c50fddf147.novalocal", ContainerID:"", Pod:"coredns-7db6d8ff4d-vp28l", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.50.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4338c116557", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:18:57.634389 containerd[1475]: 2025-03-20 22:18:57.520 [INFO][4421] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.50.70/32] ContainerID="5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vp28l" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--vp28l-eth0"
Mar 20 22:18:57.634389 containerd[1475]: 2025-03-20 22:18:57.520 [INFO][4421] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4338c116557 ContainerID="5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vp28l" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--vp28l-eth0"
Mar 20 22:18:57.634389 containerd[1475]: 2025-03-20 22:18:57.524 [INFO][4421] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vp28l" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--vp28l-eth0"
Mar 20 22:18:57.634389 containerd[1475]: 2025-03-20 22:18:57.524 [INFO][4421] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vp28l" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--vp28l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--vp28l-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"8258f556-4ff6-4fd9-a747-02759833a112", ResourceVersion:"735", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 18, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-b-c50fddf147.novalocal", ContainerID:"5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff", Pod:"coredns-7db6d8ff4d-vp28l", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.50.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4338c116557", MAC:"6a:c9:ab:6b:93:db", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:18:57.634389 containerd[1475]: 2025-03-20 22:18:57.613 [INFO][4421] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vp28l" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-coredns--7db6d8ff4d--vp28l-eth0"
Mar 20 22:18:57.855874 systemd-networkd[1384]: calib92d99c7931: Gained IPv6LL
Mar 20 22:18:58.128361 kubelet[2812]: I0320 22:18:58.126402 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-sg9pv" podStartSLOduration=41.126369707 podStartE2EDuration="41.126369707s" podCreationTimestamp="2025-03-20 22:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 22:18:58.125952794 +0000 UTC m=+55.183134921" watchObservedRunningTime="2025-03-20 22:18:58.126369707 +0000 UTC m=+55.183551834"
Mar 20 22:18:58.331698 containerd[1475]: time="2025-03-20T22:18:58.330908855Z" level=info msg="connecting to shim e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" address="unix:///run/containerd/s/2269505c7237145fb5c37971c4ceebc472adc4558a2861a1081e0e7f770ee395" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:18:58.345723 containerd[1475]: time="2025-03-20T22:18:58.345667713Z" level=info msg="connecting to shim 5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff" address="unix:///run/containerd/s/9cac98e5a6e9e56482d6fa0d0e444a683ece64fe689bea8b31b9b1cb5dded736" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:18:58.366141 systemd-networkd[1384]: calia5bb964c134: Gained IPv6LL
Mar 20 22:18:58.387153 systemd[1]: Started cri-containerd-5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff.scope - libcontainer container 5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff.
Mar 20 22:18:58.388830 systemd[1]: Started cri-containerd-e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596.scope - libcontainer container e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596.
Mar 20 22:18:58.474062 containerd[1475]: time="2025-03-20T22:18:58.474024238Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-vp28l,Uid:8258f556-4ff6-4fd9-a747-02759833a112,Namespace:kube-system,Attempt:0,} returns sandbox id \"5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff\""
Mar 20 22:18:58.485518 containerd[1475]: time="2025-03-20T22:18:58.484920951Z" level=info msg="CreateContainer within sandbox \"5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Mar 20 22:18:58.498851 containerd[1475]: time="2025-03-20T22:18:58.498803576Z" level=info msg="Container f426c140f2d5048dae69f0e0e6e7f239a4213dc61650c070c5091712056a5dea: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:18:58.506052 containerd[1475]: time="2025-03-20T22:18:58.505969560Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59b6b788d9-v7kpt,Uid:fee3832f-4ccf-429d-9c96-987d3f4ad55c,Namespace:calico-system,Attempt:0,} returns sandbox id \"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596\""
Mar 20 22:18:58.516035 containerd[1475]: time="2025-03-20T22:18:58.515994436Z" level=info msg="CreateContainer within sandbox \"5743a9f276f1dd20ea6a13bcaa21c1212ce378a8c65dfa110956be3f508013ff\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f426c140f2d5048dae69f0e0e6e7f239a4213dc61650c070c5091712056a5dea\""
Mar 20 22:18:58.517292 containerd[1475]: time="2025-03-20T22:18:58.517264169Z" level=info msg="StartContainer for \"f426c140f2d5048dae69f0e0e6e7f239a4213dc61650c070c5091712056a5dea\""
Mar 20 22:18:58.520202 containerd[1475]: time="2025-03-20T22:18:58.519880176Z" level=info msg="connecting to shim f426c140f2d5048dae69f0e0e6e7f239a4213dc61650c070c5091712056a5dea" address="unix:///run/containerd/s/9cac98e5a6e9e56482d6fa0d0e444a683ece64fe689bea8b31b9b1cb5dded736" protocol=ttrpc version=3
Mar 20 22:18:58.540902 systemd[1]: Started cri-containerd-f426c140f2d5048dae69f0e0e6e7f239a4213dc61650c070c5091712056a5dea.scope - libcontainer container f426c140f2d5048dae69f0e0e6e7f239a4213dc61650c070c5091712056a5dea.
Mar 20 22:18:58.589599 containerd[1475]: time="2025-03-20T22:18:58.589219781Z" level=info msg="StartContainer for \"f426c140f2d5048dae69f0e0e6e7f239a4213dc61650c070c5091712056a5dea\" returns successfully"
Mar 20 22:18:58.615009 containerd[1475]: time="2025-03-20T22:18:58.614950314Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:18:58.617305 containerd[1475]: time="2025-03-20T22:18:58.617256350Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7909887"
Mar 20 22:18:58.622497 containerd[1475]: time="2025-03-20T22:18:58.617650339Z" level=info msg="ImageCreate event name:\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:18:58.622497 containerd[1475]: time="2025-03-20T22:18:58.621487769Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:18:58.622497 containerd[1475]: time="2025-03-20T22:18:58.622163556Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"9402991\" in 2.852098315s"
Mar 20 22:18:58.622497 containerd[1475]: time="2025-03-20T22:18:58.622189846Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\""
Mar 20 22:18:58.624906 containerd[1475]: time="2025-03-20T22:18:58.624873371Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\""
Mar 20 22:18:58.626306 containerd[1475]: time="2025-03-20T22:18:58.626257487Z" level=info msg="CreateContainer within sandbox \"605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Mar 20 22:18:58.639757 containerd[1475]: time="2025-03-20T22:18:58.639307891Z" level=info msg="Container edb6aa578885f058f61a48917a1a63e6ddc2ba267b1df98e4e4cd956c65ad3bf: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:18:58.654308 containerd[1475]: time="2025-03-20T22:18:58.654193527Z" level=info msg="CreateContainer within sandbox \"605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"edb6aa578885f058f61a48917a1a63e6ddc2ba267b1df98e4e4cd956c65ad3bf\""
Mar 20 22:18:58.654920 containerd[1475]: time="2025-03-20T22:18:58.654885666Z" level=info msg="StartContainer for \"edb6aa578885f058f61a48917a1a63e6ddc2ba267b1df98e4e4cd956c65ad3bf\""
Mar 20 22:18:58.656526 containerd[1475]: time="2025-03-20T22:18:58.656491058Z" level=info msg="connecting to shim edb6aa578885f058f61a48917a1a63e6ddc2ba267b1df98e4e4cd956c65ad3bf" address="unix:///run/containerd/s/191e1846d104b4f226d7971ea37440b3bac5d3053758eb03f7600ec1ef07da65" protocol=ttrpc version=3
Mar 20 22:18:58.681611 systemd[1]: Started cri-containerd-edb6aa578885f058f61a48917a1a63e6ddc2ba267b1df98e4e4cd956c65ad3bf.scope - libcontainer container edb6aa578885f058f61a48917a1a63e6ddc2ba267b1df98e4e4cd956c65ad3bf.
Mar 20 22:18:58.737415 containerd[1475]: time="2025-03-20T22:18:58.737070750Z" level=info msg="StartContainer for \"edb6aa578885f058f61a48917a1a63e6ddc2ba267b1df98e4e4cd956c65ad3bf\" returns successfully"
Mar 20 22:18:58.754931 kubelet[2812]: I0320 22:18:58.754881 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-vp28l" podStartSLOduration=41.75486331 podStartE2EDuration="41.75486331s" podCreationTimestamp="2025-03-20 22:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 22:18:58.734691296 +0000 UTC m=+55.791873403" watchObservedRunningTime="2025-03-20 22:18:58.75486331 +0000 UTC m=+55.812045397"
Mar 20 22:18:59.580819 systemd-networkd[1384]: cali4338c116557: Gained IPv6LL
Mar 20 22:19:03.354262 containerd[1475]: time="2025-03-20T22:19:03.354149168Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:19:03.355859 containerd[1475]: time="2025-03-20T22:19:03.355723021Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=42993204"
Mar 20 22:19:03.357711 containerd[1475]: time="2025-03-20T22:19:03.357150107Z" level=info msg="ImageCreate event name:\"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:19:03.360401 containerd[1475]: time="2025-03-20T22:19:03.360284136Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:19:03.361120 containerd[1475]: time="2025-03-20T22:19:03.360701129Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 4.735791971s"
Mar 20 22:19:03.361120 containerd[1475]: time="2025-03-20T22:19:03.360747747Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\""
Mar 20 22:19:03.366560 containerd[1475]: time="2025-03-20T22:19:03.365464194Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\""
Mar 20 22:19:03.368695 containerd[1475]: time="2025-03-20T22:19:03.368591440Z" level=info msg="CreateContainer within sandbox \"b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 20 22:19:03.387531 containerd[1475]: time="2025-03-20T22:19:03.387388301Z" level=info msg="Container 8d3be2afbb2470ff03dcdc20ef04da7dc8467f4861e7c7552b16682040c233b8: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:19:03.407847 containerd[1475]: time="2025-03-20T22:19:03.407785835Z" level=info msg="CreateContainer within sandbox \"b524433ab74183ef0e48b8a919630d61669324cac967ae54cb02b4bc1b12b038\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"8d3be2afbb2470ff03dcdc20ef04da7dc8467f4861e7c7552b16682040c233b8\""
Mar 20 22:19:03.409247 containerd[1475]: time="2025-03-20T22:19:03.409205097Z" level=info msg="StartContainer for \"8d3be2afbb2470ff03dcdc20ef04da7dc8467f4861e7c7552b16682040c233b8\""
Mar 20 22:19:03.411797 containerd[1475]: time="2025-03-20T22:19:03.411747567Z" level=info msg="connecting to shim 8d3be2afbb2470ff03dcdc20ef04da7dc8467f4861e7c7552b16682040c233b8" address="unix:///run/containerd/s/6b51f6692ac6b7c0ad0d44e25ef559618e57adca98937c5f09c5987a55909202" protocol=ttrpc version=3
Mar 20 22:19:03.455622 systemd[1]: Started cri-containerd-8d3be2afbb2470ff03dcdc20ef04da7dc8467f4861e7c7552b16682040c233b8.scope - libcontainer container 8d3be2afbb2470ff03dcdc20ef04da7dc8467f4861e7c7552b16682040c233b8.
Mar 20 22:19:03.616704 containerd[1475]: time="2025-03-20T22:19:03.616608855Z" level=info msg="StartContainer for \"8d3be2afbb2470ff03dcdc20ef04da7dc8467f4861e7c7552b16682040c233b8\" returns successfully"
Mar 20 22:19:03.841528 containerd[1475]: time="2025-03-20T22:19:03.841357590Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:19:03.843808 containerd[1475]: time="2025-03-20T22:19:03.843755258Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77"
Mar 20 22:19:03.846666 containerd[1475]: time="2025-03-20T22:19:03.846408326Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 480.225214ms"
Mar 20 22:19:03.846666 containerd[1475]: time="2025-03-20T22:19:03.846470132Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\""
Mar 20 22:19:03.847736 containerd[1475]: time="2025-03-20T22:19:03.847668982Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\""
Mar 20 22:19:03.849763 containerd[1475]: time="2025-03-20T22:19:03.849732784Z" level=info msg="CreateContainer within sandbox \"52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 20 22:19:03.868309 containerd[1475]: time="2025-03-20T22:19:03.867964772Z" level=info msg="Container 0b0bed51ba369528a6108303c9c2853260e0b9219838cb34a9f379998c65d45a: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:19:03.886983 containerd[1475]: time="2025-03-20T22:19:03.886941699Z" level=info msg="CreateContainer within sandbox \"52d4b084b840e1b073a991f344e00dd365b03efa64c977ff9f3ac3fadddb6360\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"0b0bed51ba369528a6108303c9c2853260e0b9219838cb34a9f379998c65d45a\""
Mar 20 22:19:03.888988 containerd[1475]: time="2025-03-20T22:19:03.888873093Z" level=info msg="StartContainer for \"0b0bed51ba369528a6108303c9c2853260e0b9219838cb34a9f379998c65d45a\""
Mar 20 22:19:03.892375 containerd[1475]: time="2025-03-20T22:19:03.892332773Z" level=info msg="connecting to shim 0b0bed51ba369528a6108303c9c2853260e0b9219838cb34a9f379998c65d45a" address="unix:///run/containerd/s/efde4775582442544b2ece4cbc4a819b88406f32b39e8f619731f213dd78225a" protocol=ttrpc version=3
Mar 20 22:19:03.923828 systemd[1]: Started cri-containerd-0b0bed51ba369528a6108303c9c2853260e0b9219838cb34a9f379998c65d45a.scope - libcontainer container 0b0bed51ba369528a6108303c9c2853260e0b9219838cb34a9f379998c65d45a.
Mar 20 22:19:03.990124 containerd[1475]: time="2025-03-20T22:19:03.990079996Z" level=info msg="StartContainer for \"0b0bed51ba369528a6108303c9c2853260e0b9219838cb34a9f379998c65d45a\" returns successfully"
Mar 20 22:19:04.745350 kubelet[2812]: I0320 22:19:04.742787 2812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 22:19:04.857736 kubelet[2812]: I0320 22:19:04.856648 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-cd595b98d-7cqjt" podStartSLOduration=28.299577124 podStartE2EDuration="35.856630314s" podCreationTimestamp="2025-03-20 22:18:29 +0000 UTC" firstStartedPulling="2025-03-20 22:18:55.807612676 +0000 UTC m=+52.864794753" lastFinishedPulling="2025-03-20 22:19:03.364665866 +0000 UTC m=+60.421847943" observedRunningTime="2025-03-20 22:19:03.749909967 +0000 UTC m=+60.807092044" watchObservedRunningTime="2025-03-20 22:19:04.856630314 +0000 UTC m=+61.913812391"
Mar 20 22:19:05.651676 kubelet[2812]: I0320 22:19:05.651598 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-cd595b98d-spv2p" podStartSLOduration=28.623757642 podStartE2EDuration="36.651554926s" podCreationTimestamp="2025-03-20 22:18:29 +0000 UTC" firstStartedPulling="2025-03-20 22:18:55.819549933 +0000 UTC m=+52.876732021" lastFinishedPulling="2025-03-20 22:19:03.847347218 +0000 UTC m=+60.904529305" observedRunningTime="2025-03-20 22:19:04.857449293 +0000 UTC m=+61.914631420" watchObservedRunningTime="2025-03-20 22:19:05.651554926 +0000 UTC m=+62.708737013"
Mar 20 22:19:07.687953 kubelet[2812]: I0320 22:19:07.687869 2812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 22:19:09.069036 containerd[1475]: time="2025-03-20T22:19:09.068976231Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:19:09.070357 containerd[1475]: time="2025-03-20T22:19:09.070294748Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=34792912"
Mar 20 22:19:09.072313 containerd[1475]: time="2025-03-20T22:19:09.072260241Z" level=info msg="ImageCreate event name:\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:19:09.076234 containerd[1475]: time="2025-03-20T22:19:09.075956675Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:19:09.076988 containerd[1475]: time="2025-03-20T22:19:09.076439502Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"36285984\" in 5.228730355s"
Mar 20 22:19:09.076988 containerd[1475]: time="2025-03-20T22:19:09.076499514Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\""
Mar 20 22:19:09.077418 containerd[1475]: time="2025-03-20T22:19:09.077380620Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\""
Mar 20 22:19:09.084499 containerd[1475]: time="2025-03-20T22:19:09.084437897Z" level=info msg="CreateContainer within sandbox \"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Mar 20 22:19:09.105672 containerd[1475]: time="2025-03-20T22:19:09.103851025Z" level=info msg="Container 6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:19:09.110929 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3845649419.mount: Deactivated successfully.
Mar 20 22:19:09.132292 containerd[1475]: time="2025-03-20T22:19:09.132186825Z" level=info msg="CreateContainer within sandbox \"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\""
Mar 20 22:19:09.133134 containerd[1475]: time="2025-03-20T22:19:09.133116792Z" level=info msg="StartContainer for \"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\""
Mar 20 22:19:09.134808 containerd[1475]: time="2025-03-20T22:19:09.134744930Z" level=info msg="connecting to shim 6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38" address="unix:///run/containerd/s/2269505c7237145fb5c37971c4ceebc472adc4558a2861a1081e0e7f770ee395" protocol=ttrpc version=3
Mar 20 22:19:09.167650 systemd[1]: Started cri-containerd-6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38.scope - libcontainer container 6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38.
Mar 20 22:19:09.249255 containerd[1475]: time="2025-03-20T22:19:09.249098067Z" level=info msg="StartContainer for \"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\" returns successfully"
Mar 20 22:19:09.799065 kubelet[2812]: I0320 22:19:09.798694 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-59b6b788d9-v7kpt" podStartSLOduration=31.230582179 podStartE2EDuration="41.798653834s" podCreationTimestamp="2025-03-20 22:18:28 +0000 UTC" firstStartedPulling="2025-03-20 22:18:58.509132192 +0000 UTC m=+55.566314269" lastFinishedPulling="2025-03-20 22:19:09.077203847 +0000 UTC m=+66.134385924" observedRunningTime="2025-03-20 22:19:09.796344096 +0000 UTC m=+66.853526223" watchObservedRunningTime="2025-03-20 22:19:09.798653834 +0000 UTC m=+66.855835962"
Mar 20 22:19:09.862323 containerd[1475]: time="2025-03-20T22:19:09.862074482Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\" id:\"0f544a0a49091d41f9c9a8ac7e5727c6cd6794dfa378ef4ae841215837a7f808\" pid:4807 exited_at:{seconds:1742509149 nanos:860020324}"
Mar 20 22:19:11.345925 containerd[1475]: time="2025-03-20T22:19:11.345863221Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:19:11.347544 containerd[1475]: time="2025-03-20T22:19:11.347497550Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13986843"
Mar 20 22:19:11.349650 containerd[1475]: time="2025-03-20T22:19:11.349612263Z" level=info msg="ImageCreate event name:\"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:19:11.352353 containerd[1475]: time="2025-03-20T22:19:11.352296795Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:19:11.353000 containerd[1475]: time="2025-03-20T22:19:11.352969518Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"15479899\" in 2.275437054s"
Mar 20 22:19:11.353050 containerd[1475]: time="2025-03-20T22:19:11.353000577Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\""
Mar 20 22:19:11.357511 containerd[1475]: time="2025-03-20T22:19:11.357334358Z" level=info msg="CreateContainer within sandbox \"605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Mar 20 22:19:11.368760 containerd[1475]: time="2025-03-20T22:19:11.368721768Z" level=info msg="Container f5828f7d3aad3e32e77935dd497a7dad564c264768da8380096a55a33e6abfca: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:19:11.377148 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2402805329.mount: Deactivated successfully.
Mar 20 22:19:11.389811 containerd[1475]: time="2025-03-20T22:19:11.389745798Z" level=info msg="CreateContainer within sandbox \"605d4793b160704011cf1971270b45d7d63a88ca5d3d17eaf2ca870fe20fd49a\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"f5828f7d3aad3e32e77935dd497a7dad564c264768da8380096a55a33e6abfca\"" Mar 20 22:19:11.390444 containerd[1475]: time="2025-03-20T22:19:11.390385851Z" level=info msg="StartContainer for \"f5828f7d3aad3e32e77935dd497a7dad564c264768da8380096a55a33e6abfca\"" Mar 20 22:19:11.393187 containerd[1475]: time="2025-03-20T22:19:11.392960156Z" level=info msg="connecting to shim f5828f7d3aad3e32e77935dd497a7dad564c264768da8380096a55a33e6abfca" address="unix:///run/containerd/s/191e1846d104b4f226d7971ea37440b3bac5d3053758eb03f7600ec1ef07da65" protocol=ttrpc version=3 Mar 20 22:19:11.420726 systemd[1]: Started cri-containerd-f5828f7d3aad3e32e77935dd497a7dad564c264768da8380096a55a33e6abfca.scope - libcontainer container f5828f7d3aad3e32e77935dd497a7dad564c264768da8380096a55a33e6abfca. 
Mar 20 22:19:11.480586 containerd[1475]: time="2025-03-20T22:19:11.480309551Z" level=info msg="StartContainer for \"f5828f7d3aad3e32e77935dd497a7dad564c264768da8380096a55a33e6abfca\" returns successfully" Mar 20 22:19:11.831745 kubelet[2812]: I0320 22:19:11.830409 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-47bzn" podStartSLOduration=28.243881788 podStartE2EDuration="43.830372678s" podCreationTimestamp="2025-03-20 22:18:28 +0000 UTC" firstStartedPulling="2025-03-20 22:18:55.767954701 +0000 UTC m=+52.825136778" lastFinishedPulling="2025-03-20 22:19:11.354445591 +0000 UTC m=+68.411627668" observedRunningTime="2025-03-20 22:19:11.827523737 +0000 UTC m=+68.884705824" watchObservedRunningTime="2025-03-20 22:19:11.830372678 +0000 UTC m=+68.887554816" Mar 20 22:19:12.192263 kubelet[2812]: I0320 22:19:12.192044 2812 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 20 22:19:12.192263 kubelet[2812]: I0320 22:19:12.192104 2812 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 20 22:19:13.734257 containerd[1475]: time="2025-03-20T22:19:13.734202265Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\" id:\"18f864cd68e6d86770d9461c282d211918c8593be6b396a92aa3ee66ee2cc171\" pid:4867 exited_at:{seconds:1742509153 nanos:733900979}" Mar 20 22:19:19.500315 containerd[1475]: time="2025-03-20T22:19:19.500210284Z" level=info msg="TaskExit event in podsandbox handler container_id:\"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\" id:\"81cdeca9a20ff697796f91d0bfe7bb1ef3164a594426daaed91c168331479397\" pid:4900 exited_at:{seconds:1742509159 nanos:499575411}" Mar 20 22:19:20.284384 containerd[1475]: 
time="2025-03-20T22:19:20.284345456Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\" id:\"20d43c7bc91c7c52eba2e552c9f57a1d5cc72a81bafb355a4eb714262ddf5e34\" pid:4926 exited_at:{seconds:1742509160 nanos:283929084}" Mar 20 22:19:24.725109 containerd[1475]: time="2025-03-20T22:19:24.724763572Z" level=info msg="StopContainer for \"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\" with timeout 300 (s)" Mar 20 22:19:24.725493 containerd[1475]: time="2025-03-20T22:19:24.725173821Z" level=info msg="Stop container \"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\" with signal terminated" Mar 20 22:19:24.931740 containerd[1475]: time="2025-03-20T22:19:24.931498262Z" level=info msg="StopContainer for \"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\" with timeout 30 (s)" Mar 20 22:19:24.932237 containerd[1475]: time="2025-03-20T22:19:24.931940853Z" level=info msg="Stop container \"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\" with signal terminated" Mar 20 22:19:24.951707 systemd[1]: cri-containerd-6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38.scope: Deactivated successfully. 
Mar 20 22:19:24.956223 containerd[1475]: time="2025-03-20T22:19:24.956173100Z" level=info msg="received exit event container_id:\"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\" id:\"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\" pid:4775 exit_status:2 exited_at:{seconds:1742509164 nanos:954203862}" Mar 20 22:19:24.956509 containerd[1475]: time="2025-03-20T22:19:24.956456252Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\" id:\"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\" pid:4775 exit_status:2 exited_at:{seconds:1742509164 nanos:954203862}" Mar 20 22:19:24.999300 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38-rootfs.mount: Deactivated successfully. Mar 20 22:19:25.114146 containerd[1475]: time="2025-03-20T22:19:25.114051546Z" level=info msg="StopContainer for \"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\" returns successfully" Mar 20 22:19:25.118017 containerd[1475]: time="2025-03-20T22:19:25.117345129Z" level=info msg="StopPodSandbox for \"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596\"" Mar 20 22:19:25.120367 containerd[1475]: time="2025-03-20T22:19:25.117790035Z" level=info msg="Container to stop \"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 20 22:19:25.145580 systemd[1]: cri-containerd-e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596.scope: Deactivated successfully. 
Mar 20 22:19:25.156537 containerd[1475]: time="2025-03-20T22:19:25.156381008Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596\" id:\"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596\" pid:4567 exit_status:137 exited_at:{seconds:1742509165 nanos:155327340}" Mar 20 22:19:25.162008 containerd[1475]: time="2025-03-20T22:19:25.161945736Z" level=info msg="StopContainer for \"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\" with timeout 5 (s)" Mar 20 22:19:25.162247 containerd[1475]: time="2025-03-20T22:19:25.162215863Z" level=info msg="Stop container \"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\" with signal terminated" Mar 20 22:19:25.205573 containerd[1475]: time="2025-03-20T22:19:25.201457619Z" level=info msg="shim disconnected" id=e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596 namespace=k8s.io Mar 20 22:19:25.205573 containerd[1475]: time="2025-03-20T22:19:25.202518310Z" level=warning msg="cleaning up after shim disconnected" id=e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596 namespace=k8s.io Mar 20 22:19:25.205573 containerd[1475]: time="2025-03-20T22:19:25.202793287Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 20 22:19:25.207678 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596-rootfs.mount: Deactivated successfully. Mar 20 22:19:25.215066 systemd[1]: cri-containerd-47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21.scope: Deactivated successfully. Mar 20 22:19:25.215640 systemd[1]: cri-containerd-47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21.scope: Consumed 2.189s CPU time, 165M memory peak, 652K written to disk. 
Mar 20 22:19:25.219191 containerd[1475]: time="2025-03-20T22:19:25.219153152Z" level=info msg="received exit event container_id:\"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\" id:\"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\" pid:3795 exited_at:{seconds:1742509165 nanos:218957204}" Mar 20 22:19:25.239740 containerd[1475]: time="2025-03-20T22:19:25.239683086Z" level=info msg="TaskExit event in podsandbox handler container_id:\"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\" id:\"bfd1080b429001979bca94be015be10c2033d76adf8240fa780f1bf0d2929036\" pid:4956 exited_at:{seconds:1742509165 nanos:159595854}" Mar 20 22:19:25.239740 containerd[1475]: time="2025-03-20T22:19:25.239741747Z" level=info msg="TaskExit event in podsandbox handler container_id:\"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\" id:\"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\" pid:3795 exited_at:{seconds:1742509165 nanos:218957204}" Mar 20 22:19:25.245422 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596-shm.mount: Deactivated successfully. Mar 20 22:19:25.248984 containerd[1475]: time="2025-03-20T22:19:25.248662656Z" level=info msg="received exit event sandbox_id:\"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596\" exit_status:137 exited_at:{seconds:1742509165 nanos:155327340}" Mar 20 22:19:25.262507 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21-rootfs.mount: Deactivated successfully. 
Mar 20 22:19:25.291827 containerd[1475]: time="2025-03-20T22:19:25.291789224Z" level=info msg="StopContainer for \"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\" returns successfully" Mar 20 22:19:25.292705 containerd[1475]: time="2025-03-20T22:19:25.292590189Z" level=info msg="StopPodSandbox for \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\"" Mar 20 22:19:25.292705 containerd[1475]: time="2025-03-20T22:19:25.292647476Z" level=info msg="Container to stop \"a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 20 22:19:25.292705 containerd[1475]: time="2025-03-20T22:19:25.292660661Z" level=info msg="Container to stop \"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 20 22:19:25.292705 containerd[1475]: time="2025-03-20T22:19:25.292674016Z" level=info msg="Container to stop \"cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 20 22:19:25.309963 systemd[1]: cri-containerd-86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080.scope: Deactivated successfully. 
Mar 20 22:19:25.312176 containerd[1475]: time="2025-03-20T22:19:25.312093455Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\" id:\"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\" pid:3355 exit_status:137 exited_at:{seconds:1742509165 nanos:311503177}" Mar 20 22:19:25.352296 systemd-networkd[1384]: calia5bb964c134: Link DOWN Mar 20 22:19:25.352303 systemd-networkd[1384]: calia5bb964c134: Lost carrier Mar 20 22:19:25.358139 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080-rootfs.mount: Deactivated successfully. Mar 20 22:19:25.360577 containerd[1475]: time="2025-03-20T22:19:25.359728810Z" level=info msg="shim disconnected" id=86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080 namespace=k8s.io Mar 20 22:19:25.360577 containerd[1475]: time="2025-03-20T22:19:25.359763054Z" level=warning msg="cleaning up after shim disconnected" id=86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080 namespace=k8s.io Mar 20 22:19:25.360577 containerd[1475]: time="2025-03-20T22:19:25.359771951Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 20 22:19:25.387223 containerd[1475]: time="2025-03-20T22:19:25.387044765Z" level=info msg="received exit event sandbox_id:\"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\" exit_status:137 exited_at:{seconds:1742509165 nanos:311503177}" Mar 20 22:19:25.387389 containerd[1475]: time="2025-03-20T22:19:25.387355358Z" level=info msg="TearDown network for sandbox \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\" successfully" Mar 20 22:19:25.387423 containerd[1475]: time="2025-03-20T22:19:25.387387649Z" level=info msg="StopPodSandbox for \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\" returns successfully" Mar 20 22:19:25.460052 kubelet[2812]: I0320 22:19:25.460006 2812 
topology_manager.go:215] "Topology Admit Handler" podUID="84cbef51-fb8f-4daa-9532-29ea81ff6b9c" podNamespace="calico-system" podName="calico-node-tpslj" Mar 20 22:19:25.460414 kubelet[2812]: E0320 22:19:25.460071 2812 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="6642f59a-b8ee-4d5e-99d5-00c40add0c6d" containerName="flexvol-driver" Mar 20 22:19:25.460414 kubelet[2812]: E0320 22:19:25.460083 2812 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="6642f59a-b8ee-4d5e-99d5-00c40add0c6d" containerName="calico-node" Mar 20 22:19:25.460414 kubelet[2812]: E0320 22:19:25.460091 2812 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="6642f59a-b8ee-4d5e-99d5-00c40add0c6d" containerName="install-cni" Mar 20 22:19:25.460414 kubelet[2812]: I0320 22:19:25.460119 2812 memory_manager.go:354] "RemoveStaleState removing state" podUID="6642f59a-b8ee-4d5e-99d5-00c40add0c6d" containerName="calico-node" Mar 20 22:19:25.480938 systemd[1]: Created slice kubepods-besteffort-pod84cbef51_fb8f_4daa_9532_29ea81ff6b9c.slice - libcontainer container kubepods-besteffort-pod84cbef51_fb8f_4daa_9532_29ea81ff6b9c.slice. 
Mar 20 22:19:25.500890 kubelet[2812]: I0320 22:19:25.499598 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-flexvol-driver-host\") pod \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " Mar 20 22:19:25.500890 kubelet[2812]: I0320 22:19:25.499641 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-lib-modules\") pod \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " Mar 20 22:19:25.500890 kubelet[2812]: I0320 22:19:25.499661 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-xtables-lock\") pod \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " Mar 20 22:19:25.500890 kubelet[2812]: I0320 22:19:25.499687 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpw22\" (UniqueName: \"kubernetes.io/projected/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-kube-api-access-xpw22\") pod \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " Mar 20 22:19:25.500890 kubelet[2812]: I0320 22:19:25.499705 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-cni-net-dir\") pod \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " Mar 20 22:19:25.500890 kubelet[2812]: I0320 22:19:25.499724 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: 
\"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-var-lib-calico\") pod \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " Mar 20 22:19:25.501154 kubelet[2812]: I0320 22:19:25.499744 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-tigera-ca-bundle\") pod \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " Mar 20 22:19:25.501154 kubelet[2812]: I0320 22:19:25.499724 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "6642f59a-b8ee-4d5e-99d5-00c40add0c6d" (UID: "6642f59a-b8ee-4d5e-99d5-00c40add0c6d"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 22:19:25.501154 kubelet[2812]: I0320 22:19:25.499760 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-cni-bin-dir\") pod \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " Mar 20 22:19:25.501154 kubelet[2812]: I0320 22:19:25.499802 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "6642f59a-b8ee-4d5e-99d5-00c40add0c6d" (UID: "6642f59a-b8ee-4d5e-99d5-00c40add0c6d"). InnerVolumeSpecName "cni-bin-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 22:19:25.501154 kubelet[2812]: I0320 22:19:25.499830 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "6642f59a-b8ee-4d5e-99d5-00c40add0c6d" (UID: "6642f59a-b8ee-4d5e-99d5-00c40add0c6d"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 22:19:25.501293 kubelet[2812]: I0320 22:19:25.499849 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "6642f59a-b8ee-4d5e-99d5-00c40add0c6d" (UID: "6642f59a-b8ee-4d5e-99d5-00c40add0c6d"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 22:19:25.501293 kubelet[2812]: I0320 22:19:25.499851 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-node-certs\") pod \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " Mar 20 22:19:25.501293 kubelet[2812]: I0320 22:19:25.499877 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-var-run-calico\") pod \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " Mar 20 22:19:25.501293 kubelet[2812]: I0320 22:19:25.499895 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-policysync\") pod \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " Mar 20 22:19:25.501293 kubelet[2812]: I0320 
22:19:25.499932 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-cni-log-dir\") pod \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\" (UID: \"6642f59a-b8ee-4d5e-99d5-00c40add0c6d\") " Mar 20 22:19:25.501293 kubelet[2812]: I0320 22:19:25.500015 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/84cbef51-fb8f-4daa-9532-29ea81ff6b9c-policysync\") pod \"calico-node-tpslj\" (UID: \"84cbef51-fb8f-4daa-9532-29ea81ff6b9c\") " pod="calico-system/calico-node-tpslj" Mar 20 22:19:25.501453 kubelet[2812]: I0320 22:19:25.500041 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/84cbef51-fb8f-4daa-9532-29ea81ff6b9c-flexvol-driver-host\") pod \"calico-node-tpslj\" (UID: \"84cbef51-fb8f-4daa-9532-29ea81ff6b9c\") " pod="calico-system/calico-node-tpslj" Mar 20 22:19:25.501453 kubelet[2812]: I0320 22:19:25.500065 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84cbef51-fb8f-4daa-9532-29ea81ff6b9c-tigera-ca-bundle\") pod \"calico-node-tpslj\" (UID: \"84cbef51-fb8f-4daa-9532-29ea81ff6b9c\") " pod="calico-system/calico-node-tpslj" Mar 20 22:19:25.501453 kubelet[2812]: I0320 22:19:25.500115 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/84cbef51-fb8f-4daa-9532-29ea81ff6b9c-cni-bin-dir\") pod \"calico-node-tpslj\" (UID: \"84cbef51-fb8f-4daa-9532-29ea81ff6b9c\") " pod="calico-system/calico-node-tpslj" Mar 20 22:19:25.501453 kubelet[2812]: I0320 22:19:25.500149 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/84cbef51-fb8f-4daa-9532-29ea81ff6b9c-cni-net-dir\") pod \"calico-node-tpslj\" (UID: \"84cbef51-fb8f-4daa-9532-29ea81ff6b9c\") " pod="calico-system/calico-node-tpslj" Mar 20 22:19:25.501453 kubelet[2812]: I0320 22:19:25.500210 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vbt4\" (UniqueName: \"kubernetes.io/projected/84cbef51-fb8f-4daa-9532-29ea81ff6b9c-kube-api-access-8vbt4\") pod \"calico-node-tpslj\" (UID: \"84cbef51-fb8f-4daa-9532-29ea81ff6b9c\") " pod="calico-system/calico-node-tpslj" Mar 20 22:19:25.501611 kubelet[2812]: I0320 22:19:25.500236 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/84cbef51-fb8f-4daa-9532-29ea81ff6b9c-xtables-lock\") pod \"calico-node-tpslj\" (UID: \"84cbef51-fb8f-4daa-9532-29ea81ff6b9c\") " pod="calico-system/calico-node-tpslj" Mar 20 22:19:25.501611 kubelet[2812]: I0320 22:19:25.500278 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/84cbef51-fb8f-4daa-9532-29ea81ff6b9c-node-certs\") pod \"calico-node-tpslj\" (UID: \"84cbef51-fb8f-4daa-9532-29ea81ff6b9c\") " pod="calico-system/calico-node-tpslj" Mar 20 22:19:25.501611 kubelet[2812]: I0320 22:19:25.500304 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/84cbef51-fb8f-4daa-9532-29ea81ff6b9c-cni-log-dir\") pod \"calico-node-tpslj\" (UID: \"84cbef51-fb8f-4daa-9532-29ea81ff6b9c\") " pod="calico-system/calico-node-tpslj" Mar 20 22:19:25.501611 kubelet[2812]: I0320 22:19:25.500325 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: 
\"kubernetes.io/host-path/84cbef51-fb8f-4daa-9532-29ea81ff6b9c-var-lib-calico\") pod \"calico-node-tpslj\" (UID: \"84cbef51-fb8f-4daa-9532-29ea81ff6b9c\") " pod="calico-system/calico-node-tpslj" Mar 20 22:19:25.501611 kubelet[2812]: I0320 22:19:25.500363 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/84cbef51-fb8f-4daa-9532-29ea81ff6b9c-lib-modules\") pod \"calico-node-tpslj\" (UID: \"84cbef51-fb8f-4daa-9532-29ea81ff6b9c\") " pod="calico-system/calico-node-tpslj" Mar 20 22:19:25.501750 kubelet[2812]: I0320 22:19:25.500385 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/84cbef51-fb8f-4daa-9532-29ea81ff6b9c-var-run-calico\") pod \"calico-node-tpslj\" (UID: \"84cbef51-fb8f-4daa-9532-29ea81ff6b9c\") " pod="calico-system/calico-node-tpslj" Mar 20 22:19:25.501750 kubelet[2812]: I0320 22:19:25.500413 2812 reconciler_common.go:289] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-flexvol-driver-host\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:25.501750 kubelet[2812]: I0320 22:19:25.500443 2812 reconciler_common.go:289] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-lib-modules\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:25.501750 kubelet[2812]: I0320 22:19:25.500457 2812 reconciler_common.go:289] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-xtables-lock\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:25.501750 kubelet[2812]: I0320 22:19:25.501220 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "6642f59a-b8ee-4d5e-99d5-00c40add0c6d" (UID: "6642f59a-b8ee-4d5e-99d5-00c40add0c6d"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 22:19:25.501750 kubelet[2812]: I0320 22:19:25.501248 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-policysync" (OuterVolumeSpecName: "policysync") pod "6642f59a-b8ee-4d5e-99d5-00c40add0c6d" (UID: "6642f59a-b8ee-4d5e-99d5-00c40add0c6d"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 22:19:25.501967 kubelet[2812]: I0320 22:19:25.501510 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "6642f59a-b8ee-4d5e-99d5-00c40add0c6d" (UID: "6642f59a-b8ee-4d5e-99d5-00c40add0c6d"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 22:19:25.503204 kubelet[2812]: I0320 22:19:25.502751 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "6642f59a-b8ee-4d5e-99d5-00c40add0c6d" (UID: "6642f59a-b8ee-4d5e-99d5-00c40add0c6d"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 22:19:25.503204 kubelet[2812]: I0320 22:19:25.502793 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "6642f59a-b8ee-4d5e-99d5-00c40add0c6d" (UID: "6642f59a-b8ee-4d5e-99d5-00c40add0c6d"). InnerVolumeSpecName "cni-net-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 22:19:25.510336 kubelet[2812]: I0320 22:19:25.510135 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-kube-api-access-xpw22" (OuterVolumeSpecName: "kube-api-access-xpw22") pod "6642f59a-b8ee-4d5e-99d5-00c40add0c6d" (UID: "6642f59a-b8ee-4d5e-99d5-00c40add0c6d"). InnerVolumeSpecName "kube-api-access-xpw22". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 22:19:25.513563 kubelet[2812]: I0320 22:19:25.512928 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-node-certs" (OuterVolumeSpecName: "node-certs") pod "6642f59a-b8ee-4d5e-99d5-00c40add0c6d" (UID: "6642f59a-b8ee-4d5e-99d5-00c40add0c6d"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 22:19:25.517754 kubelet[2812]: I0320 22:19:25.517258 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "6642f59a-b8ee-4d5e-99d5-00c40add0c6d" (UID: "6642f59a-b8ee-4d5e-99d5-00c40add0c6d"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 22:19:25.518043 containerd[1475]: 2025-03-20 22:19:25.351 [INFO][5056] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Mar 20 22:19:25.518043 containerd[1475]: 2025-03-20 22:19:25.351 [INFO][5056] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" iface="eth0" netns="/var/run/netns/cni-e32b1a54-2d61-fec9-cdb0-60934c8daf62" Mar 20 22:19:25.518043 containerd[1475]: 2025-03-20 22:19:25.351 [INFO][5056] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" iface="eth0" netns="/var/run/netns/cni-e32b1a54-2d61-fec9-cdb0-60934c8daf62" Mar 20 22:19:25.518043 containerd[1475]: 2025-03-20 22:19:25.368 [INFO][5056] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" after=17.163134ms iface="eth0" netns="/var/run/netns/cni-e32b1a54-2d61-fec9-cdb0-60934c8daf62" Mar 20 22:19:25.518043 containerd[1475]: 2025-03-20 22:19:25.368 [INFO][5056] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Mar 20 22:19:25.518043 containerd[1475]: 2025-03-20 22:19:25.368 [INFO][5056] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Mar 20 22:19:25.518043 containerd[1475]: 2025-03-20 22:19:25.417 [INFO][5106] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" HandleID="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0" Mar 20 22:19:25.518043 containerd[1475]: 2025-03-20 22:19:25.418 [INFO][5106] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 22:19:25.518043 containerd[1475]: 2025-03-20 22:19:25.418 [INFO][5106] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 20 22:19:25.518043 containerd[1475]: 2025-03-20 22:19:25.502 [INFO][5106] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" HandleID="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0" Mar 20 22:19:25.518043 containerd[1475]: 2025-03-20 22:19:25.502 [INFO][5106] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" HandleID="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0" Mar 20 22:19:25.518043 containerd[1475]: 2025-03-20 22:19:25.510 [INFO][5106] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 22:19:25.518043 containerd[1475]: 2025-03-20 22:19:25.515 [INFO][5056] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Mar 20 22:19:25.518777 containerd[1475]: time="2025-03-20T22:19:25.518250190Z" level=info msg="TearDown network for sandbox \"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596\" successfully" Mar 20 22:19:25.518777 containerd[1475]: time="2025-03-20T22:19:25.518274375Z" level=info msg="StopPodSandbox for \"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596\" returns successfully" Mar 20 22:19:25.601729 kubelet[2812]: I0320 22:19:25.601640 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fee3832f-4ccf-429d-9c96-987d3f4ad55c-tigera-ca-bundle\") pod \"fee3832f-4ccf-429d-9c96-987d3f4ad55c\" (UID: \"fee3832f-4ccf-429d-9c96-987d3f4ad55c\") " Mar 20 22:19:25.603589 kubelet[2812]: I0320 22:19:25.601818 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9sjv\" (UniqueName: \"kubernetes.io/projected/fee3832f-4ccf-429d-9c96-987d3f4ad55c-kube-api-access-m9sjv\") pod \"fee3832f-4ccf-429d-9c96-987d3f4ad55c\" (UID: \"fee3832f-4ccf-429d-9c96-987d3f4ad55c\") " Mar 20 22:19:25.603589 kubelet[2812]: I0320 22:19:25.602014 2812 reconciler_common.go:289] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-node-certs\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:25.603589 kubelet[2812]: I0320 22:19:25.602028 2812 reconciler_common.go:289] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-var-run-calico\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:25.603589 kubelet[2812]: I0320 22:19:25.602101 2812 reconciler_common.go:289] "Volume detached for volume \"cni-log-dir\" (UniqueName: 
\"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-cni-log-dir\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:25.603589 kubelet[2812]: I0320 22:19:25.602149 2812 reconciler_common.go:289] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-cni-net-dir\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:25.603589 kubelet[2812]: I0320 22:19:25.602161 2812 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-xpw22\" (UniqueName: \"kubernetes.io/projected/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-kube-api-access-xpw22\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:25.603589 kubelet[2812]: I0320 22:19:25.602172 2812 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-tigera-ca-bundle\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:25.603773 kubelet[2812]: I0320 22:19:25.602183 2812 reconciler_common.go:289] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-cni-bin-dir\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:25.603773 kubelet[2812]: I0320 22:19:25.602193 2812 reconciler_common.go:289] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-var-lib-calico\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:25.603773 kubelet[2812]: I0320 22:19:25.602203 2812 reconciler_common.go:289] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/6642f59a-b8ee-4d5e-99d5-00c40add0c6d-policysync\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:25.609465 kubelet[2812]: I0320 22:19:25.609387 2812 
operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee3832f-4ccf-429d-9c96-987d3f4ad55c-kube-api-access-m9sjv" (OuterVolumeSpecName: "kube-api-access-m9sjv") pod "fee3832f-4ccf-429d-9c96-987d3f4ad55c" (UID: "fee3832f-4ccf-429d-9c96-987d3f4ad55c"). InnerVolumeSpecName "kube-api-access-m9sjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 22:19:25.610356 kubelet[2812]: I0320 22:19:25.610325 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee3832f-4ccf-429d-9c96-987d3f4ad55c-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "fee3832f-4ccf-429d-9c96-987d3f4ad55c" (UID: "fee3832f-4ccf-429d-9c96-987d3f4ad55c"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 22:19:25.703590 kubelet[2812]: I0320 22:19:25.703503 2812 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fee3832f-4ccf-429d-9c96-987d3f4ad55c-tigera-ca-bundle\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:25.703590 kubelet[2812]: I0320 22:19:25.703567 2812 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-m9sjv\" (UniqueName: \"kubernetes.io/projected/fee3832f-4ccf-429d-9c96-987d3f4ad55c-kube-api-access-m9sjv\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:25.789400 containerd[1475]: time="2025-03-20T22:19:25.789253013Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-tpslj,Uid:84cbef51-fb8f-4daa-9532-29ea81ff6b9c,Namespace:calico-system,Attempt:0,}" Mar 20 22:19:25.827469 containerd[1475]: time="2025-03-20T22:19:25.827010111Z" level=info msg="connecting to shim bbc95f0abc018c777a26ef0938bfed540687e69c3479a9765cdcfbca84934657" address="unix:///run/containerd/s/f81306efc49046a9110bb6d5e09fd3cc203a49b001d88ac493e3fcac36693196" namespace=k8s.io protocol=ttrpc 
version=3 Mar 20 22:19:25.849513 kubelet[2812]: I0320 22:19:25.848467 2812 scope.go:117] "RemoveContainer" containerID="47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21" Mar 20 22:19:25.857231 systemd[1]: Removed slice kubepods-besteffort-pod6642f59a_b8ee_4d5e_99d5_00c40add0c6d.slice - libcontainer container kubepods-besteffort-pod6642f59a_b8ee_4d5e_99d5_00c40add0c6d.slice. Mar 20 22:19:25.857607 systemd[1]: kubepods-besteffort-pod6642f59a_b8ee_4d5e_99d5_00c40add0c6d.slice: Consumed 2.926s CPU time, 292.3M memory peak, 161M written to disk. Mar 20 22:19:25.869688 systemd[1]: Started cri-containerd-bbc95f0abc018c777a26ef0938bfed540687e69c3479a9765cdcfbca84934657.scope - libcontainer container bbc95f0abc018c777a26ef0938bfed540687e69c3479a9765cdcfbca84934657. Mar 20 22:19:25.872703 systemd[1]: Removed slice kubepods-besteffort-podfee3832f_4ccf_429d_9c96_987d3f4ad55c.slice - libcontainer container kubepods-besteffort-podfee3832f_4ccf_429d_9c96_987d3f4ad55c.slice. Mar 20 22:19:25.876193 containerd[1475]: time="2025-03-20T22:19:25.875158087Z" level=info msg="RemoveContainer for \"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\"" Mar 20 22:19:25.885298 containerd[1475]: time="2025-03-20T22:19:25.885232191Z" level=info msg="RemoveContainer for \"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\" returns successfully" Mar 20 22:19:25.885860 kubelet[2812]: I0320 22:19:25.885704 2812 scope.go:117] "RemoveContainer" containerID="cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927" Mar 20 22:19:25.891671 containerd[1475]: time="2025-03-20T22:19:25.891636245Z" level=info msg="RemoveContainer for \"cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927\"" Mar 20 22:19:25.901857 containerd[1475]: time="2025-03-20T22:19:25.901750504Z" level=info msg="RemoveContainer for \"cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927\" returns successfully" Mar 20 22:19:25.902284 kubelet[2812]: I0320 
22:19:25.902185 2812 scope.go:117] "RemoveContainer" containerID="a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1" Mar 20 22:19:25.911302 containerd[1475]: time="2025-03-20T22:19:25.911257554Z" level=info msg="RemoveContainer for \"a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1\"" Mar 20 22:19:25.922080 containerd[1475]: time="2025-03-20T22:19:25.922044305Z" level=info msg="RemoveContainer for \"a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1\" returns successfully" Mar 20 22:19:25.922600 kubelet[2812]: I0320 22:19:25.922532 2812 scope.go:117] "RemoveContainer" containerID="47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21" Mar 20 22:19:25.924500 containerd[1475]: time="2025-03-20T22:19:25.923084598Z" level=error msg="ContainerStatus for \"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\": not found" Mar 20 22:19:25.924842 kubelet[2812]: E0320 22:19:25.924643 2812 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\": not found" containerID="47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21" Mar 20 22:19:25.924842 kubelet[2812]: I0320 22:19:25.924677 2812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21"} err="failed to get container status \"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\": rpc error: code = NotFound desc = an error occurred when try to find container \"47e8662b0e9de8b3a6ae4ab3fce5c210d07268cdbb2c4033d18e70a160db4d21\": not found" Mar 20 22:19:25.924842 kubelet[2812]: I0320 22:19:25.924771 
2812 scope.go:117] "RemoveContainer" containerID="cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927" Mar 20 22:19:25.925262 containerd[1475]: time="2025-03-20T22:19:25.925186074Z" level=error msg="ContainerStatus for \"cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927\": not found" Mar 20 22:19:25.925684 kubelet[2812]: E0320 22:19:25.925464 2812 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927\": not found" containerID="cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927" Mar 20 22:19:25.925684 kubelet[2812]: I0320 22:19:25.925622 2812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927"} err="failed to get container status \"cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927\": rpc error: code = NotFound desc = an error occurred when try to find container \"cd3700625c7d9dcfeb8fabb37921874cb21b19e577301e9b9adbe88faea30927\": not found" Mar 20 22:19:25.925684 kubelet[2812]: I0320 22:19:25.925640 2812 scope.go:117] "RemoveContainer" containerID="a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1" Mar 20 22:19:25.926486 containerd[1475]: time="2025-03-20T22:19:25.926280539Z" level=error msg="ContainerStatus for \"a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1\": not found" Mar 20 22:19:25.926720 kubelet[2812]: E0320 22:19:25.926644 2812 remote_runtime.go:432] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1\": not found" containerID="a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1" Mar 20 22:19:25.926720 kubelet[2812]: I0320 22:19:25.926667 2812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1"} err="failed to get container status \"a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1\": rpc error: code = NotFound desc = an error occurred when try to find container \"a93c5354e58e9174607815f1fcb789e98185e47d66511f53eb00c2a15070c6f1\": not found" Mar 20 22:19:25.926720 kubelet[2812]: I0320 22:19:25.926682 2812 scope.go:117] "RemoveContainer" containerID="6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38" Mar 20 22:19:25.929426 containerd[1475]: time="2025-03-20T22:19:25.929403703Z" level=info msg="RemoveContainer for \"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\"" Mar 20 22:19:25.937459 containerd[1475]: time="2025-03-20T22:19:25.936731380Z" level=info msg="RemoveContainer for \"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\" returns successfully" Mar 20 22:19:25.938248 kubelet[2812]: I0320 22:19:25.937999 2812 scope.go:117] "RemoveContainer" containerID="6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38" Mar 20 22:19:25.938998 containerd[1475]: time="2025-03-20T22:19:25.938769246Z" level=error msg="ContainerStatus for \"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\": not found" Mar 20 22:19:25.939379 kubelet[2812]: E0320 22:19:25.939144 2812 remote_runtime.go:432] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\": not found" containerID="6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38" Mar 20 22:19:25.939379 kubelet[2812]: I0320 22:19:25.939355 2812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38"} err="failed to get container status \"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\": rpc error: code = NotFound desc = an error occurred when try to find container \"6ef538c1b325cd5d5d0451385bb3ff48df42e1f0a90d871280fa5f3a4a977b38\": not found" Mar 20 22:19:25.944503 containerd[1475]: time="2025-03-20T22:19:25.943755017Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-tpslj,Uid:84cbef51-fb8f-4daa-9532-29ea81ff6b9c,Namespace:calico-system,Attempt:0,} returns sandbox id \"bbc95f0abc018c777a26ef0938bfed540687e69c3479a9765cdcfbca84934657\"" Mar 20 22:19:25.949513 containerd[1475]: time="2025-03-20T22:19:25.949406066Z" level=info msg="CreateContainer within sandbox \"bbc95f0abc018c777a26ef0938bfed540687e69c3479a9765cdcfbca84934657\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 20 22:19:25.960425 containerd[1475]: time="2025-03-20T22:19:25.960390349Z" level=info msg="Container 558f8aecc1764dbb3cecc1866a03e9a4f978ebcafdfac009e352c9c1f08a4850: CDI devices from CRI Config.CDIDevices: []" Mar 20 22:19:25.973031 systemd[1]: cri-containerd-573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b.scope: Deactivated successfully. 
Mar 20 22:19:25.977863 containerd[1475]: time="2025-03-20T22:19:25.977816697Z" level=info msg="CreateContainer within sandbox \"bbc95f0abc018c777a26ef0938bfed540687e69c3479a9765cdcfbca84934657\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"558f8aecc1764dbb3cecc1866a03e9a4f978ebcafdfac009e352c9c1f08a4850\"" Mar 20 22:19:25.981552 containerd[1475]: time="2025-03-20T22:19:25.979144830Z" level=info msg="StartContainer for \"558f8aecc1764dbb3cecc1866a03e9a4f978ebcafdfac009e352c9c1f08a4850\"" Mar 20 22:19:25.981552 containerd[1475]: time="2025-03-20T22:19:25.979784442Z" level=info msg="received exit event container_id:\"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\" id:\"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\" pid:3388 exit_status:1 exited_at:{seconds:1742509165 nanos:978865486}" Mar 20 22:19:25.981552 containerd[1475]: time="2025-03-20T22:19:25.980055720Z" level=info msg="TaskExit event in podsandbox handler container_id:\"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\" id:\"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\" pid:3388 exit_status:1 exited_at:{seconds:1742509165 nanos:978865486}" Mar 20 22:19:25.982077 containerd[1475]: time="2025-03-20T22:19:25.981980064Z" level=info msg="connecting to shim 558f8aecc1764dbb3cecc1866a03e9a4f978ebcafdfac009e352c9c1f08a4850" address="unix:///run/containerd/s/f81306efc49046a9110bb6d5e09fd3cc203a49b001d88ac493e3fcac36693196" protocol=ttrpc version=3 Mar 20 22:19:26.005854 systemd[1]: var-lib-kubelet-pods-fee3832f\x2d4ccf\x2d429d\x2d9c96\x2d987d3f4ad55c-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dkube\x2dcontrollers-1.mount: Deactivated successfully. Mar 20 22:19:26.005991 systemd[1]: run-netns-cni\x2de32b1a54\x2d2d61\x2dfec9\x2dcdb0\x2d60934c8daf62.mount: Deactivated successfully. 
Mar 20 22:19:26.006074 systemd[1]: var-lib-kubelet-pods-6642f59a\x2db8ee\x2d4d5e\x2d99d5\x2d00c40add0c6d-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully. Mar 20 22:19:26.006159 systemd[1]: var-lib-kubelet-pods-fee3832f\x2d4ccf\x2d429d\x2d9c96\x2d987d3f4ad55c-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dm9sjv.mount: Deactivated successfully. Mar 20 22:19:26.006237 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080-shm.mount: Deactivated successfully. Mar 20 22:19:26.006310 systemd[1]: var-lib-kubelet-pods-6642f59a\x2db8ee\x2d4d5e\x2d99d5\x2d00c40add0c6d-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dxpw22.mount: Deactivated successfully. Mar 20 22:19:26.006382 systemd[1]: var-lib-kubelet-pods-6642f59a\x2db8ee\x2d4d5e\x2d99d5\x2d00c40add0c6d-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. Mar 20 22:19:26.041430 systemd[1]: Started cri-containerd-558f8aecc1764dbb3cecc1866a03e9a4f978ebcafdfac009e352c9c1f08a4850.scope - libcontainer container 558f8aecc1764dbb3cecc1866a03e9a4f978ebcafdfac009e352c9c1f08a4850. Mar 20 22:19:26.056180 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b-rootfs.mount: Deactivated successfully. 
Mar 20 22:19:26.084923 containerd[1475]: time="2025-03-20T22:19:26.084498696Z" level=info msg="StopContainer for \"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\" returns successfully" Mar 20 22:19:26.085245 containerd[1475]: time="2025-03-20T22:19:26.085185516Z" level=info msg="StopPodSandbox for \"870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1\"" Mar 20 22:19:26.085321 containerd[1475]: time="2025-03-20T22:19:26.085290483Z" level=info msg="Container to stop \"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 20 22:19:26.101556 systemd[1]: cri-containerd-870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1.scope: Deactivated successfully. Mar 20 22:19:26.105889 containerd[1475]: time="2025-03-20T22:19:26.105597708Z" level=info msg="TaskExit event in podsandbox handler container_id:\"870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1\" id:\"870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1\" pid:3246 exit_status:137 exited_at:{seconds:1742509166 nanos:105002310}" Mar 20 22:19:26.136719 containerd[1475]: time="2025-03-20T22:19:26.136673671Z" level=info msg="StartContainer for \"558f8aecc1764dbb3cecc1866a03e9a4f978ebcafdfac009e352c9c1f08a4850\" returns successfully" Mar 20 22:19:26.155014 systemd[1]: cri-containerd-558f8aecc1764dbb3cecc1866a03e9a4f978ebcafdfac009e352c9c1f08a4850.scope: Deactivated successfully. Mar 20 22:19:26.155644 systemd[1]: cri-containerd-558f8aecc1764dbb3cecc1866a03e9a4f978ebcafdfac009e352c9c1f08a4850.scope: Consumed 36ms CPU time, 7.9M memory peak, 6.3M written to disk. 
Mar 20 22:19:26.162257 containerd[1475]: time="2025-03-20T22:19:26.162119734Z" level=info msg="received exit event container_id:\"558f8aecc1764dbb3cecc1866a03e9a4f978ebcafdfac009e352c9c1f08a4850\" id:\"558f8aecc1764dbb3cecc1866a03e9a4f978ebcafdfac009e352c9c1f08a4850\" pid:5194 exited_at:{seconds:1742509166 nanos:161459174}" Mar 20 22:19:26.167537 containerd[1475]: time="2025-03-20T22:19:26.167503081Z" level=info msg="shim disconnected" id=870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1 namespace=k8s.io Mar 20 22:19:26.167988 containerd[1475]: time="2025-03-20T22:19:26.167940683Z" level=warning msg="cleaning up after shim disconnected" id=870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1 namespace=k8s.io Mar 20 22:19:26.167988 containerd[1475]: time="2025-03-20T22:19:26.167959689Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 20 22:19:26.168397 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1-rootfs.mount: Deactivated successfully. 
Mar 20 22:19:26.196987 containerd[1475]: time="2025-03-20T22:19:26.196939927Z" level=info msg="received exit event sandbox_id:\"870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1\" exit_status:137 exited_at:{seconds:1742509166 nanos:105002310}" Mar 20 22:19:26.200491 containerd[1475]: time="2025-03-20T22:19:26.198292025Z" level=info msg="TaskExit event in podsandbox handler container_id:\"558f8aecc1764dbb3cecc1866a03e9a4f978ebcafdfac009e352c9c1f08a4850\" id:\"558f8aecc1764dbb3cecc1866a03e9a4f978ebcafdfac009e352c9c1f08a4850\" pid:5194 exited_at:{seconds:1742509166 nanos:161459174}" Mar 20 22:19:26.200737 containerd[1475]: time="2025-03-20T22:19:26.200715946Z" level=info msg="TearDown network for sandbox \"870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1\" successfully" Mar 20 22:19:26.200809 containerd[1475]: time="2025-03-20T22:19:26.200793822Z" level=info msg="StopPodSandbox for \"870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1\" returns successfully" Mar 20 22:19:26.201959 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1-shm.mount: Deactivated successfully. Mar 20 22:19:26.211610 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-558f8aecc1764dbb3cecc1866a03e9a4f978ebcafdfac009e352c9c1f08a4850-rootfs.mount: Deactivated successfully. 
Mar 20 22:19:26.313241 kubelet[2812]: I0320 22:19:26.312747 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88ff7e75-e8c8-4ce9-b22e-d97a76a56977-tigera-ca-bundle\") pod \"88ff7e75-e8c8-4ce9-b22e-d97a76a56977\" (UID: \"88ff7e75-e8c8-4ce9-b22e-d97a76a56977\") " Mar 20 22:19:26.313241 kubelet[2812]: I0320 22:19:26.312789 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/88ff7e75-e8c8-4ce9-b22e-d97a76a56977-typha-certs\") pod \"88ff7e75-e8c8-4ce9-b22e-d97a76a56977\" (UID: \"88ff7e75-e8c8-4ce9-b22e-d97a76a56977\") " Mar 20 22:19:26.313241 kubelet[2812]: I0320 22:19:26.312822 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd742\" (UniqueName: \"kubernetes.io/projected/88ff7e75-e8c8-4ce9-b22e-d97a76a56977-kube-api-access-gd742\") pod \"88ff7e75-e8c8-4ce9-b22e-d97a76a56977\" (UID: \"88ff7e75-e8c8-4ce9-b22e-d97a76a56977\") " Mar 20 22:19:26.315840 kubelet[2812]: I0320 22:19:26.315767 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ff7e75-e8c8-4ce9-b22e-d97a76a56977-kube-api-access-gd742" (OuterVolumeSpecName: "kube-api-access-gd742") pod "88ff7e75-e8c8-4ce9-b22e-d97a76a56977" (UID: "88ff7e75-e8c8-4ce9-b22e-d97a76a56977"). InnerVolumeSpecName "kube-api-access-gd742". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 22:19:26.319566 kubelet[2812]: I0320 22:19:26.318462 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88ff7e75-e8c8-4ce9-b22e-d97a76a56977-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "88ff7e75-e8c8-4ce9-b22e-d97a76a56977" (UID: "88ff7e75-e8c8-4ce9-b22e-d97a76a56977"). InnerVolumeSpecName "tigera-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 22:19:26.320241 kubelet[2812]: I0320 22:19:26.320142 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ff7e75-e8c8-4ce9-b22e-d97a76a56977-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "88ff7e75-e8c8-4ce9-b22e-d97a76a56977" (UID: "88ff7e75-e8c8-4ce9-b22e-d97a76a56977"). InnerVolumeSpecName "typha-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 22:19:26.413611 kubelet[2812]: I0320 22:19:26.413565 2812 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-gd742\" (UniqueName: \"kubernetes.io/projected/88ff7e75-e8c8-4ce9-b22e-d97a76a56977-kube-api-access-gd742\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:26.413891 kubelet[2812]: I0320 22:19:26.413811 2812 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88ff7e75-e8c8-4ce9-b22e-d97a76a56977-tigera-ca-bundle\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:26.413891 kubelet[2812]: I0320 22:19:26.413829 2812 reconciler_common.go:289] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/88ff7e75-e8c8-4ce9-b22e-d97a76a56977-typha-certs\") on node \"ci-9999-0-2-b-c50fddf147.novalocal\" DevicePath \"\"" Mar 20 22:19:26.867701 kubelet[2812]: I0320 22:19:26.867204 2812 scope.go:117] "RemoveContainer" containerID="573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b" Mar 20 22:19:26.874327 containerd[1475]: time="2025-03-20T22:19:26.872198186Z" level=info msg="CreateContainer within sandbox \"bbc95f0abc018c777a26ef0938bfed540687e69c3479a9765cdcfbca84934657\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 20 22:19:26.874327 containerd[1475]: time="2025-03-20T22:19:26.873496564Z" level=info msg="RemoveContainer for \"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\"" Mar 
20 22:19:26.874106 systemd[1]: Removed slice kubepods-besteffort-pod88ff7e75_e8c8_4ce9_b22e_d97a76a56977.slice - libcontainer container kubepods-besteffort-pod88ff7e75_e8c8_4ce9_b22e_d97a76a56977.slice. Mar 20 22:19:26.883899 containerd[1475]: time="2025-03-20T22:19:26.883865822Z" level=info msg="RemoveContainer for \"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\" returns successfully" Mar 20 22:19:26.884332 kubelet[2812]: I0320 22:19:26.884217 2812 scope.go:117] "RemoveContainer" containerID="573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b" Mar 20 22:19:26.884756 containerd[1475]: time="2025-03-20T22:19:26.884723232Z" level=error msg="ContainerStatus for \"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\": not found" Mar 20 22:19:26.885040 kubelet[2812]: E0320 22:19:26.884987 2812 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\": not found" containerID="573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b" Mar 20 22:19:26.885040 kubelet[2812]: I0320 22:19:26.885015 2812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b"} err="failed to get container status \"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\": rpc error: code = NotFound desc = an error occurred when try to find container \"573f6114c311d063a0021015b8d2643ef55abce69b93141b6fd5d7731f92894b\": not found" Mar 20 22:19:26.891677 containerd[1475]: time="2025-03-20T22:19:26.889500501Z" level=info msg="Container 9a7558c37349648f1ff4fc858d2c4e22f6aa7a74a5079e5c623f2f77bd6e42cf: CDI 
devices from CRI Config.CDIDevices: []" Mar 20 22:19:26.911982 containerd[1475]: time="2025-03-20T22:19:26.911921805Z" level=info msg="CreateContainer within sandbox \"bbc95f0abc018c777a26ef0938bfed540687e69c3479a9765cdcfbca84934657\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"9a7558c37349648f1ff4fc858d2c4e22f6aa7a74a5079e5c623f2f77bd6e42cf\"" Mar 20 22:19:26.912570 containerd[1475]: time="2025-03-20T22:19:26.912541418Z" level=info msg="StartContainer for \"9a7558c37349648f1ff4fc858d2c4e22f6aa7a74a5079e5c623f2f77bd6e42cf\"" Mar 20 22:19:26.914451 containerd[1475]: time="2025-03-20T22:19:26.914397513Z" level=info msg="connecting to shim 9a7558c37349648f1ff4fc858d2c4e22f6aa7a74a5079e5c623f2f77bd6e42cf" address="unix:///run/containerd/s/f81306efc49046a9110bb6d5e09fd3cc203a49b001d88ac493e3fcac36693196" protocol=ttrpc version=3 Mar 20 22:19:26.938011 systemd[1]: Started cri-containerd-9a7558c37349648f1ff4fc858d2c4e22f6aa7a74a5079e5c623f2f77bd6e42cf.scope - libcontainer container 9a7558c37349648f1ff4fc858d2c4e22f6aa7a74a5079e5c623f2f77bd6e42cf. Mar 20 22:19:27.000237 systemd[1]: var-lib-kubelet-pods-88ff7e75\x2de8c8\x2d4ce9\x2db22e\x2dd97a76a56977-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully. Mar 20 22:19:27.000357 systemd[1]: var-lib-kubelet-pods-88ff7e75\x2de8c8\x2d4ce9\x2db22e\x2dd97a76a56977-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dgd742.mount: Deactivated successfully. Mar 20 22:19:27.000433 systemd[1]: var-lib-kubelet-pods-88ff7e75\x2de8c8\x2d4ce9\x2db22e\x2dd97a76a56977-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully. 
Mar 20 22:19:27.012160 containerd[1475]: time="2025-03-20T22:19:27.012121121Z" level=info msg="StartContainer for \"9a7558c37349648f1ff4fc858d2c4e22f6aa7a74a5079e5c623f2f77bd6e42cf\" returns successfully" Mar 20 22:19:27.065304 kubelet[2812]: I0320 22:19:27.065165 2812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6642f59a-b8ee-4d5e-99d5-00c40add0c6d" path="/var/lib/kubelet/pods/6642f59a-b8ee-4d5e-99d5-00c40add0c6d/volumes" Mar 20 22:19:27.066219 kubelet[2812]: I0320 22:19:27.066197 2812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88ff7e75-e8c8-4ce9-b22e-d97a76a56977" path="/var/lib/kubelet/pods/88ff7e75-e8c8-4ce9-b22e-d97a76a56977/volumes" Mar 20 22:19:27.066625 kubelet[2812]: I0320 22:19:27.066607 2812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee3832f-4ccf-429d-9c96-987d3f4ad55c" path="/var/lib/kubelet/pods/fee3832f-4ccf-429d-9c96-987d3f4ad55c/volumes" Mar 20 22:19:27.744066 systemd[1]: cri-containerd-9a7558c37349648f1ff4fc858d2c4e22f6aa7a74a5079e5c623f2f77bd6e42cf.scope: Deactivated successfully. Mar 20 22:19:27.744317 systemd[1]: cri-containerd-9a7558c37349648f1ff4fc858d2c4e22f6aa7a74a5079e5c623f2f77bd6e42cf.scope: Consumed 681ms CPU time, 57.6M memory peak, 34.4M read from disk. 
Mar 20 22:19:27.746992 containerd[1475]: time="2025-03-20T22:19:27.746644131Z" level=info msg="received exit event container_id:\"9a7558c37349648f1ff4fc858d2c4e22f6aa7a74a5079e5c623f2f77bd6e42cf\" id:\"9a7558c37349648f1ff4fc858d2c4e22f6aa7a74a5079e5c623f2f77bd6e42cf\" pid:5288 exited_at:{seconds:1742509167 nanos:745823430}" Mar 20 22:19:27.746992 containerd[1475]: time="2025-03-20T22:19:27.746804662Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9a7558c37349648f1ff4fc858d2c4e22f6aa7a74a5079e5c623f2f77bd6e42cf\" id:\"9a7558c37349648f1ff4fc858d2c4e22f6aa7a74a5079e5c623f2f77bd6e42cf\" pid:5288 exited_at:{seconds:1742509167 nanos:745823430}" Mar 20 22:19:27.773679 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9a7558c37349648f1ff4fc858d2c4e22f6aa7a74a5079e5c623f2f77bd6e42cf-rootfs.mount: Deactivated successfully. Mar 20 22:19:27.908995 containerd[1475]: time="2025-03-20T22:19:27.907602030Z" level=info msg="CreateContainer within sandbox \"bbc95f0abc018c777a26ef0938bfed540687e69c3479a9765cdcfbca84934657\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 20 22:19:27.935737 containerd[1475]: time="2025-03-20T22:19:27.932809704Z" level=info msg="Container d4e269a654c4359606157890d656b4710437499051f40337f1113f05bd7ab7ed: CDI devices from CRI Config.CDIDevices: []" Mar 20 22:19:27.962674 containerd[1475]: time="2025-03-20T22:19:27.962505585Z" level=info msg="CreateContainer within sandbox \"bbc95f0abc018c777a26ef0938bfed540687e69c3479a9765cdcfbca84934657\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"d4e269a654c4359606157890d656b4710437499051f40337f1113f05bd7ab7ed\"" Mar 20 22:19:27.964458 containerd[1475]: time="2025-03-20T22:19:27.964424157Z" level=info msg="StartContainer for \"d4e269a654c4359606157890d656b4710437499051f40337f1113f05bd7ab7ed\"" Mar 20 22:19:27.966379 containerd[1475]: time="2025-03-20T22:19:27.966352127Z" level=info msg="connecting to shim 
d4e269a654c4359606157890d656b4710437499051f40337f1113f05bd7ab7ed" address="unix:///run/containerd/s/f81306efc49046a9110bb6d5e09fd3cc203a49b001d88ac493e3fcac36693196" protocol=ttrpc version=3 Mar 20 22:19:27.988640 systemd[1]: Started cri-containerd-d4e269a654c4359606157890d656b4710437499051f40337f1113f05bd7ab7ed.scope - libcontainer container d4e269a654c4359606157890d656b4710437499051f40337f1113f05bd7ab7ed. Mar 20 22:19:28.057579 containerd[1475]: time="2025-03-20T22:19:28.057251296Z" level=info msg="StartContainer for \"d4e269a654c4359606157890d656b4710437499051f40337f1113f05bd7ab7ed\" returns successfully" Mar 20 22:19:28.204434 kubelet[2812]: I0320 22:19:28.203719 2812 topology_manager.go:215] "Topology Admit Handler" podUID="78fd55e7-a426-4517-9f4e-3ebf38e4734d" podNamespace="calico-system" podName="calico-kube-controllers-d96b75bf6-cdxnt" Mar 20 22:19:28.205751 kubelet[2812]: E0320 22:19:28.205568 2812 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="88ff7e75-e8c8-4ce9-b22e-d97a76a56977" containerName="calico-typha" Mar 20 22:19:28.205751 kubelet[2812]: E0320 22:19:28.205609 2812 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="fee3832f-4ccf-429d-9c96-987d3f4ad55c" containerName="calico-kube-controllers" Mar 20 22:19:28.205751 kubelet[2812]: I0320 22:19:28.205643 2812 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ff7e75-e8c8-4ce9-b22e-d97a76a56977" containerName="calico-typha" Mar 20 22:19:28.205751 kubelet[2812]: I0320 22:19:28.205651 2812 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee3832f-4ccf-429d-9c96-987d3f4ad55c" containerName="calico-kube-controllers" Mar 20 22:19:28.216418 systemd[1]: Created slice kubepods-besteffort-pod78fd55e7_a426_4517_9f4e_3ebf38e4734d.slice - libcontainer container kubepods-besteffort-pod78fd55e7_a426_4517_9f4e_3ebf38e4734d.slice. 
Mar 20 22:19:28.227039 kubelet[2812]: I0320 22:19:28.227002 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frqk4\" (UniqueName: \"kubernetes.io/projected/78fd55e7-a426-4517-9f4e-3ebf38e4734d-kube-api-access-frqk4\") pod \"calico-kube-controllers-d96b75bf6-cdxnt\" (UID: \"78fd55e7-a426-4517-9f4e-3ebf38e4734d\") " pod="calico-system/calico-kube-controllers-d96b75bf6-cdxnt" Mar 20 22:19:28.227196 kubelet[2812]: I0320 22:19:28.227181 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78fd55e7-a426-4517-9f4e-3ebf38e4734d-tigera-ca-bundle\") pod \"calico-kube-controllers-d96b75bf6-cdxnt\" (UID: \"78fd55e7-a426-4517-9f4e-3ebf38e4734d\") " pod="calico-system/calico-kube-controllers-d96b75bf6-cdxnt" Mar 20 22:19:28.522740 containerd[1475]: time="2025-03-20T22:19:28.522668079Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d96b75bf6-cdxnt,Uid:78fd55e7-a426-4517-9f4e-3ebf38e4734d,Namespace:calico-system,Attempt:0,}" Mar 20 22:19:28.679989 systemd-networkd[1384]: calieb83e0be233: Link UP Mar 20 22:19:28.680195 systemd-networkd[1384]: calieb83e0be233: Gained carrier Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.599 [INFO][5374] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--d96b75bf6--cdxnt-eth0 calico-kube-controllers-d96b75bf6- calico-system 78fd55e7-a426-4517-9f4e-3ebf38e4734d 1056 0 2025-03-20 22:19:25 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:d96b75bf6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-9999-0-2-b-c50fddf147.novalocal 
calico-kube-controllers-d96b75bf6-cdxnt eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calieb83e0be233 [] []}} ContainerID="02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" Namespace="calico-system" Pod="calico-kube-controllers-d96b75bf6-cdxnt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--d96b75bf6--cdxnt-" Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.599 [INFO][5374] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" Namespace="calico-system" Pod="calico-kube-controllers-d96b75bf6-cdxnt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--d96b75bf6--cdxnt-eth0" Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.635 [INFO][5387] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" HandleID="k8s-pod-network.02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--d96b75bf6--cdxnt-eth0" Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.647 [INFO][5387] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" HandleID="k8s-pod-network.02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--d96b75bf6--cdxnt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031acb0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-0-2-b-c50fddf147.novalocal", "pod":"calico-kube-controllers-d96b75bf6-cdxnt", "timestamp":"2025-03-20 22:19:28.635008029 +0000 UTC"}, Hostname:"ci-9999-0-2-b-c50fddf147.novalocal", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.648 [INFO][5387] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.648 [INFO][5387] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.648 [INFO][5387] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-2-b-c50fddf147.novalocal' Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.650 [INFO][5387] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.655 [INFO][5387] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.659 [INFO][5387] ipam/ipam.go 489: Trying affinity for 192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.661 [INFO][5387] ipam/ipam.go 155: Attempting to load block cidr=192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.663 [INFO][5387] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.50.64/26 host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.663 [INFO][5387] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.50.64/26 handle="k8s-pod-network.02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.665 
[INFO][5387] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4 Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.669 [INFO][5387] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.50.64/26 handle="k8s-pod-network.02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.676 [INFO][5387] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.50.71/26] block=192.168.50.64/26 handle="k8s-pod-network.02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.676 [INFO][5387] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.50.71/26] handle="k8s-pod-network.02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" host="ci-9999-0-2-b-c50fddf147.novalocal" Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.676 [INFO][5387] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 20 22:19:28.698095 containerd[1475]: 2025-03-20 22:19:28.676 [INFO][5387] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.71/26] IPv6=[] ContainerID="02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" HandleID="k8s-pod-network.02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--d96b75bf6--cdxnt-eth0" Mar 20 22:19:28.699844 containerd[1475]: 2025-03-20 22:19:28.677 [INFO][5374] cni-plugin/k8s.go 386: Populated endpoint ContainerID="02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" Namespace="calico-system" Pod="calico-kube-controllers-d96b75bf6-cdxnt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--d96b75bf6--cdxnt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--d96b75bf6--cdxnt-eth0", GenerateName:"calico-kube-controllers-d96b75bf6-", Namespace:"calico-system", SelfLink:"", UID:"78fd55e7-a426-4517-9f4e-3ebf38e4734d", ResourceVersion:"1056", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 19, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"d96b75bf6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-b-c50fddf147.novalocal", ContainerID:"", Pod:"calico-kube-controllers-d96b75bf6-cdxnt", 
Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.50.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calieb83e0be233", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 22:19:28.699844 containerd[1475]: 2025-03-20 22:19:28.678 [INFO][5374] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.50.71/32] ContainerID="02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" Namespace="calico-system" Pod="calico-kube-controllers-d96b75bf6-cdxnt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--d96b75bf6--cdxnt-eth0" Mar 20 22:19:28.699844 containerd[1475]: 2025-03-20 22:19:28.678 [INFO][5374] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calieb83e0be233 ContainerID="02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" Namespace="calico-system" Pod="calico-kube-controllers-d96b75bf6-cdxnt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--d96b75bf6--cdxnt-eth0" Mar 20 22:19:28.699844 containerd[1475]: 2025-03-20 22:19:28.680 [INFO][5374] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" Namespace="calico-system" Pod="calico-kube-controllers-d96b75bf6-cdxnt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--d96b75bf6--cdxnt-eth0" Mar 20 22:19:28.699844 containerd[1475]: 2025-03-20 22:19:28.680 [INFO][5374] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" Namespace="calico-system" Pod="calico-kube-controllers-d96b75bf6-cdxnt" 
WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--d96b75bf6--cdxnt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--d96b75bf6--cdxnt-eth0", GenerateName:"calico-kube-controllers-d96b75bf6-", Namespace:"calico-system", SelfLink:"", UID:"78fd55e7-a426-4517-9f4e-3ebf38e4734d", ResourceVersion:"1056", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 19, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"d96b75bf6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-b-c50fddf147.novalocal", ContainerID:"02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4", Pod:"calico-kube-controllers-d96b75bf6-cdxnt", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.50.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calieb83e0be233", MAC:"56:f5:d3:20:b6:70", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 22:19:28.699844 containerd[1475]: 2025-03-20 22:19:28.693 [INFO][5374] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" Namespace="calico-system" 
Pod="calico-kube-controllers-d96b75bf6-cdxnt" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--d96b75bf6--cdxnt-eth0" Mar 20 22:19:28.755662 containerd[1475]: time="2025-03-20T22:19:28.755617895Z" level=info msg="connecting to shim 02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4" address="unix:///run/containerd/s/63ad9c312ff3d5cc7c866e6ed20e1b0540deda511e26d6997e97e28e76d32ff6" namespace=k8s.io protocol=ttrpc version=3 Mar 20 22:19:28.795677 systemd[1]: Started cri-containerd-02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4.scope - libcontainer container 02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4. Mar 20 22:19:28.870829 kubelet[2812]: I0320 22:19:28.870744 2812 topology_manager.go:215] "Topology Admit Handler" podUID="744f8fc6-62a0-4f26-b762-4915ce87d466" podNamespace="calico-system" podName="calico-typha-65c9d5fcd-nl475" Mar 20 22:19:28.884492 systemd[1]: Created slice kubepods-besteffort-pod744f8fc6_62a0_4f26_b762_4915ce87d466.slice - libcontainer container kubepods-besteffort-pod744f8fc6_62a0_4f26_b762_4915ce87d466.slice. 
Mar 20 22:19:28.891609 containerd[1475]: time="2025-03-20T22:19:28.891510962Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d96b75bf6-cdxnt,Uid:78fd55e7-a426-4517-9f4e-3ebf38e4734d,Namespace:calico-system,Attempt:0,} returns sandbox id \"02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4\"" Mar 20 22:19:28.908528 containerd[1475]: time="2025-03-20T22:19:28.908037798Z" level=info msg="CreateContainer within sandbox \"02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 20 22:19:28.920183 containerd[1475]: time="2025-03-20T22:19:28.920094552Z" level=info msg="Container 07bb2246e623915d85afe2227666ed87333e5003482655b1caacb3590e5edf1f: CDI devices from CRI Config.CDIDevices: []" Mar 20 22:19:28.934487 kubelet[2812]: I0320 22:19:28.934422 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv5rw\" (UniqueName: \"kubernetes.io/projected/744f8fc6-62a0-4f26-b762-4915ce87d466-kube-api-access-tv5rw\") pod \"calico-typha-65c9d5fcd-nl475\" (UID: \"744f8fc6-62a0-4f26-b762-4915ce87d466\") " pod="calico-system/calico-typha-65c9d5fcd-nl475" Mar 20 22:19:28.934487 kubelet[2812]: I0320 22:19:28.934467 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/744f8fc6-62a0-4f26-b762-4915ce87d466-typha-certs\") pod \"calico-typha-65c9d5fcd-nl475\" (UID: \"744f8fc6-62a0-4f26-b762-4915ce87d466\") " pod="calico-system/calico-typha-65c9d5fcd-nl475" Mar 20 22:19:28.934655 kubelet[2812]: I0320 22:19:28.934520 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/744f8fc6-62a0-4f26-b762-4915ce87d466-tigera-ca-bundle\") pod \"calico-typha-65c9d5fcd-nl475\" (UID: 
\"744f8fc6-62a0-4f26-b762-4915ce87d466\") " pod="calico-system/calico-typha-65c9d5fcd-nl475" Mar 20 22:19:28.943087 kubelet[2812]: I0320 22:19:28.943030 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-tpslj" podStartSLOduration=3.943012086 podStartE2EDuration="3.943012086s" podCreationTimestamp="2025-03-20 22:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 22:19:28.940026481 +0000 UTC m=+85.997208558" watchObservedRunningTime="2025-03-20 22:19:28.943012086 +0000 UTC m=+86.000194163" Mar 20 22:19:28.946311 containerd[1475]: time="2025-03-20T22:19:28.946243142Z" level=info msg="CreateContainer within sandbox \"02f327eb37575c0bfa2bdc1db29a20a1519615be8220e4709462288c52ae31d4\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"07bb2246e623915d85afe2227666ed87333e5003482655b1caacb3590e5edf1f\"" Mar 20 22:19:28.948660 containerd[1475]: time="2025-03-20T22:19:28.948295154Z" level=info msg="StartContainer for \"07bb2246e623915d85afe2227666ed87333e5003482655b1caacb3590e5edf1f\"" Mar 20 22:19:28.949607 containerd[1475]: time="2025-03-20T22:19:28.949445414Z" level=info msg="connecting to shim 07bb2246e623915d85afe2227666ed87333e5003482655b1caacb3590e5edf1f" address="unix:///run/containerd/s/63ad9c312ff3d5cc7c866e6ed20e1b0540deda511e26d6997e97e28e76d32ff6" protocol=ttrpc version=3 Mar 20 22:19:28.985965 systemd[1]: Started cri-containerd-07bb2246e623915d85afe2227666ed87333e5003482655b1caacb3590e5edf1f.scope - libcontainer container 07bb2246e623915d85afe2227666ed87333e5003482655b1caacb3590e5edf1f. 
Mar 20 22:19:29.082570 containerd[1475]: time="2025-03-20T22:19:29.081998857Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d4e269a654c4359606157890d656b4710437499051f40337f1113f05bd7ab7ed\" id:\"0e74687699a1ca4dd2edd829a96ffa535d4cbbb4c570dbc03032e3b46c1bf42f\" pid:5468 exit_status:1 exited_at:{seconds:1742509169 nanos:81141498}" Mar 20 22:19:29.127491 containerd[1475]: time="2025-03-20T22:19:29.127436708Z" level=info msg="StartContainer for \"07bb2246e623915d85afe2227666ed87333e5003482655b1caacb3590e5edf1f\" returns successfully" Mar 20 22:19:29.192506 containerd[1475]: time="2025-03-20T22:19:29.192358654Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-65c9d5fcd-nl475,Uid:744f8fc6-62a0-4f26-b762-4915ce87d466,Namespace:calico-system,Attempt:0,}" Mar 20 22:19:29.218096 containerd[1475]: time="2025-03-20T22:19:29.218059312Z" level=info msg="connecting to shim ea7114e7102f0d949c99cb4fb95b640505e604933351da1d37bd62af8680c871" address="unix:///run/containerd/s/d838445b0874b726c9b1bbb324b1c07161a9c359cfdf0d375eeb66145a474f83" namespace=k8s.io protocol=ttrpc version=3 Mar 20 22:19:29.246671 systemd[1]: Started cri-containerd-ea7114e7102f0d949c99cb4fb95b640505e604933351da1d37bd62af8680c871.scope - libcontainer container ea7114e7102f0d949c99cb4fb95b640505e604933351da1d37bd62af8680c871. 
Mar 20 22:19:29.303981 containerd[1475]: time="2025-03-20T22:19:29.303923404Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-65c9d5fcd-nl475,Uid:744f8fc6-62a0-4f26-b762-4915ce87d466,Namespace:calico-system,Attempt:0,} returns sandbox id \"ea7114e7102f0d949c99cb4fb95b640505e604933351da1d37bd62af8680c871\"" Mar 20 22:19:29.313899 containerd[1475]: time="2025-03-20T22:19:29.313843106Z" level=info msg="CreateContainer within sandbox \"ea7114e7102f0d949c99cb4fb95b640505e604933351da1d37bd62af8680c871\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 20 22:19:29.325718 containerd[1475]: time="2025-03-20T22:19:29.325670589Z" level=info msg="Container a470af2311517d8564aa8db3c7ad27bd956aee199b3693a2e6617bf2088f0721: CDI devices from CRI Config.CDIDevices: []" Mar 20 22:19:29.343676 containerd[1475]: time="2025-03-20T22:19:29.342937735Z" level=info msg="CreateContainer within sandbox \"ea7114e7102f0d949c99cb4fb95b640505e604933351da1d37bd62af8680c871\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"a470af2311517d8564aa8db3c7ad27bd956aee199b3693a2e6617bf2088f0721\"" Mar 20 22:19:29.346072 containerd[1475]: time="2025-03-20T22:19:29.346022707Z" level=info msg="StartContainer for \"a470af2311517d8564aa8db3c7ad27bd956aee199b3693a2e6617bf2088f0721\"" Mar 20 22:19:29.347323 containerd[1475]: time="2025-03-20T22:19:29.347294845Z" level=info msg="connecting to shim a470af2311517d8564aa8db3c7ad27bd956aee199b3693a2e6617bf2088f0721" address="unix:///run/containerd/s/d838445b0874b726c9b1bbb324b1c07161a9c359cfdf0d375eeb66145a474f83" protocol=ttrpc version=3 Mar 20 22:19:29.372677 systemd[1]: Started cri-containerd-a470af2311517d8564aa8db3c7ad27bd956aee199b3693a2e6617bf2088f0721.scope - libcontainer container a470af2311517d8564aa8db3c7ad27bd956aee199b3693a2e6617bf2088f0721. 
Mar 20 22:19:29.469974 containerd[1475]: time="2025-03-20T22:19:29.469929315Z" level=info msg="StartContainer for \"a470af2311517d8564aa8db3c7ad27bd956aee199b3693a2e6617bf2088f0721\" returns successfully" Mar 20 22:19:29.788860 systemd-networkd[1384]: calieb83e0be233: Gained IPv6LL Mar 20 22:19:29.933447 kubelet[2812]: I0320 22:19:29.933352 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-65c9d5fcd-nl475" podStartSLOduration=5.933316167 podStartE2EDuration="5.933316167s" podCreationTimestamp="2025-03-20 22:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 22:19:29.931354203 +0000 UTC m=+86.988536290" watchObservedRunningTime="2025-03-20 22:19:29.933316167 +0000 UTC m=+86.990498304" Mar 20 22:19:29.985163 kubelet[2812]: I0320 22:19:29.985104 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-d96b75bf6-cdxnt" podStartSLOduration=4.985076655 podStartE2EDuration="4.985076655s" podCreationTimestamp="2025-03-20 22:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 22:19:29.963781488 +0000 UTC m=+87.020963625" watchObservedRunningTime="2025-03-20 22:19:29.985076655 +0000 UTC m=+87.042258742" Mar 20 22:19:30.053024 containerd[1475]: time="2025-03-20T22:19:30.052318225Z" level=info msg="TaskExit event in podsandbox handler container_id:\"07bb2246e623915d85afe2227666ed87333e5003482655b1caacb3590e5edf1f\" id:\"cdfe21956896721698cecf26de52192fc6c9b56f13bb141df66d6422f0f187a2\" pid:5709 exit_status:1 exited_at:{seconds:1742509170 nanos:51367020}" Mar 20 22:19:30.079453 containerd[1475]: time="2025-03-20T22:19:30.079270281Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d4e269a654c4359606157890d656b4710437499051f40337f1113f05bd7ab7ed\" 
id:\"4008377cef024aa2be37f14c004c97a7b795a6a68159f599f341ce6366c3cea2\" pid:5708 exit_status:1 exited_at:{seconds:1742509170 nanos:78846185}" Mar 20 22:19:30.987246 containerd[1475]: time="2025-03-20T22:19:30.987189368Z" level=info msg="TaskExit event in podsandbox handler container_id:\"07bb2246e623915d85afe2227666ed87333e5003482655b1caacb3590e5edf1f\" id:\"a8c7bb05b9b7e749e199c12023b96d398ab01adac4480d53875cdceab2245d45\" pid:5790 exit_status:1 exited_at:{seconds:1742509170 nanos:985799800}" Mar 20 22:19:43.296059 systemd[1]: Started sshd@9-172.24.4.53:22-172.24.4.1:51194.service - OpenSSH per-connection server daemon (172.24.4.1:51194). Mar 20 22:19:44.600532 sshd[5900]: Accepted publickey for core from 172.24.4.1 port 51194 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI Mar 20 22:19:44.604356 sshd-session[5900]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 22:19:44.618590 systemd-logind[1460]: New session 12 of user core. Mar 20 22:19:44.627775 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 20 22:19:45.374619 sshd[5902]: Connection closed by 172.24.4.1 port 51194 Mar 20 22:19:45.375673 sshd-session[5900]: pam_unix(sshd:session): session closed for user core Mar 20 22:19:45.382335 systemd[1]: sshd@9-172.24.4.53:22-172.24.4.1:51194.service: Deactivated successfully. Mar 20 22:19:45.386743 systemd[1]: session-12.scope: Deactivated successfully. Mar 20 22:19:45.390599 systemd-logind[1460]: Session 12 logged out. Waiting for processes to exit. Mar 20 22:19:45.393218 systemd-logind[1460]: Removed session 12. Mar 20 22:19:50.392825 systemd[1]: Started sshd@10-172.24.4.53:22-172.24.4.1:42360.service - OpenSSH per-connection server daemon (172.24.4.1:42360). 
Mar 20 22:19:51.933446 sshd[5917]: Accepted publickey for core from 172.24.4.1 port 42360 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI Mar 20 22:19:51.936400 sshd-session[5917]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 22:19:51.947990 systemd-logind[1460]: New session 13 of user core. Mar 20 22:19:51.956807 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 20 22:19:52.718445 sshd[5927]: Connection closed by 172.24.4.1 port 42360 Mar 20 22:19:52.716959 sshd-session[5917]: pam_unix(sshd:session): session closed for user core Mar 20 22:19:52.726154 systemd-logind[1460]: Session 13 logged out. Waiting for processes to exit. Mar 20 22:19:52.726255 systemd[1]: sshd@10-172.24.4.53:22-172.24.4.1:42360.service: Deactivated successfully. Mar 20 22:19:52.731118 systemd[1]: session-13.scope: Deactivated successfully. Mar 20 22:19:52.735968 systemd-logind[1460]: Removed session 13. Mar 20 22:19:55.882895 containerd[1475]: time="2025-03-20T22:19:55.882797430Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d4e269a654c4359606157890d656b4710437499051f40337f1113f05bd7ab7ed\" id:\"384ab3830f12b20390e1d6a3840289207566a545c7feabb9e75cfb3607fd0c1c\" pid:5952 exited_at:{seconds:1742509195 nanos:882360530}" Mar 20 22:19:57.731933 systemd[1]: Started sshd@11-172.24.4.53:22-172.24.4.1:35178.service - OpenSSH per-connection server daemon (172.24.4.1:35178). 
Mar 20 22:19:58.613686 containerd[1475]: time="2025-03-20T22:19:58.613634029Z" level=info msg="TaskExit event in podsandbox handler container_id:\"07bb2246e623915d85afe2227666ed87333e5003482655b1caacb3590e5edf1f\" id:\"1265bde2aa2f83f416ae0af43415a1cfd4ebdd31e1bf6a7ab3d56089c3361b15\" pid:5983 exited_at:{seconds:1742509198 nanos:613167754}" Mar 20 22:19:59.262702 sshd[5968]: Accepted publickey for core from 172.24.4.1 port 35178 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI Mar 20 22:19:59.264540 sshd-session[5968]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 22:19:59.293145 systemd-logind[1460]: New session 14 of user core. Mar 20 22:19:59.304967 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 20 22:19:59.978817 sshd[5993]: Connection closed by 172.24.4.1 port 35178 Mar 20 22:19:59.980091 sshd-session[5968]: pam_unix(sshd:session): session closed for user core Mar 20 22:19:59.988392 systemd[1]: sshd@11-172.24.4.53:22-172.24.4.1:35178.service: Deactivated successfully. Mar 20 22:19:59.995006 systemd[1]: session-14.scope: Deactivated successfully. Mar 20 22:20:00.003119 systemd-logind[1460]: Session 14 logged out. Waiting for processes to exit. Mar 20 22:20:00.006914 systemd-logind[1460]: Removed session 14. 
Mar 20 22:20:03.053553 containerd[1475]: time="2025-03-20T22:20:03.053381730Z" level=info msg="StopPodSandbox for \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\""
Mar 20 22:20:03.055622 containerd[1475]: time="2025-03-20T22:20:03.054034064Z" level=info msg="TearDown network for sandbox \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\" successfully"
Mar 20 22:20:03.055622 containerd[1475]: time="2025-03-20T22:20:03.054081573Z" level=info msg="StopPodSandbox for \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\" returns successfully"
Mar 20 22:20:03.056410 containerd[1475]: time="2025-03-20T22:20:03.056311867Z" level=info msg="RemovePodSandbox for \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\""
Mar 20 22:20:03.056707 containerd[1475]: time="2025-03-20T22:20:03.056451560Z" level=info msg="Forcibly stopping sandbox \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\""
Mar 20 22:20:03.056937 containerd[1475]: time="2025-03-20T22:20:03.056801286Z" level=info msg="TearDown network for sandbox \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\" successfully"
Mar 20 22:20:03.061827 containerd[1475]: time="2025-03-20T22:20:03.061764679Z" level=info msg="Ensure that sandbox 86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080 in task-service has been cleanup successfully"
Mar 20 22:20:03.069823 containerd[1475]: time="2025-03-20T22:20:03.069546651Z" level=info msg="RemovePodSandbox \"86f4fef6d299efe5a5308f7d4ecaac53f0d398195874019573088a121e1cc080\" returns successfully"
Mar 20 22:20:03.074126 containerd[1475]: time="2025-03-20T22:20:03.073960994Z" level=info msg="StopPodSandbox for \"870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1\""
Mar 20 22:20:03.074642 containerd[1475]: time="2025-03-20T22:20:03.074230039Z" level=info msg="TearDown network for sandbox \"870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1\" successfully"
Mar 20 22:20:03.074642 containerd[1475]: time="2025-03-20T22:20:03.074264012Z" level=info msg="StopPodSandbox for \"870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1\" returns successfully"
Mar 20 22:20:03.082284 containerd[1475]: time="2025-03-20T22:20:03.082214771Z" level=info msg="RemovePodSandbox for \"870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1\""
Mar 20 22:20:03.083255 containerd[1475]: time="2025-03-20T22:20:03.082290343Z" level=info msg="Forcibly stopping sandbox \"870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1\""
Mar 20 22:20:03.083255 containerd[1475]: time="2025-03-20T22:20:03.082601126Z" level=info msg="TearDown network for sandbox \"870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1\" successfully"
Mar 20 22:20:03.089222 containerd[1475]: time="2025-03-20T22:20:03.089064524Z" level=info msg="Ensure that sandbox 870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1 in task-service has been cleanup successfully"
Mar 20 22:20:03.096793 containerd[1475]: time="2025-03-20T22:20:03.096736299Z" level=info msg="RemovePodSandbox \"870bf509db5dd8d4729b3f35ae97e19dc7b59036c1ce4807538f86cdbe9a10d1\" returns successfully"
Mar 20 22:20:03.097422 containerd[1475]: time="2025-03-20T22:20:03.097377892Z" level=info msg="StopPodSandbox for \"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596\""
Mar 20 22:20:03.203412 containerd[1475]: 2025-03-20 22:20:03.152 [WARNING][6021] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0"
Mar 20 22:20:03.203412 containerd[1475]: 2025-03-20 22:20:03.152 [INFO][6021] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596"
Mar 20 22:20:03.203412 containerd[1475]: 2025-03-20 22:20:03.152 [INFO][6021] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" iface="eth0" netns=""
Mar 20 22:20:03.203412 containerd[1475]: 2025-03-20 22:20:03.152 [INFO][6021] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596"
Mar 20 22:20:03.203412 containerd[1475]: 2025-03-20 22:20:03.152 [INFO][6021] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596"
Mar 20 22:20:03.203412 containerd[1475]: 2025-03-20 22:20:03.187 [INFO][6028] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" HandleID="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0"
Mar 20 22:20:03.203412 containerd[1475]: 2025-03-20 22:20:03.187 [INFO][6028] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 20 22:20:03.203412 containerd[1475]: 2025-03-20 22:20:03.187 [INFO][6028] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 20 22:20:03.203412 containerd[1475]: 2025-03-20 22:20:03.196 [WARNING][6028] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" HandleID="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0"
Mar 20 22:20:03.203412 containerd[1475]: 2025-03-20 22:20:03.197 [INFO][6028] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" HandleID="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0"
Mar 20 22:20:03.203412 containerd[1475]: 2025-03-20 22:20:03.200 [INFO][6028] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 20 22:20:03.203412 containerd[1475]: 2025-03-20 22:20:03.202 [INFO][6021] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596"
Mar 20 22:20:03.203891 containerd[1475]: time="2025-03-20T22:20:03.203458198Z" level=info msg="TearDown network for sandbox \"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596\" successfully"
Mar 20 22:20:03.203891 containerd[1475]: time="2025-03-20T22:20:03.203510806Z" level=info msg="StopPodSandbox for \"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596\" returns successfully"
Mar 20 22:20:03.204714 containerd[1475]: time="2025-03-20T22:20:03.204259642Z" level=info msg="RemovePodSandbox for \"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596\""
Mar 20 22:20:03.204714 containerd[1475]: time="2025-03-20T22:20:03.204295870Z" level=info msg="Forcibly stopping sandbox \"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596\""
Mar 20 22:20:03.307685 containerd[1475]: 2025-03-20 22:20:03.264 [WARNING][6046] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" WorkloadEndpoint="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0"
Mar 20 22:20:03.307685 containerd[1475]: 2025-03-20 22:20:03.264 [INFO][6046] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596"
Mar 20 22:20:03.307685 containerd[1475]: 2025-03-20 22:20:03.264 [INFO][6046] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" iface="eth0" netns=""
Mar 20 22:20:03.307685 containerd[1475]: 2025-03-20 22:20:03.264 [INFO][6046] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596"
Mar 20 22:20:03.307685 containerd[1475]: 2025-03-20 22:20:03.264 [INFO][6046] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596"
Mar 20 22:20:03.307685 containerd[1475]: 2025-03-20 22:20:03.292 [INFO][6054] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" HandleID="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0"
Mar 20 22:20:03.307685 containerd[1475]: 2025-03-20 22:20:03.292 [INFO][6054] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 20 22:20:03.307685 containerd[1475]: 2025-03-20 22:20:03.294 [INFO][6054] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 20 22:20:03.307685 containerd[1475]: 2025-03-20 22:20:03.302 [WARNING][6054] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" HandleID="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0"
Mar 20 22:20:03.307685 containerd[1475]: 2025-03-20 22:20:03.302 [INFO][6054] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" HandleID="k8s-pod-network.e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596" Workload="ci--9999--0--2--b--c50fddf147.novalocal-k8s-calico--kube--controllers--59b6b788d9--v7kpt-eth0"
Mar 20 22:20:03.307685 containerd[1475]: 2025-03-20 22:20:03.303 [INFO][6054] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 20 22:20:03.307685 containerd[1475]: 2025-03-20 22:20:03.304 [INFO][6046] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596"
Mar 20 22:20:03.308098 containerd[1475]: time="2025-03-20T22:20:03.307704301Z" level=info msg="TearDown network for sandbox \"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596\" successfully"
Mar 20 22:20:03.310684 containerd[1475]: time="2025-03-20T22:20:03.310636964Z" level=info msg="Ensure that sandbox e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596 in task-service has been cleanup successfully"
Mar 20 22:20:03.317036 containerd[1475]: time="2025-03-20T22:20:03.316782155Z" level=info msg="RemovePodSandbox \"e09a258af8190a8bc5206b05edc536ad470a75e14be66b176dae02f3c1eff596\" returns successfully"
Mar 20 22:20:04.992727 systemd[1]: Started sshd@12-172.24.4.53:22-172.24.4.1:34236.service - OpenSSH per-connection server daemon (172.24.4.1:34236).
Mar 20 22:20:06.018793 sshd[6062]: Accepted publickey for core from 172.24.4.1 port 34236 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:20:06.022454 sshd-session[6062]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:20:06.034288 systemd-logind[1460]: New session 15 of user core.
Mar 20 22:20:06.040907 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 20 22:20:06.627660 sshd[6064]: Connection closed by 172.24.4.1 port 34236
Mar 20 22:20:06.630241 sshd-session[6062]: pam_unix(sshd:session): session closed for user core
Mar 20 22:20:06.646394 systemd[1]: sshd@12-172.24.4.53:22-172.24.4.1:34236.service: Deactivated successfully.
Mar 20 22:20:06.652171 systemd[1]: session-15.scope: Deactivated successfully.
Mar 20 22:20:06.655150 systemd-logind[1460]: Session 15 logged out. Waiting for processes to exit.
Mar 20 22:20:06.662617 systemd[1]: Started sshd@13-172.24.4.53:22-172.24.4.1:34238.service - OpenSSH per-connection server daemon (172.24.4.1:34238).
Mar 20 22:20:06.666440 systemd-logind[1460]: Removed session 15.
Mar 20 22:20:08.265407 sshd[6076]: Accepted publickey for core from 172.24.4.1 port 34238 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:20:08.267853 sshd-session[6076]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:20:08.281688 systemd-logind[1460]: New session 16 of user core.
Mar 20 22:20:08.286788 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 20 22:20:09.207365 sshd[6079]: Connection closed by 172.24.4.1 port 34238
Mar 20 22:20:09.208227 sshd-session[6076]: pam_unix(sshd:session): session closed for user core
Mar 20 22:20:09.222323 systemd[1]: sshd@13-172.24.4.53:22-172.24.4.1:34238.service: Deactivated successfully.
Mar 20 22:20:09.227464 systemd[1]: session-16.scope: Deactivated successfully.
Mar 20 22:20:09.229251 systemd-logind[1460]: Session 16 logged out. Waiting for processes to exit.
Mar 20 22:20:09.233917 systemd[1]: Started sshd@14-172.24.4.53:22-172.24.4.1:34240.service - OpenSSH per-connection server daemon (172.24.4.1:34240).
Mar 20 22:20:09.236864 systemd-logind[1460]: Removed session 16.
Mar 20 22:20:10.516056 sshd[6088]: Accepted publickey for core from 172.24.4.1 port 34240 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:20:10.518999 sshd-session[6088]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:20:10.531204 systemd-logind[1460]: New session 17 of user core.
Mar 20 22:20:10.537896 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 20 22:20:11.325145 sshd[6091]: Connection closed by 172.24.4.1 port 34240
Mar 20 22:20:11.325948 sshd-session[6088]: pam_unix(sshd:session): session closed for user core
Mar 20 22:20:11.332265 systemd[1]: sshd@14-172.24.4.53:22-172.24.4.1:34240.service: Deactivated successfully.
Mar 20 22:20:11.337094 systemd[1]: session-17.scope: Deactivated successfully.
Mar 20 22:20:11.342225 systemd-logind[1460]: Session 17 logged out. Waiting for processes to exit.
Mar 20 22:20:11.344831 systemd-logind[1460]: Removed session 17.
Mar 20 22:20:16.348069 systemd[1]: Started sshd@15-172.24.4.53:22-172.24.4.1:47826.service - OpenSSH per-connection server daemon (172.24.4.1:47826).
Mar 20 22:20:17.718220 sshd[6115]: Accepted publickey for core from 172.24.4.1 port 47826 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:20:17.721007 sshd-session[6115]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:20:17.733402 systemd-logind[1460]: New session 18 of user core.
Mar 20 22:20:17.737821 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 20 22:20:18.621210 sshd[6117]: Connection closed by 172.24.4.1 port 47826
Mar 20 22:20:18.622524 sshd-session[6115]: pam_unix(sshd:session): session closed for user core
Mar 20 22:20:18.629443 systemd[1]: sshd@15-172.24.4.53:22-172.24.4.1:47826.service: Deactivated successfully.
Mar 20 22:20:18.634230 systemd[1]: session-18.scope: Deactivated successfully.
Mar 20 22:20:18.638974 systemd-logind[1460]: Session 18 logged out. Waiting for processes to exit.
Mar 20 22:20:18.641584 systemd-logind[1460]: Removed session 18.
Mar 20 22:20:23.642617 systemd[1]: Started sshd@16-172.24.4.53:22-172.24.4.1:34856.service - OpenSSH per-connection server daemon (172.24.4.1:34856).
Mar 20 22:20:25.003764 sshd[6131]: Accepted publickey for core from 172.24.4.1 port 34856 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:20:25.005966 sshd-session[6131]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:20:25.014186 systemd-logind[1460]: New session 19 of user core.
Mar 20 22:20:25.017631 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 20 22:20:25.733541 sshd[6133]: Connection closed by 172.24.4.1 port 34856
Mar 20 22:20:25.733712 sshd-session[6131]: pam_unix(sshd:session): session closed for user core
Mar 20 22:20:25.739694 systemd[1]: sshd@16-172.24.4.53:22-172.24.4.1:34856.service: Deactivated successfully.
Mar 20 22:20:25.742265 systemd[1]: session-19.scope: Deactivated successfully.
Mar 20 22:20:25.746118 systemd-logind[1460]: Session 19 logged out. Waiting for processes to exit.
Mar 20 22:20:25.747834 systemd-logind[1460]: Removed session 19.
Mar 20 22:20:25.888469 containerd[1475]: time="2025-03-20T22:20:25.888420870Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d4e269a654c4359606157890d656b4710437499051f40337f1113f05bd7ab7ed\" id:\"fa5410b1e051c2d8e1207430b9700840069d24ac86f3a7cdb4472244db34dfef\" pid:6158 exited_at:{seconds:1742509225 nanos:888054272}"
Mar 20 22:20:28.625730 containerd[1475]: time="2025-03-20T22:20:28.625678210Z" level=info msg="TaskExit event in podsandbox handler container_id:\"07bb2246e623915d85afe2227666ed87333e5003482655b1caacb3590e5edf1f\" id:\"77307109ceda6bc9526741514b462fc437454825c4b94988fc52770ffad8390e\" pid:6192 exited_at:{seconds:1742509228 nanos:625354182}"
Mar 20 22:20:28.632982 containerd[1475]: time="2025-03-20T22:20:28.632919746Z" level=info msg="TaskExit event in podsandbox handler container_id:\"07bb2246e623915d85afe2227666ed87333e5003482655b1caacb3590e5edf1f\" id:\"ef67fafc8dc817a524c8032462153b32cb11cec57f867c6fe4df2383a1b1b3e8\" pid:6202 exited_at:{seconds:1742509228 nanos:632692980}"
Mar 20 22:20:30.752268 systemd[1]: Started sshd@17-172.24.4.53:22-172.24.4.1:34862.service - OpenSSH per-connection server daemon (172.24.4.1:34862).
Mar 20 22:20:32.105385 sshd[6215]: Accepted publickey for core from 172.24.4.1 port 34862 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:20:32.108896 sshd-session[6215]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:20:32.121766 systemd-logind[1460]: New session 20 of user core.
Mar 20 22:20:32.128783 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 20 22:20:32.848368 sshd[6217]: Connection closed by 172.24.4.1 port 34862
Mar 20 22:20:32.847880 sshd-session[6215]: pam_unix(sshd:session): session closed for user core
Mar 20 22:20:32.859657 systemd[1]: sshd@17-172.24.4.53:22-172.24.4.1:34862.service: Deactivated successfully.
Mar 20 22:20:32.863345 systemd[1]: session-20.scope: Deactivated successfully.
Mar 20 22:20:32.865438 systemd-logind[1460]: Session 20 logged out. Waiting for processes to exit.
Mar 20 22:20:32.866713 systemd[1]: Started sshd@18-172.24.4.53:22-172.24.4.1:34874.service - OpenSSH per-connection server daemon (172.24.4.1:34874).
Mar 20 22:20:32.871602 systemd-logind[1460]: Removed session 20.
Mar 20 22:20:34.201958 sshd[6228]: Accepted publickey for core from 172.24.4.1 port 34874 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:20:34.213711 sshd-session[6228]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:20:34.228623 systemd-logind[1460]: New session 21 of user core.
Mar 20 22:20:34.233832 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 20 22:20:35.307741 sshd[6231]: Connection closed by 172.24.4.1 port 34874
Mar 20 22:20:35.309101 sshd-session[6228]: pam_unix(sshd:session): session closed for user core
Mar 20 22:20:35.323174 systemd[1]: sshd@18-172.24.4.53:22-172.24.4.1:34874.service: Deactivated successfully.
Mar 20 22:20:35.327182 systemd[1]: session-21.scope: Deactivated successfully.
Mar 20 22:20:35.329249 systemd-logind[1460]: Session 21 logged out. Waiting for processes to exit.
Mar 20 22:20:35.331674 systemd-logind[1460]: Removed session 21.
Mar 20 22:20:35.334653 systemd[1]: Started sshd@19-172.24.4.53:22-172.24.4.1:45810.service - OpenSSH per-connection server daemon (172.24.4.1:45810).
Mar 20 22:20:36.504207 sshd[6240]: Accepted publickey for core from 172.24.4.1 port 45810 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:20:36.507236 sshd-session[6240]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:20:36.519289 systemd-logind[1460]: New session 22 of user core.
Mar 20 22:20:36.531778 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 20 22:20:39.663431 sshd[6243]: Connection closed by 172.24.4.1 port 45810
Mar 20 22:20:39.664874 sshd-session[6240]: pam_unix(sshd:session): session closed for user core
Mar 20 22:20:39.679253 systemd[1]: sshd@19-172.24.4.53:22-172.24.4.1:45810.service: Deactivated successfully.
Mar 20 22:20:39.683730 systemd[1]: session-22.scope: Deactivated successfully.
Mar 20 22:20:39.684390 systemd[1]: session-22.scope: Consumed 912ms CPU time, 71.4M memory peak.
Mar 20 22:20:39.686701 systemd-logind[1460]: Session 22 logged out. Waiting for processes to exit.
Mar 20 22:20:39.693226 systemd[1]: Started sshd@20-172.24.4.53:22-172.24.4.1:45826.service - OpenSSH per-connection server daemon (172.24.4.1:45826).
Mar 20 22:20:39.696820 systemd-logind[1460]: Removed session 22.
Mar 20 22:20:40.754191 sshd[6259]: Accepted publickey for core from 172.24.4.1 port 45826 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:20:40.756597 sshd-session[6259]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:20:40.769612 systemd-logind[1460]: New session 23 of user core.
Mar 20 22:20:40.778795 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 20 22:20:41.696799 sshd[6262]: Connection closed by 172.24.4.1 port 45826
Mar 20 22:20:41.699579 sshd-session[6259]: pam_unix(sshd:session): session closed for user core
Mar 20 22:20:41.713760 systemd[1]: sshd@20-172.24.4.53:22-172.24.4.1:45826.service: Deactivated successfully.
Mar 20 22:20:41.717208 systemd[1]: session-23.scope: Deactivated successfully.
Mar 20 22:20:41.721852 systemd-logind[1460]: Session 23 logged out. Waiting for processes to exit.
Mar 20 22:20:41.725187 systemd[1]: Started sshd@21-172.24.4.53:22-172.24.4.1:45838.service - OpenSSH per-connection server daemon (172.24.4.1:45838).
Mar 20 22:20:41.729464 systemd-logind[1460]: Removed session 23.
Mar 20 22:20:43.104553 sshd[6270]: Accepted publickey for core from 172.24.4.1 port 45838 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:20:43.107375 sshd-session[6270]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:20:43.119039 systemd-logind[1460]: New session 24 of user core.
Mar 20 22:20:43.129844 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 20 22:20:43.859701 sshd[6273]: Connection closed by 172.24.4.1 port 45838
Mar 20 22:20:43.860240 sshd-session[6270]: pam_unix(sshd:session): session closed for user core
Mar 20 22:20:43.866472 systemd[1]: sshd@21-172.24.4.53:22-172.24.4.1:45838.service: Deactivated successfully.
Mar 20 22:20:43.871295 systemd[1]: session-24.scope: Deactivated successfully.
Mar 20 22:20:43.875020 systemd-logind[1460]: Session 24 logged out. Waiting for processes to exit.
Mar 20 22:20:43.878379 systemd-logind[1460]: Removed session 24.
Mar 20 22:20:48.882993 systemd[1]: Started sshd@22-172.24.4.53:22-172.24.4.1:41758.service - OpenSSH per-connection server daemon (172.24.4.1:41758).
Mar 20 22:20:50.203557 sshd[6290]: Accepted publickey for core from 172.24.4.1 port 41758 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:20:50.205714 sshd-session[6290]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:20:50.216609 systemd-logind[1460]: New session 25 of user core.
Mar 20 22:20:50.224788 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 20 22:20:51.040675 sshd[6292]: Connection closed by 172.24.4.1 port 41758
Mar 20 22:20:51.041859 sshd-session[6290]: pam_unix(sshd:session): session closed for user core
Mar 20 22:20:51.049663 systemd[1]: sshd@22-172.24.4.53:22-172.24.4.1:41758.service: Deactivated successfully.
Mar 20 22:20:51.055968 systemd[1]: session-25.scope: Deactivated successfully.
Mar 20 22:20:51.059415 systemd-logind[1460]: Session 25 logged out. Waiting for processes to exit.
Mar 20 22:20:51.064008 systemd-logind[1460]: Removed session 25.
Mar 20 22:20:55.853510 containerd[1475]: time="2025-03-20T22:20:55.853036158Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d4e269a654c4359606157890d656b4710437499051f40337f1113f05bd7ab7ed\" id:\"eae779062cd1462c2317e62a821c4b33826e561308fa89f805076a6e5b30504e\" pid:6322 exited_at:{seconds:1742509255 nanos:852624206}"
Mar 20 22:20:56.057240 systemd[1]: Started sshd@23-172.24.4.53:22-172.24.4.1:47340.service - OpenSSH per-connection server daemon (172.24.4.1:47340).
Mar 20 22:20:57.170493 sshd[6335]: Accepted publickey for core from 172.24.4.1 port 47340 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:20:57.171819 sshd-session[6335]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:20:57.177519 systemd-logind[1460]: New session 26 of user core.
Mar 20 22:20:57.182527 systemd[1]: Started session-26.scope - Session 26 of User core.
Mar 20 22:20:57.812795 sshd[6337]: Connection closed by 172.24.4.1 port 47340
Mar 20 22:20:57.813399 sshd-session[6335]: pam_unix(sshd:session): session closed for user core
Mar 20 22:20:57.818461 systemd[1]: sshd@23-172.24.4.53:22-172.24.4.1:47340.service: Deactivated successfully.
Mar 20 22:20:57.821945 systemd[1]: session-26.scope: Deactivated successfully.
Mar 20 22:20:57.827404 systemd-logind[1460]: Session 26 logged out. Waiting for processes to exit.
Mar 20 22:20:57.828402 systemd-logind[1460]: Removed session 26.
Mar 20 22:20:58.621315 containerd[1475]: time="2025-03-20T22:20:58.621165948Z" level=info msg="TaskExit event in podsandbox handler container_id:\"07bb2246e623915d85afe2227666ed87333e5003482655b1caacb3590e5edf1f\" id:\"0ca934f3c92a5e8f9db15b639f87df8e4c37992a0131edc2581c9b3b1d1d6ab8\" pid:6362 exited_at:{seconds:1742509258 nanos:619921863}"
Mar 20 22:21:02.837985 systemd[1]: Started sshd@24-172.24.4.53:22-172.24.4.1:47354.service - OpenSSH per-connection server daemon (172.24.4.1:47354).
Mar 20 22:21:04.035309 sshd[6372]: Accepted publickey for core from 172.24.4.1 port 47354 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:21:04.039130 sshd-session[6372]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:21:04.056913 systemd-logind[1460]: New session 27 of user core.
Mar 20 22:21:04.062799 systemd[1]: Started session-27.scope - Session 27 of User core.
Mar 20 22:21:04.629147 sshd[6376]: Connection closed by 172.24.4.1 port 47354
Mar 20 22:21:04.628931 sshd-session[6372]: pam_unix(sshd:session): session closed for user core
Mar 20 22:21:04.636021 systemd[1]: sshd@24-172.24.4.53:22-172.24.4.1:47354.service: Deactivated successfully.
Mar 20 22:21:04.640538 systemd[1]: session-27.scope: Deactivated successfully.
Mar 20 22:21:04.644287 systemd-logind[1460]: Session 27 logged out. Waiting for processes to exit.
Mar 20 22:21:04.646983 systemd-logind[1460]: Removed session 27.
Mar 20 22:21:09.648241 systemd[1]: Started sshd@25-172.24.4.53:22-172.24.4.1:60680.service - OpenSSH per-connection server daemon (172.24.4.1:60680).
Mar 20 22:21:10.921848 sshd[6404]: Accepted publickey for core from 172.24.4.1 port 60680 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:21:10.926269 sshd-session[6404]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:21:10.946115 systemd-logind[1460]: New session 28 of user core.
Mar 20 22:21:10.952312 systemd[1]: Started session-28.scope - Session 28 of User core.
Mar 20 22:21:11.623819 sshd[6406]: Connection closed by 172.24.4.1 port 60680
Mar 20 22:21:11.624991 sshd-session[6404]: pam_unix(sshd:session): session closed for user core
Mar 20 22:21:11.632705 systemd[1]: sshd@25-172.24.4.53:22-172.24.4.1:60680.service: Deactivated successfully.
Mar 20 22:21:11.636989 systemd[1]: session-28.scope: Deactivated successfully.
Mar 20 22:21:11.640254 systemd-logind[1460]: Session 28 logged out. Waiting for processes to exit.
Mar 20 22:21:11.642892 systemd-logind[1460]: Removed session 28.
Mar 20 22:21:16.645997 systemd[1]: Started sshd@26-172.24.4.53:22-172.24.4.1:34942.service - OpenSSH per-connection server daemon (172.24.4.1:34942).
Mar 20 22:21:18.026255 sshd[6418]: Accepted publickey for core from 172.24.4.1 port 34942 ssh2: RSA SHA256:v9M+sX31ENVxGVhQn7Li6Q7WTwfafWhY8vipY5BeRTI
Mar 20 22:21:18.029440 sshd-session[6418]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:21:18.042598 systemd-logind[1460]: New session 29 of user core.
Mar 20 22:21:18.051854 systemd[1]: Started session-29.scope - Session 29 of User core.
Mar 20 22:21:18.629404 sshd[6420]: Connection closed by 172.24.4.1 port 34942
Mar 20 22:21:18.629696 sshd-session[6418]: pam_unix(sshd:session): session closed for user core
Mar 20 22:21:18.636267 systemd[1]: sshd@26-172.24.4.53:22-172.24.4.1:34942.service: Deactivated successfully.
Mar 20 22:21:18.639559 systemd[1]: session-29.scope: Deactivated successfully.
Mar 20 22:21:18.640695 systemd-logind[1460]: Session 29 logged out. Waiting for processes to exit.
Mar 20 22:21:18.641847 systemd-logind[1460]: Removed session 29.