Mar 21 14:08:54.032085 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Mar 21 10:52:59 -00 2025
Mar 21 14:08:54.032120 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=fb715041d083099c6a15c8aee7cc93fc3f3ca8764fc0aaaff245a06641d663d2
Mar 21 14:08:54.032133 kernel: BIOS-provided physical RAM map:
Mar 21 14:08:54.032143 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Mar 21 14:08:54.032152 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Mar 21 14:08:54.032163 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Mar 21 14:08:54.032175 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdcfff] usable
Mar 21 14:08:54.032187 kernel: BIOS-e820: [mem 0x00000000bffdd000-0x00000000bfffffff] reserved
Mar 21 14:08:54.032199 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 21 14:08:54.032212 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Mar 21 14:08:54.032224 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000013fffffff] usable
Mar 21 14:08:54.032236 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 21 14:08:54.032247 kernel: NX (Execute Disable) protection: active
Mar 21 14:08:54.032261 kernel: APIC: Static calls initialized
Mar 21 14:08:54.032282 kernel: SMBIOS 3.0.0 present.
Mar 21 14:08:54.032291 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.16.3-debian-1.16.3-2 04/01/2014
Mar 21 14:08:54.032300 kernel: Hypervisor detected: KVM
Mar 21 14:08:54.032309 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 21 14:08:54.032318 kernel: kvm-clock: using sched offset of 3690074439 cycles
Mar 21 14:08:54.032327 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 21 14:08:54.032339 kernel: tsc: Detected 1996.249 MHz processor
Mar 21 14:08:54.032348 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 21 14:08:54.032357 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 21 14:08:54.032367 kernel: last_pfn = 0x140000 max_arch_pfn = 0x400000000
Mar 21 14:08:54.032376 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Mar 21 14:08:54.032385 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 21 14:08:54.032394 kernel: last_pfn = 0xbffdd max_arch_pfn = 0x400000000
Mar 21 14:08:54.032403 kernel: ACPI: Early table checksum verification disabled
Mar 21 14:08:54.032414 kernel: ACPI: RSDP 0x00000000000F51E0 000014 (v00 BOCHS )
Mar 21 14:08:54.032423 kernel: ACPI: RSDT 0x00000000BFFE1B65 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 21 14:08:54.032432 kernel: ACPI: FACP 0x00000000BFFE1A49 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 21 14:08:54.032441 kernel: ACPI: DSDT 0x00000000BFFE0040 001A09 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 21 14:08:54.032450 kernel: ACPI: FACS 0x00000000BFFE0000 000040
Mar 21 14:08:54.032459 kernel: ACPI: APIC 0x00000000BFFE1ABD 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 21 14:08:54.032468 kernel: ACPI: WAET 0x00000000BFFE1B3D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 21 14:08:54.032477 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1a49-0xbffe1abc]
Mar 21 14:08:54.032486 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffe0040-0xbffe1a48]
Mar 21 14:08:54.032496 kernel: ACPI: Reserving FACS table memory at [mem 0xbffe0000-0xbffe003f]
Mar 21 14:08:54.032505 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe1abd-0xbffe1b3c]
Mar 21 14:08:54.032515 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1b3d-0xbffe1b64]
Mar 21 14:08:54.032527 kernel: No NUMA configuration found
Mar 21 14:08:54.032536 kernel: Faking a node at [mem 0x0000000000000000-0x000000013fffffff]
Mar 21 14:08:54.032545 kernel: NODE_DATA(0) allocated [mem 0x13fff7000-0x13fffcfff]
Mar 21 14:08:54.032555 kernel: Zone ranges:
Mar 21 14:08:54.032566 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 21 14:08:54.032575 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Mar 21 14:08:54.032584 kernel: Normal [mem 0x0000000100000000-0x000000013fffffff]
Mar 21 14:08:54.032594 kernel: Movable zone start for each node
Mar 21 14:08:54.032603 kernel: Early memory node ranges
Mar 21 14:08:54.032612 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Mar 21 14:08:54.032622 kernel: node 0: [mem 0x0000000000100000-0x00000000bffdcfff]
Mar 21 14:08:54.032632 kernel: node 0: [mem 0x0000000100000000-0x000000013fffffff]
Mar 21 14:08:54.032643 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000013fffffff]
Mar 21 14:08:54.032652 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 21 14:08:54.032661 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Mar 21 14:08:54.032671 kernel: On node 0, zone Normal: 35 pages in unavailable ranges
Mar 21 14:08:54.032680 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 21 14:08:54.032689 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 21 14:08:54.032699 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 21 14:08:54.032708 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 21 14:08:54.032718 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 21 14:08:54.032729 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 21 14:08:54.032738 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 21 14:08:54.032748 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 21 14:08:54.032757 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 21 14:08:54.032766 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Mar 21 14:08:54.032776 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 21 14:08:54.032787 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Mar 21 14:08:54.032796 kernel: Booting paravirtualized kernel on KVM
Mar 21 14:08:54.032805 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 21 14:08:54.032816 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 21 14:08:54.032825 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576
Mar 21 14:08:54.032833 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152
Mar 21 14:08:54.032842 kernel: pcpu-alloc: [0] 0 1
Mar 21 14:08:54.032850 kernel: kvm-guest: PV spinlocks disabled, no host support
Mar 21 14:08:54.032860 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=fb715041d083099c6a15c8aee7cc93fc3f3ca8764fc0aaaff245a06641d663d2
Mar 21 14:08:54.032870 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 21 14:08:54.032879 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 21 14:08:54.032889 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 21 14:08:54.032898 kernel: Fallback order for Node 0: 0
Mar 21 14:08:54.032907 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1031901
Mar 21 14:08:54.032915 kernel: Policy zone: Normal
Mar 21 14:08:54.032924 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 21 14:08:54.032932 kernel: software IO TLB: area num 2.
Mar 21 14:08:54.032941 kernel: Memory: 3962108K/4193772K available (14336K kernel code, 2304K rwdata, 25060K rodata, 43588K init, 1476K bss, 231404K reserved, 0K cma-reserved)
Mar 21 14:08:54.032950 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 21 14:08:54.032959 kernel: ftrace: allocating 37985 entries in 149 pages
Mar 21 14:08:54.032969 kernel: ftrace: allocated 149 pages with 4 groups
Mar 21 14:08:54.032978 kernel: Dynamic Preempt: voluntary
Mar 21 14:08:54.032987 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 21 14:08:54.032996 kernel: rcu: RCU event tracing is enabled.
Mar 21 14:08:54.033005 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 21 14:08:54.033014 kernel: Trampoline variant of Tasks RCU enabled.
Mar 21 14:08:54.033023 kernel: Rude variant of Tasks RCU enabled.
Mar 21 14:08:54.033032 kernel: Tracing variant of Tasks RCU enabled.
Mar 21 14:08:54.033041 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 21 14:08:54.033065 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 21 14:08:54.033074 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Mar 21 14:08:54.033083 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 21 14:08:54.033091 kernel: Console: colour VGA+ 80x25
Mar 21 14:08:54.033100 kernel: printk: console [tty0] enabled
Mar 21 14:08:54.033109 kernel: printk: console [ttyS0] enabled
Mar 21 14:08:54.033117 kernel: ACPI: Core revision 20230628
Mar 21 14:08:54.033126 kernel: APIC: Switch to symmetric I/O mode setup
Mar 21 14:08:54.033135 kernel: x2apic enabled
Mar 21 14:08:54.033146 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 21 14:08:54.033155 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 21 14:08:54.033164 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Mar 21 14:08:54.033173 kernel: Calibrating delay loop (skipped) preset value.. 3992.49 BogoMIPS (lpj=1996249)
Mar 21 14:08:54.033182 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Mar 21 14:08:54.033190 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Mar 21 14:08:54.033199 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 21 14:08:54.033208 kernel: Spectre V2 : Mitigation: Retpolines
Mar 21 14:08:54.033217 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Mar 21 14:08:54.033227 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Mar 21 14:08:54.033236 kernel: Speculative Store Bypass: Vulnerable
Mar 21 14:08:54.033245 kernel: x86/fpu: x87 FPU will use FXSAVE
Mar 21 14:08:54.033254 kernel: Freeing SMP alternatives memory: 32K
Mar 21 14:08:54.033268 kernel: pid_max: default: 32768 minimum: 301
Mar 21 14:08:54.033279 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 21 14:08:54.033288 kernel: landlock: Up and running.
Mar 21 14:08:54.033297 kernel: SELinux: Initializing.
Mar 21 14:08:54.033306 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 21 14:08:54.033315 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 21 14:08:54.033325 kernel: smpboot: CPU0: AMD Intel Core i7 9xx (Nehalem Class Core i7) (family: 0x6, model: 0x1a, stepping: 0x3)
Mar 21 14:08:54.033334 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 21 14:08:54.033345 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 21 14:08:54.033354 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 21 14:08:54.033363 kernel: Performance Events: AMD PMU driver.
Mar 21 14:08:54.033372 kernel: ... version: 0
Mar 21 14:08:54.033381 kernel: ... bit width: 48
Mar 21 14:08:54.033392 kernel: ... generic registers: 4
Mar 21 14:08:54.033401 kernel: ... value mask: 0000ffffffffffff
Mar 21 14:08:54.033410 kernel: ... max period: 00007fffffffffff
Mar 21 14:08:54.033419 kernel: ... fixed-purpose events: 0
Mar 21 14:08:54.033428 kernel: ... event mask: 000000000000000f
Mar 21 14:08:54.033438 kernel: signal: max sigframe size: 1440
Mar 21 14:08:54.033447 kernel: rcu: Hierarchical SRCU implementation.
Mar 21 14:08:54.033456 kernel: rcu: Max phase no-delay instances is 400.
Mar 21 14:08:54.033465 kernel: smp: Bringing up secondary CPUs ...
Mar 21 14:08:54.033476 kernel: smpboot: x86: Booting SMP configuration:
Mar 21 14:08:54.033485 kernel: .... node #0, CPUs: #1
Mar 21 14:08:54.033494 kernel: smp: Brought up 1 node, 2 CPUs
Mar 21 14:08:54.033503 kernel: smpboot: Max logical packages: 2
Mar 21 14:08:54.033512 kernel: smpboot: Total of 2 processors activated (7984.99 BogoMIPS)
Mar 21 14:08:54.033521 kernel: devtmpfs: initialized
Mar 21 14:08:54.033530 kernel: x86/mm: Memory block size: 128MB
Mar 21 14:08:54.033540 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 21 14:08:54.033549 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 21 14:08:54.033560 kernel: pinctrl core: initialized pinctrl subsystem
Mar 21 14:08:54.033569 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 21 14:08:54.033578 kernel: audit: initializing netlink subsys (disabled)
Mar 21 14:08:54.033587 kernel: audit: type=2000 audit(1742566133.119:1): state=initialized audit_enabled=0 res=1
Mar 21 14:08:54.033596 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 21 14:08:54.033605 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 21 14:08:54.033614 kernel: cpuidle: using governor menu
Mar 21 14:08:54.033623 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 21 14:08:54.033632 kernel: dca service started, version 1.12.1
Mar 21 14:08:54.033643 kernel: PCI: Using configuration type 1 for base access
Mar 21 14:08:54.033653 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 21 14:08:54.033662 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 21 14:08:54.033671 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 21 14:08:54.033680 kernel: ACPI: Added _OSI(Module Device)
Mar 21 14:08:54.033689 kernel: ACPI: Added _OSI(Processor Device)
Mar 21 14:08:54.033698 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 21 14:08:54.033707 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 21 14:08:54.033717 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 21 14:08:54.033727 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 21 14:08:54.033736 kernel: ACPI: Interpreter enabled
Mar 21 14:08:54.033745 kernel: ACPI: PM: (supports S0 S3 S5)
Mar 21 14:08:54.033754 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 21 14:08:54.033763 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 21 14:08:54.033773 kernel: PCI: Using E820 reservations for host bridge windows
Mar 21 14:08:54.033782 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Mar 21 14:08:54.033791 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 21 14:08:54.033931 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
Mar 21 14:08:54.034033 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
Mar 21 14:08:54.034147 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
Mar 21 14:08:54.034162 kernel: acpiphp: Slot [3] registered
Mar 21 14:08:54.034172 kernel: acpiphp: Slot [4] registered
Mar 21 14:08:54.034181 kernel: acpiphp: Slot [5] registered
Mar 21 14:08:54.034190 kernel: acpiphp: Slot [6] registered
Mar 21 14:08:54.034199 kernel: acpiphp: Slot [7] registered
Mar 21 14:08:54.034212 kernel: acpiphp: Slot [8] registered
Mar 21 14:08:54.034221 kernel: acpiphp: Slot [9] registered
Mar 21 14:08:54.034230 kernel: acpiphp: Slot [10] registered
Mar 21 14:08:54.034239 kernel: acpiphp: Slot [11] registered
Mar 21 14:08:54.034248 kernel: acpiphp: Slot [12] registered
Mar 21 14:08:54.034257 kernel: acpiphp: Slot [13] registered
Mar 21 14:08:54.034266 kernel: acpiphp: Slot [14] registered
Mar 21 14:08:54.034275 kernel: acpiphp: Slot [15] registered
Mar 21 14:08:54.034284 kernel: acpiphp: Slot [16] registered
Mar 21 14:08:54.034293 kernel: acpiphp: Slot [17] registered
Mar 21 14:08:54.034304 kernel: acpiphp: Slot [18] registered
Mar 21 14:08:54.034313 kernel: acpiphp: Slot [19] registered
Mar 21 14:08:54.034322 kernel: acpiphp: Slot [20] registered
Mar 21 14:08:54.034331 kernel: acpiphp: Slot [21] registered
Mar 21 14:08:54.034340 kernel: acpiphp: Slot [22] registered
Mar 21 14:08:54.034349 kernel: acpiphp: Slot [23] registered
Mar 21 14:08:54.034358 kernel: acpiphp: Slot [24] registered
Mar 21 14:08:54.034367 kernel: acpiphp: Slot [25] registered
Mar 21 14:08:54.034376 kernel: acpiphp: Slot [26] registered
Mar 21 14:08:54.034386 kernel: acpiphp: Slot [27] registered
Mar 21 14:08:54.034395 kernel: acpiphp: Slot [28] registered
Mar 21 14:08:54.034404 kernel: acpiphp: Slot [29] registered
Mar 21 14:08:54.034413 kernel: acpiphp: Slot [30] registered
Mar 21 14:08:54.034422 kernel: acpiphp: Slot [31] registered
Mar 21 14:08:54.034431 kernel: PCI host bridge to bus 0000:00
Mar 21 14:08:54.034527 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 21 14:08:54.034614 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 21 14:08:54.034698 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 21 14:08:54.034786 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Mar 21 14:08:54.034867 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc07fffffff window]
Mar 21 14:08:54.034949 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 21 14:08:54.035078 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Mar 21 14:08:54.035185 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Mar 21 14:08:54.035289 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Mar 21 14:08:54.035389 kernel: pci 0000:00:01.1: reg 0x20: [io 0xc120-0xc12f]
Mar 21 14:08:54.035481 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Mar 21 14:08:54.035574 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Mar 21 14:08:54.035667 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Mar 21 14:08:54.035759 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Mar 21 14:08:54.035866 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Mar 21 14:08:54.035998 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Mar 21 14:08:54.036124 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Mar 21 14:08:54.036236 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Mar 21 14:08:54.036338 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Mar 21 14:08:54.036440 kernel: pci 0000:00:02.0: reg 0x18: [mem 0xc000000000-0xc000003fff 64bit pref]
Mar 21 14:08:54.036541 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Mar 21 14:08:54.036642 kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Mar 21 14:08:54.036838 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 21 14:08:54.036955 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Mar 21 14:08:54.037537 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf]
Mar 21 14:08:54.037651 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Mar 21 14:08:54.037753 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xc000004000-0xc000007fff 64bit pref]
Mar 21 14:08:54.037849 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Mar 21 14:08:54.037957 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Mar 21 14:08:54.038091 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Mar 21 14:08:54.040150 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Mar 21 14:08:54.040262 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xc000008000-0xc00000bfff 64bit pref]
Mar 21 14:08:54.040375 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Mar 21 14:08:54.040479 kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff]
Mar 21 14:08:54.040580 kernel: pci 0000:00:05.0: reg 0x20: [mem 0xc00000c000-0xc00000ffff 64bit pref]
Mar 21 14:08:54.040689 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Mar 21 14:08:54.040798 kernel: pci 0000:00:06.0: reg 0x10: [io 0xc100-0xc11f]
Mar 21 14:08:54.040900 kernel: pci 0000:00:06.0: reg 0x14: [mem 0xfeb93000-0xfeb93fff]
Mar 21 14:08:54.041002 kernel: pci 0000:00:06.0: reg 0x20: [mem 0xc000010000-0xc000013fff 64bit pref]
Mar 21 14:08:54.041017 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 21 14:08:54.041028 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 21 14:08:54.041038 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 21 14:08:54.041052 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 21 14:08:54.041083 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Mar 21 14:08:54.041097 kernel: iommu: Default domain type: Translated
Mar 21 14:08:54.041107 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 21 14:08:54.041117 kernel: PCI: Using ACPI for IRQ routing
Mar 21 14:08:54.041127 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 21 14:08:54.041137 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Mar 21 14:08:54.041147 kernel: e820: reserve RAM buffer [mem 0xbffdd000-0xbfffffff]
Mar 21 14:08:54.041249 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Mar 21 14:08:54.041349 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Mar 21 14:08:54.041448 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 21 14:08:54.041467 kernel: vgaarb: loaded
Mar 21 14:08:54.041477 kernel: clocksource: Switched to clocksource kvm-clock
Mar 21 14:08:54.041487 kernel: VFS: Disk quotas dquot_6.6.0
Mar 21 14:08:54.041497 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 21 14:08:54.041507 kernel: pnp: PnP ACPI init
Mar 21 14:08:54.041611 kernel: pnp 00:03: [dma 2]
Mar 21 14:08:54.041628 kernel: pnp: PnP ACPI: found 5 devices
Mar 21 14:08:54.041638 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 21 14:08:54.041652 kernel: NET: Registered PF_INET protocol family
Mar 21 14:08:54.041662 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 21 14:08:54.041672 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 21 14:08:54.041682 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 21 14:08:54.041692 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 21 14:08:54.041702 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 21 14:08:54.041712 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 21 14:08:54.041722 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 21 14:08:54.041732 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 21 14:08:54.041744 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 21 14:08:54.041754 kernel: NET: Registered PF_XDP protocol family
Mar 21 14:08:54.041844 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 21 14:08:54.043285 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 21 14:08:54.043378 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 21 14:08:54.043462 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Mar 21 14:08:54.043543 kernel: pci_bus 0000:00: resource 8 [mem 0xc000000000-0xc07fffffff window]
Mar 21 14:08:54.043639 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Mar 21 14:08:54.043740 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Mar 21 14:08:54.043755 kernel: PCI: CLS 0 bytes, default 64
Mar 21 14:08:54.043765 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Mar 21 14:08:54.043774 kernel: software IO TLB: mapped [mem 0x00000000b5000000-0x00000000b9000000] (64MB)
Mar 21 14:08:54.043784 kernel: Initialise system trusted keyrings
Mar 21 14:08:54.043793 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 21 14:08:54.043803 kernel: Key type asymmetric registered
Mar 21 14:08:54.043812 kernel: Asymmetric key parser 'x509' registered
Mar 21 14:08:54.043821 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Mar 21 14:08:54.043833 kernel: io scheduler mq-deadline registered
Mar 21 14:08:54.043843 kernel: io scheduler kyber registered
Mar 21 14:08:54.043852 kernel: io scheduler bfq registered
Mar 21 14:08:54.043861 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Mar 21 14:08:54.043871 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Mar 21 14:08:54.043881 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Mar 21 14:08:54.043908 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Mar 21 14:08:54.043922 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Mar 21 14:08:54.043934 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 21 14:08:54.043947 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Mar 21 14:08:54.043956 kernel: random: crng init done
Mar 21 14:08:54.043965 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Mar 21 14:08:54.043975 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Mar 21 14:08:54.043984 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Mar 21 14:08:54.044106 kernel: rtc_cmos 00:04: RTC can wake from S4
Mar 21 14:08:54.044124 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Mar 21 14:08:54.044213 kernel: rtc_cmos 00:04: registered as rtc0
Mar 21 14:08:54.044311 kernel: rtc_cmos 00:04: setting system clock to 2025-03-21T14:08:53 UTC (1742566133)
Mar 21 14:08:54.044401 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Mar 21 14:08:54.044416 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Mar 21 14:08:54.044426 kernel: NET: Registered PF_INET6 protocol family
Mar 21 14:08:54.044436 kernel: Segment Routing with IPv6
Mar 21 14:08:54.044446 kernel: In-situ OAM (IOAM) with IPv6
Mar 21 14:08:54.044456 kernel: NET: Registered PF_PACKET protocol family
Mar 21 14:08:54.044465 kernel: Key type dns_resolver registered
Mar 21 14:08:54.044475 kernel: IPI shorthand broadcast: enabled
Mar 21 14:08:54.044489 kernel: sched_clock: Marking stable (1005007422, 169578703)->(1211968838, -37382713)
Mar 21 14:08:54.044499 kernel: registered taskstats version 1
Mar 21 14:08:54.044508 kernel: Loading compiled-in X.509 certificates
Mar 21 14:08:54.044518 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: d76f2258ffed89096a9428010e5ac0a0babcea9e'
Mar 21 14:08:54.044528 kernel: Key type .fscrypt registered
Mar 21 14:08:54.044538 kernel: Key type fscrypt-provisioning registered
Mar 21 14:08:54.044548 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 21 14:08:54.044558 kernel: ima: Allocated hash algorithm: sha1
Mar 21 14:08:54.044569 kernel: ima: No architecture policies found
Mar 21 14:08:54.044579 kernel: clk: Disabling unused clocks
Mar 21 14:08:54.044589 kernel: Freeing unused kernel image (initmem) memory: 43588K
Mar 21 14:08:54.044599 kernel: Write protecting the kernel read-only data: 40960k
Mar 21 14:08:54.044609 kernel: Freeing unused kernel image (rodata/data gap) memory: 1564K
Mar 21 14:08:54.044619 kernel: Run /init as init process
Mar 21 14:08:54.044628 kernel: with arguments:
Mar 21 14:08:54.044638 kernel: /init
Mar 21 14:08:54.044648 kernel: with environment:
Mar 21 14:08:54.044657 kernel: HOME=/
Mar 21 14:08:54.044669 kernel: TERM=linux
Mar 21 14:08:54.044678 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Mar 21 14:08:54.044690 systemd[1]: Successfully made /usr/ read-only.
Mar 21 14:08:54.044704 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 21 14:08:54.044715 systemd[1]: Detected virtualization kvm.
Mar 21 14:08:54.044726 systemd[1]: Detected architecture x86-64.
Mar 21 14:08:54.044736 systemd[1]: Running in initrd.
Mar 21 14:08:54.044749 systemd[1]: No hostname configured, using default hostname.
Mar 21 14:08:54.044760 systemd[1]: Hostname set to .
Mar 21 14:08:54.044770 systemd[1]: Initializing machine ID from VM UUID.
Mar 21 14:08:54.044781 systemd[1]: Queued start job for default target initrd.target.
Mar 21 14:08:54.044792 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 21 14:08:54.044803 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 21 14:08:54.044822 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 21 14:08:54.044836 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 21 14:08:54.044847 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 21 14:08:54.044858 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 21 14:08:54.044870 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 21 14:08:54.044882 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 21 14:08:54.044895 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 21 14:08:54.044906 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 21 14:08:54.044916 systemd[1]: Reached target paths.target - Path Units.
Mar 21 14:08:54.044927 systemd[1]: Reached target slices.target - Slice Units.
Mar 21 14:08:54.044938 systemd[1]: Reached target swap.target - Swaps.
Mar 21 14:08:54.044949 systemd[1]: Reached target timers.target - Timer Units.
Mar 21 14:08:54.044960 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 21 14:08:54.044971 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 21 14:08:54.044982 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 21 14:08:54.044995 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 21 14:08:54.045006 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 21 14:08:54.045016 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 21 14:08:54.045027 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 21 14:08:54.045038 systemd[1]: Reached target sockets.target - Socket Units.
Mar 21 14:08:54.045064 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 21 14:08:54.045076 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 21 14:08:54.045087 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 21 14:08:54.045100 systemd[1]: Starting systemd-fsck-usr.service...
Mar 21 14:08:54.045111 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 21 14:08:54.045122 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 21 14:08:54.045133 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 21 14:08:54.045166 systemd-journald[185]: Collecting audit messages is disabled.
Mar 21 14:08:54.045196 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 21 14:08:54.045208 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 21 14:08:54.045224 systemd-journald[185]: Journal started
Mar 21 14:08:54.045251 systemd-journald[185]: Runtime Journal (/run/log/journal/0f2be3ad65354ba5b65a86cdb7f3b982) is 8M, max 78.2M, 70.2M free.
Mar 21 14:08:54.042385 systemd-modules-load[187]: Inserted module 'overlay'
Mar 21 14:08:54.051075 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 21 14:08:54.054099 systemd[1]: Finished systemd-fsck-usr.service.
Mar 21 14:08:54.097758 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 21 14:08:54.097787 kernel: Bridge firewalling registered
Mar 21 14:08:54.076092 systemd-modules-load[187]: Inserted module 'br_netfilter'
Mar 21 14:08:54.097476 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 21 14:08:54.099807 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 21 14:08:54.101707 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 21 14:08:54.103186 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 21 14:08:54.105914 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 21 14:08:54.112178 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 21 14:08:54.121641 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 21 14:08:54.126344 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 21 14:08:54.128547 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 21 14:08:54.129206 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 21 14:08:54.139164 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 21 14:08:54.141012 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 21 14:08:54.141747 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 21 14:08:54.148669 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 21 14:08:54.169158 dracut-cmdline[222]: dracut-dracut-053
Mar 21 14:08:54.172124 dracut-cmdline[222]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=fb715041d083099c6a15c8aee7cc93fc3f3ca8764fc0aaaff245a06641d663d2
Mar 21 14:08:54.178009 systemd-resolved[218]: Positive Trust Anchors:
Mar 21 14:08:54.178027 systemd-resolved[218]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 21 14:08:54.178606 systemd-resolved[218]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 21 14:08:54.181499 systemd-resolved[218]: Defaulting to hostname 'linux'.
Mar 21 14:08:54.182347 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 21 14:08:54.183375 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 21 14:08:54.251092 kernel: SCSI subsystem initialized
Mar 21 14:08:54.262078 kernel: Loading iSCSI transport class v2.0-870.
Mar 21 14:08:54.274080 kernel: iscsi: registered transport (tcp)
Mar 21 14:08:54.297317 kernel: iscsi: registered transport (qla4xxx)
Mar 21 14:08:54.297389 kernel: QLogic iSCSI HBA Driver
Mar 21 14:08:54.331362 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 21 14:08:54.333197 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 21 14:08:54.369506 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 21 14:08:54.369553 kernel: device-mapper: uevent: version 1.0.3
Mar 21 14:08:54.371607 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 21 14:08:54.431123 kernel: raid6: sse2x4 gen() 5197 MB/s
Mar 21 14:08:54.449159 kernel: raid6: sse2x2 gen() 6128 MB/s
Mar 21 14:08:54.467434 kernel: raid6: sse2x1 gen() 9832 MB/s
Mar 21 14:08:54.467497 kernel: raid6: using algorithm sse2x1 gen() 9832 MB/s
Mar 21 14:08:54.486641 kernel: raid6: .... xor() 7349 MB/s, rmw enabled
Mar 21 14:08:54.486706 kernel: raid6: using ssse3x2 recovery algorithm
Mar 21 14:08:54.509698 kernel: xor: measuring software checksum speed
Mar 21 14:08:54.509773 kernel: prefetch64-sse : 17240 MB/sec
Mar 21 14:08:54.510242 kernel: generic_sse : 15688 MB/sec
Mar 21 14:08:54.511402 kernel: xor: using function: prefetch64-sse (17240 MB/sec)
Mar 21 14:08:54.685111 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 21 14:08:54.701263 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 21 14:08:54.706906 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 21 14:08:54.735022 systemd-udevd[405]: Using default interface naming scheme 'v255'.
Mar 21 14:08:54.740039 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 21 14:08:54.747197 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 21 14:08:54.774328 dracut-pre-trigger[419]: rd.md=0: removing MD RAID activation
Mar 21 14:08:54.813800 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 21 14:08:54.819847 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 21 14:08:54.879002 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 21 14:08:54.882533 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 21 14:08:54.912371 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 21 14:08:54.916369 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 21 14:08:54.918429 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 21 14:08:54.920350 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 21 14:08:54.923711 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 21 14:08:54.947333 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 21 14:08:54.985074 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues
Mar 21 14:08:55.017619 kernel: virtio_blk virtio2: [vda] 20971520 512-byte logical blocks (10.7 GB/10.0 GiB)
Mar 21 14:08:55.017742 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 21 14:08:55.017757 kernel: GPT:17805311 != 20971519
Mar 21 14:08:55.017769 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 21 14:08:55.017781 kernel: GPT:17805311 != 20971519
Mar 21 14:08:55.017792 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 21 14:08:55.017803 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 21 14:08:55.017815 kernel: libata version 3.00 loaded.
Mar 21 14:08:55.003861 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 21 14:08:55.003931 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 21 14:08:55.004751 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 21 14:08:55.005447 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 21 14:08:55.005496 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 21 14:08:55.014610 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 21 14:08:55.016577 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 21 14:08:55.024982 kernel: ata_piix 0000:00:01.1: version 2.13
Mar 21 14:08:55.043432 kernel: scsi host0: ata_piix
Mar 21 14:08:55.043643 kernel: scsi host1: ata_piix
Mar 21 14:08:55.043969 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc120 irq 14
Mar 21 14:08:55.043984 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc128 irq 15
Mar 21 14:08:55.065073 kernel: BTRFS: device fsid c99b4410-5d95-4377-8189-88a588aa2514 devid 1 transid 38 /dev/vda3 scanned by (udev-worker) (452)
Mar 21 14:08:55.071062 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by (udev-worker) (458)
Mar 21 14:08:55.077905 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Mar 21 14:08:55.104560 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 21 14:08:55.132326 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Mar 21 14:08:55.144369 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 21 14:08:55.153974 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Mar 21 14:08:55.155284 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Mar 21 14:08:55.157609 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 21 14:08:55.159380 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 21 14:08:55.177943 disk-uuid[508]: Primary Header is updated.
Mar 21 14:08:55.177943 disk-uuid[508]: Secondary Entries is updated.
Mar 21 14:08:55.177943 disk-uuid[508]: Secondary Header is updated.
Mar 21 14:08:55.182719 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 21 14:08:55.190190 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 21 14:08:56.207119 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 21 14:08:56.207995 disk-uuid[516]: The operation has completed successfully.
Mar 21 14:08:56.291001 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 21 14:08:56.291202 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 21 14:08:56.340814 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 21 14:08:56.355638 sh[528]: Success
Mar 21 14:08:56.373255 kernel: device-mapper: verity: sha256 using implementation "sha256-ssse3"
Mar 21 14:08:56.428512 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 21 14:08:56.436125 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 21 14:08:56.437683 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 21 14:08:56.464567 kernel: BTRFS info (device dm-0): first mount of filesystem c99b4410-5d95-4377-8189-88a588aa2514
Mar 21 14:08:56.464688 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 21 14:08:56.466618 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 21 14:08:56.469989 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 21 14:08:56.470101 kernel: BTRFS info (device dm-0): using free space tree
Mar 21 14:08:56.486403 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 21 14:08:56.488718 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 21 14:08:56.491719 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 21 14:08:56.498284 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 21 14:08:56.546133 kernel: BTRFS info (device vda6): first mount of filesystem 667b391b-b0e4-4f87-a670-43615a660c46
Mar 21 14:08:56.552288 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 21 14:08:56.552354 kernel: BTRFS info (device vda6): using free space tree
Mar 21 14:08:56.562296 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 21 14:08:56.573126 kernel: BTRFS info (device vda6): last unmount of filesystem 667b391b-b0e4-4f87-a670-43615a660c46
Mar 21 14:08:56.583897 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 21 14:08:56.587694 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 21 14:08:56.672325 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 21 14:08:56.675152 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 21 14:08:56.709172 systemd-networkd[707]: lo: Link UP
Mar 21 14:08:56.709818 systemd-networkd[707]: lo: Gained carrier
Mar 21 14:08:56.711764 systemd-networkd[707]: Enumeration completed
Mar 21 14:08:56.711833 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 21 14:08:56.712400 systemd[1]: Reached target network.target - Network.
Mar 21 14:08:56.713304 systemd-networkd[707]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 21 14:08:56.713307 systemd-networkd[707]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 21 14:08:56.714229 systemd-networkd[707]: eth0: Link UP
Mar 21 14:08:56.714233 systemd-networkd[707]: eth0: Gained carrier
Mar 21 14:08:56.714239 systemd-networkd[707]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 21 14:08:56.729095 systemd-networkd[707]: eth0: DHCPv4 address 172.24.4.61/24, gateway 172.24.4.1 acquired from 172.24.4.1
Mar 21 14:08:56.768585 ignition[633]: Ignition 2.20.0
Mar 21 14:08:56.768602 ignition[633]: Stage: fetch-offline
Mar 21 14:08:56.768657 ignition[633]: no configs at "/usr/lib/ignition/base.d"
Mar 21 14:08:56.770222 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 21 14:08:56.768672 ignition[633]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 21 14:08:56.768785 ignition[633]: parsed url from cmdline: ""
Mar 21 14:08:56.768789 ignition[633]: no config URL provided
Mar 21 14:08:56.768794 ignition[633]: reading system config file "/usr/lib/ignition/user.ign"
Mar 21 14:08:56.768803 ignition[633]: no config at "/usr/lib/ignition/user.ign"
Mar 21 14:08:56.774216 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 21 14:08:56.768807 ignition[633]: failed to fetch config: resource requires networking
Mar 21 14:08:56.768989 ignition[633]: Ignition finished successfully
Mar 21 14:08:56.798817 ignition[717]: Ignition 2.20.0
Mar 21 14:08:56.799390 ignition[717]: Stage: fetch
Mar 21 14:08:56.799628 ignition[717]: no configs at "/usr/lib/ignition/base.d"
Mar 21 14:08:56.799643 ignition[717]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 21 14:08:56.799739 ignition[717]: parsed url from cmdline: ""
Mar 21 14:08:56.799743 ignition[717]: no config URL provided
Mar 21 14:08:56.799751 ignition[717]: reading system config file "/usr/lib/ignition/user.ign"
Mar 21 14:08:56.799760 ignition[717]: no config at "/usr/lib/ignition/user.ign"
Mar 21 14:08:56.799911 ignition[717]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Mar 21 14:08:56.800124 ignition[717]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Mar 21 14:08:56.800163 ignition[717]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Mar 21 14:08:56.949274 ignition[717]: GET result: OK
Mar 21 14:08:56.949498 ignition[717]: parsing config with SHA512: 34cb4ece43c79369673518c31a05926129a60d51c85a55aec07a2b39caf003925bc717d4868629896fc68a8a55780fc41d108cd0f9b23183989073678059b712
Mar 21 14:08:56.960464 unknown[717]: fetched base config from "system"
Mar 21 14:08:56.960493 unknown[717]: fetched base config from "system"
Mar 21 14:08:56.961784 ignition[717]: fetch: fetch complete
Mar 21 14:08:56.960508 unknown[717]: fetched user config from "openstack"
Mar 21 14:08:56.961797 ignition[717]: fetch: fetch passed
Mar 21 14:08:56.965397 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 21 14:08:56.961893 ignition[717]: Ignition finished successfully
Mar 21 14:08:56.971147 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 21 14:08:57.012642 ignition[723]: Ignition 2.20.0 Mar 21 14:08:57.012667 ignition[723]: Stage: kargs Mar 21 14:08:57.013104 ignition[723]: no configs at "/usr/lib/ignition/base.d" Mar 21 14:08:57.013136 ignition[723]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 21 14:08:57.015428 ignition[723]: kargs: kargs passed Mar 21 14:08:57.017922 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 21 14:08:57.015526 ignition[723]: Ignition finished successfully Mar 21 14:08:57.023321 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 21 14:08:57.065441 ignition[730]: Ignition 2.20.0 Mar 21 14:08:57.067094 ignition[730]: Stage: disks Mar 21 14:08:57.067493 ignition[730]: no configs at "/usr/lib/ignition/base.d" Mar 21 14:08:57.067521 ignition[730]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 21 14:08:57.073930 ignition[730]: disks: disks passed Mar 21 14:08:57.074029 ignition[730]: Ignition finished successfully Mar 21 14:08:57.075847 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 21 14:08:57.079000 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 21 14:08:57.080947 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 21 14:08:57.083861 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 21 14:08:57.086813 systemd[1]: Reached target sysinit.target - System Initialization. Mar 21 14:08:57.089339 systemd[1]: Reached target basic.target - Basic System. Mar 21 14:08:57.094004 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 21 14:08:57.141962 systemd-fsck[738]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Mar 21 14:08:57.158229 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 21 14:08:57.162917 systemd[1]: Mounting sysroot.mount - /sysroot... 
Mar 21 14:08:57.312074 kernel: EXT4-fs (vda9): mounted filesystem c540419e-275b-4bd7-8ebd-24b19ec75c0b r/w with ordered data mode. Quota mode: none.
Mar 21 14:08:57.312758 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 21 14:08:57.314218 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 21 14:08:57.317225 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 21 14:08:57.321122 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 21 14:08:57.322311 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Mar 21 14:08:57.323781 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Mar 21 14:08:57.324378 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 21 14:08:57.324406 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 21 14:08:57.337466 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 21 14:08:57.340183 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 21 14:08:57.359069 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (746)
Mar 21 14:08:57.374933 kernel: BTRFS info (device vda6): first mount of filesystem 667b391b-b0e4-4f87-a670-43615a660c46
Mar 21 14:08:57.375036 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 21 14:08:57.378444 kernel: BTRFS info (device vda6): using free space tree
Mar 21 14:08:57.384073 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 21 14:08:57.386340 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 21 14:08:57.500758 initrd-setup-root[776]: cut: /sysroot/etc/passwd: No such file or directory
Mar 21 14:08:57.506684 initrd-setup-root[783]: cut: /sysroot/etc/group: No such file or directory
Mar 21 14:08:57.510911 initrd-setup-root[790]: cut: /sysroot/etc/shadow: No such file or directory
Mar 21 14:08:57.515489 initrd-setup-root[797]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 21 14:08:57.594183 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 21 14:08:57.595920 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 21 14:08:57.599166 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 21 14:08:57.610847 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 21 14:08:57.612618 kernel: BTRFS info (device vda6): last unmount of filesystem 667b391b-b0e4-4f87-a670-43615a660c46
Mar 21 14:08:57.637073 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 21 14:08:57.640366 ignition[864]: INFO : Ignition 2.20.0
Mar 21 14:08:57.640366 ignition[864]: INFO : Stage: mount
Mar 21 14:08:57.641545 ignition[864]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 21 14:08:57.641545 ignition[864]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 21 14:08:57.643524 ignition[864]: INFO : mount: mount passed
Mar 21 14:08:57.643524 ignition[864]: INFO : Ignition finished successfully
Mar 21 14:08:57.644157 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 21 14:08:58.186450 systemd-networkd[707]: eth0: Gained IPv6LL
Mar 21 14:09:04.553585 coreos-metadata[748]: Mar 21 14:09:04.553 WARN failed to locate config-drive, using the metadata service API instead
Mar 21 14:09:04.595200 coreos-metadata[748]: Mar 21 14:09:04.595 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Mar 21 14:09:04.614580 coreos-metadata[748]: Mar 21 14:09:04.614 INFO Fetch successful
Mar 21 14:09:04.616198 coreos-metadata[748]: Mar 21 14:09:04.614 INFO wrote hostname ci-9999-0-3-a-8593155e6d.novalocal to /sysroot/etc/hostname
Mar 21 14:09:04.618814 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Mar 21 14:09:04.619091 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Mar 21 14:09:04.626750 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 21 14:09:04.655649 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 21 14:09:04.689123 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/vda6 scanned by mount (881)
Mar 21 14:09:04.697794 kernel: BTRFS info (device vda6): first mount of filesystem 667b391b-b0e4-4f87-a670-43615a660c46
Mar 21 14:09:04.697861 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 21 14:09:04.704389 kernel: BTRFS info (device vda6): using free space tree
Mar 21 14:09:04.714140 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 21 14:09:04.718824 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 21 14:09:04.762073 ignition[899]: INFO : Ignition 2.20.0
Mar 21 14:09:04.762073 ignition[899]: INFO : Stage: files
Mar 21 14:09:04.764852 ignition[899]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 21 14:09:04.764852 ignition[899]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 21 14:09:04.764852 ignition[899]: DEBUG : files: compiled without relabeling support, skipping
Mar 21 14:09:04.770275 ignition[899]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 21 14:09:04.770275 ignition[899]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 21 14:09:04.775899 ignition[899]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 21 14:09:04.775899 ignition[899]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 21 14:09:04.780494 ignition[899]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 21 14:09:04.776553 unknown[899]: wrote ssh authorized keys file for user: core
Mar 21 14:09:04.784719 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 21 14:09:04.784719 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Mar 21 14:09:05.298555 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 21 14:09:07.746722 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 21 14:09:07.748222 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 21 14:09:07.748222 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 21 14:09:07.748222 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 21 14:09:07.748222 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 21 14:09:07.748222 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 21 14:09:07.748222 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 21 14:09:07.748222 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 21 14:09:07.748222 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 21 14:09:07.755392 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 21 14:09:07.755392 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 21 14:09:07.755392 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Mar 21 14:09:07.755392 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Mar 21 14:09:07.755392 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Mar 21 14:09:07.755392 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1
Mar 21 14:09:08.486450 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 21 14:09:10.105897 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Mar 21 14:09:10.105897 ignition[899]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 21 14:09:10.108570 ignition[899]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 21 14:09:10.110452 ignition[899]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 21 14:09:10.110452 ignition[899]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 21 14:09:10.110452 ignition[899]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 21 14:09:10.110452 ignition[899]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 21 14:09:10.110452 ignition[899]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 21 14:09:10.110452 ignition[899]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 21 14:09:10.110452 ignition[899]: INFO : files: files passed
Mar 21 14:09:10.110452 ignition[899]: INFO : Ignition finished successfully
Mar 21 14:09:10.110813 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 21 14:09:10.117204 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 21 14:09:10.122829 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 21 14:09:10.131321 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 21 14:09:10.132346 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 21 14:09:10.137414 initrd-setup-root-after-ignition[928]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 21 14:09:10.137414 initrd-setup-root-after-ignition[928]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 21 14:09:10.139671 initrd-setup-root-after-ignition[932]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 21 14:09:10.141912 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 21 14:09:10.142639 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 21 14:09:10.146160 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 21 14:09:10.191692 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 21 14:09:10.191899 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 21 14:09:10.193806 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 21 14:09:10.195545 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 21 14:09:10.201186 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 21 14:09:10.202781 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 21 14:09:10.230467 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 21 14:09:10.234394 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 21 14:09:10.258928 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 21 14:09:10.260460 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 21 14:09:10.262409 systemd[1]: Stopped target timers.target - Timer Units.
Mar 21 14:09:10.264567 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 21 14:09:10.264854 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 21 14:09:10.267242 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 21 14:09:10.268955 systemd[1]: Stopped target basic.target - Basic System.
Mar 21 14:09:10.271342 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 21 14:09:10.273569 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 21 14:09:10.275675 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 21 14:09:10.278037 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 21 14:09:10.280394 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 21 14:09:10.282918 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 21 14:09:10.285287 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 21 14:09:10.287688 systemd[1]: Stopped target swap.target - Swaps.
Mar 21 14:09:10.289833 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 21 14:09:10.290156 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 21 14:09:10.292886 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 21 14:09:10.295302 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 21 14:09:10.297604 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 21 14:09:10.297840 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 21 14:09:10.300151 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 21 14:09:10.300454 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 21 14:09:10.303248 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 21 14:09:10.303665 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 21 14:09:10.305779 systemd[1]: ignition-files.service: Deactivated successfully. Mar 21 14:09:10.305954 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 21 14:09:10.309276 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 21 14:09:10.310196 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 21 14:09:10.311166 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 21 14:09:10.314071 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 21 14:09:10.317661 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 21 14:09:10.317802 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 21 14:09:10.318421 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 21 14:09:10.318533 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 21 14:09:10.325933 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 21 14:09:10.326358 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 21 14:09:10.347079 ignition[952]: INFO : Ignition 2.20.0 Mar 21 14:09:10.347079 ignition[952]: INFO : Stage: umount Mar 21 14:09:10.347079 ignition[952]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 21 14:09:10.347079 ignition[952]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 21 14:09:10.352463 ignition[952]: INFO : umount: umount passed Mar 21 14:09:10.352463 ignition[952]: INFO : Ignition finished successfully Mar 21 14:09:10.349296 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 21 14:09:10.349396 systemd[1]: Stopped ignition-mount.service - Ignition (mount). 
Mar 21 14:09:10.350470 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 21 14:09:10.350541 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 21 14:09:10.354654 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 21 14:09:10.354716 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 21 14:09:10.355216 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 21 14:09:10.355257 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Mar 21 14:09:10.355742 systemd[1]: Stopped target network.target - Network. Mar 21 14:09:10.356903 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 21 14:09:10.356963 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 21 14:09:10.357837 systemd[1]: Stopped target paths.target - Path Units. Mar 21 14:09:10.358718 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 21 14:09:10.362091 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 21 14:09:10.363113 systemd[1]: Stopped target slices.target - Slice Units. Mar 21 14:09:10.364038 systemd[1]: Stopped target sockets.target - Socket Units. Mar 21 14:09:10.365176 systemd[1]: iscsid.socket: Deactivated successfully. Mar 21 14:09:10.365211 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 21 14:09:10.366345 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 21 14:09:10.366379 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 21 14:09:10.367299 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 21 14:09:10.367341 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 21 14:09:10.368243 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 21 14:09:10.368284 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. 
Mar 21 14:09:10.369244 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 21 14:09:10.370179 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 21 14:09:10.372176 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 21 14:09:10.372726 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 21 14:09:10.372809 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 21 14:09:10.374008 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 21 14:09:10.375127 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 21 14:09:10.377032 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 21 14:09:10.377341 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 21 14:09:10.380656 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 21 14:09:10.380853 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 21 14:09:10.380950 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 21 14:09:10.382969 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 21 14:09:10.383630 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 21 14:09:10.383917 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 21 14:09:10.386135 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 21 14:09:10.390714 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 21 14:09:10.390775 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 21 14:09:10.392112 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 21 14:09:10.392160 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 21 14:09:10.393744 systemd[1]: systemd-modules-load.service: Deactivated successfully. 
Mar 21 14:09:10.393785 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 21 14:09:10.394452 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 21 14:09:10.394492 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 21 14:09:10.395698 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 21 14:09:10.397134 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 21 14:09:10.397193 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 21 14:09:10.407243 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 21 14:09:10.408123 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 21 14:09:10.409010 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 21 14:09:10.409073 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 21 14:09:10.409962 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 21 14:09:10.409992 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 21 14:09:10.411127 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 21 14:09:10.411174 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 21 14:09:10.412848 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 21 14:09:10.412892 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 21 14:09:10.414019 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 21 14:09:10.414078 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 21 14:09:10.417150 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 21 14:09:10.418010 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. 
Mar 21 14:09:10.418095 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 21 14:09:10.420692 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 21 14:09:10.420739 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 21 14:09:10.423096 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Mar 21 14:09:10.423152 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 21 14:09:10.426212 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 21 14:09:10.426303 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 21 14:09:10.431569 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 21 14:09:10.431669 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 21 14:09:10.433184 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 21 14:09:10.434881 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 21 14:09:10.450814 systemd[1]: Switching root. Mar 21 14:09:10.489162 systemd-journald[185]: Journal stopped Mar 21 14:09:12.564831 systemd-journald[185]: Received SIGTERM from PID 1 (systemd). 
Mar 21 14:09:12.564879 kernel: SELinux: policy capability network_peer_controls=1 Mar 21 14:09:12.564895 kernel: SELinux: policy capability open_perms=1 Mar 21 14:09:12.564908 kernel: SELinux: policy capability extended_socket_class=1 Mar 21 14:09:12.564919 kernel: SELinux: policy capability always_check_network=0 Mar 21 14:09:12.564930 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 21 14:09:12.564942 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 21 14:09:12.564953 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 21 14:09:12.564967 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 21 14:09:12.564980 kernel: audit: type=1403 audit(1742566151.265:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 21 14:09:12.564995 systemd[1]: Successfully loaded SELinux policy in 80.969ms. Mar 21 14:09:12.565013 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 25.602ms. Mar 21 14:09:12.565027 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 21 14:09:12.565039 systemd[1]: Detected virtualization kvm. Mar 21 14:09:12.565067 systemd[1]: Detected architecture x86-64. Mar 21 14:09:12.565081 systemd[1]: Detected first boot. Mar 21 14:09:12.565093 systemd[1]: Hostname set to . Mar 21 14:09:12.565108 systemd[1]: Initializing machine ID from VM UUID. Mar 21 14:09:12.565121 zram_generator::config[997]: No configuration found. 
Mar 21 14:09:12.565133 kernel: Guest personality initialized and is inactive Mar 21 14:09:12.565145 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Mar 21 14:09:12.565156 kernel: Initialized host personality Mar 21 14:09:12.565167 kernel: NET: Registered PF_VSOCK protocol family Mar 21 14:09:12.565182 systemd[1]: Populated /etc with preset unit settings. Mar 21 14:09:12.565195 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 21 14:09:12.565209 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 21 14:09:12.565221 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 21 14:09:12.565234 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 21 14:09:12.565246 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 21 14:09:12.565258 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 21 14:09:12.565270 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 21 14:09:12.565283 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 21 14:09:12.565296 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 21 14:09:12.565308 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 21 14:09:12.565322 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 21 14:09:12.565335 systemd[1]: Created slice user.slice - User and Session Slice. Mar 21 14:09:12.565347 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 21 14:09:12.565360 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 21 14:09:12.565372 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
Mar 21 14:09:12.565384 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 21 14:09:12.565398 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 21 14:09:12.565412 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 21 14:09:12.565425 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Mar 21 14:09:12.565437 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 21 14:09:12.565449 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 21 14:09:12.565462 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 21 14:09:12.565474 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 21 14:09:12.565487 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 21 14:09:12.565499 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 21 14:09:12.565513 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 21 14:09:12.565526 systemd[1]: Reached target slices.target - Slice Units. Mar 21 14:09:12.565538 systemd[1]: Reached target swap.target - Swaps. Mar 21 14:09:12.565550 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 21 14:09:12.565562 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 21 14:09:12.565574 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 21 14:09:12.565587 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 21 14:09:12.565599 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 21 14:09:12.565611 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Mar 21 14:09:12.565624 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 21 14:09:12.565638 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 21 14:09:12.565650 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 21 14:09:12.565663 systemd[1]: Mounting media.mount - External Media Directory... Mar 21 14:09:12.565676 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 21 14:09:12.565688 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 21 14:09:12.565700 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 21 14:09:12.565712 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 21 14:09:12.565725 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 21 14:09:12.565740 systemd[1]: Reached target machines.target - Containers. Mar 21 14:09:12.565752 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 21 14:09:12.565767 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 21 14:09:12.565779 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 21 14:09:12.565792 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 21 14:09:12.565804 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 21 14:09:12.565816 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 21 14:09:12.565829 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 21 14:09:12.565843 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... 
Mar 21 14:09:12.565855 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 21 14:09:12.565868 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 21 14:09:12.565880 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 21 14:09:12.565892 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 21 14:09:12.565904 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 21 14:09:12.565916 systemd[1]: Stopped systemd-fsck-usr.service. Mar 21 14:09:12.565930 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 21 14:09:12.565944 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 21 14:09:12.565956 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 21 14:09:12.565969 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 21 14:09:12.565981 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 21 14:09:12.565993 kernel: loop: module loaded Mar 21 14:09:12.566004 kernel: fuse: init (API version 7.39) Mar 21 14:09:12.566017 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Mar 21 14:09:12.566029 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 21 14:09:12.566041 systemd[1]: verity-setup.service: Deactivated successfully. Mar 21 14:09:12.566065 systemd[1]: Stopped verity-setup.service. Mar 21 14:09:12.566081 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Mar 21 14:09:12.566094 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 21 14:09:12.566107 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 21 14:09:12.566120 systemd[1]: Mounted media.mount - External Media Directory. Mar 21 14:09:12.566134 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 21 14:09:12.566147 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 21 14:09:12.566159 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 21 14:09:12.566172 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 21 14:09:12.566184 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 21 14:09:12.566196 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 21 14:09:12.566211 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 21 14:09:12.566223 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 21 14:09:12.566235 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 21 14:09:12.566247 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 21 14:09:12.566259 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 21 14:09:12.566272 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 21 14:09:12.566284 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 21 14:09:12.566296 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 21 14:09:12.566310 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 21 14:09:12.566323 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 21 14:09:12.566336 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. 
Mar 21 14:09:12.566349 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 21 14:09:12.566361 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 21 14:09:12.566373 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 21 14:09:12.566385 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 21 14:09:12.566415 systemd-journald[1080]: Collecting audit messages is disabled. Mar 21 14:09:12.566442 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 21 14:09:12.566455 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 21 14:09:12.566467 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Mar 21 14:09:12.566480 systemd-journald[1080]: Journal started Mar 21 14:09:12.566506 systemd-journald[1080]: Runtime Journal (/run/log/journal/0f2be3ad65354ba5b65a86cdb7f3b982) is 8M, max 78.2M, 70.2M free. Mar 21 14:09:12.151509 systemd[1]: Queued start job for default target multi-user.target. Mar 21 14:09:12.160583 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Mar 21 14:09:12.161074 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 21 14:09:12.574515 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 21 14:09:12.581306 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 21 14:09:12.581353 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 21 14:09:12.589095 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... 
Mar 21 14:09:12.594816 kernel: ACPI: bus type drm_connector registered Mar 21 14:09:12.594850 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 21 14:09:12.606444 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 21 14:09:12.606500 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 21 14:09:12.613597 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 21 14:09:12.621070 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 21 14:09:12.626080 systemd[1]: Started systemd-journald.service - Journal Service. Mar 21 14:09:12.634024 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 21 14:09:12.635372 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 21 14:09:12.635763 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 21 14:09:12.638216 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 21 14:09:12.638971 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 21 14:09:12.639667 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 21 14:09:12.640490 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 21 14:09:12.644237 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 21 14:09:12.659076 kernel: loop0: detected capacity change from 0 to 8 Mar 21 14:09:12.665830 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 21 14:09:12.668731 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... 
Mar 21 14:09:12.681281 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 21 14:09:12.681479 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Mar 21 14:09:12.689239 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 21 14:09:12.692086 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 21 14:09:12.694088 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 21 14:09:12.704228 systemd-journald[1080]: Time spent on flushing to /var/log/journal/0f2be3ad65354ba5b65a86cdb7f3b982 is 23.188ms for 965 entries. Mar 21 14:09:12.704228 systemd-journald[1080]: System Journal (/var/log/journal/0f2be3ad65354ba5b65a86cdb7f3b982) is 8M, max 584.8M, 576.8M free. Mar 21 14:09:12.738984 systemd-journald[1080]: Received client request to flush runtime journal. Mar 21 14:09:12.739086 kernel: loop1: detected capacity change from 0 to 109808 Mar 21 14:09:12.716235 udevadm[1148]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Mar 21 14:09:12.742112 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 21 14:09:12.790981 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 21 14:09:13.044128 kernel: loop2: detected capacity change from 0 to 205544 Mar 21 14:09:13.164185 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 21 14:09:13.176380 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 21 14:09:13.181613 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 21 14:09:13.309312 systemd-tmpfiles[1157]: ACLs are not supported, ignoring. Mar 21 14:09:13.309373 systemd-tmpfiles[1157]: ACLs are not supported, ignoring. 
Mar 21 14:09:13.321169 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 21 14:09:13.355105 kernel: loop3: detected capacity change from 0 to 151640 Mar 21 14:09:13.420142 kernel: loop4: detected capacity change from 0 to 8 Mar 21 14:09:13.425118 kernel: loop5: detected capacity change from 0 to 109808 Mar 21 14:09:13.500117 kernel: loop6: detected capacity change from 0 to 205544 Mar 21 14:09:13.546118 kernel: loop7: detected capacity change from 0 to 151640 Mar 21 14:09:13.628364 (sd-merge)[1162]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. Mar 21 14:09:13.629234 (sd-merge)[1162]: Merged extensions into '/usr'. Mar 21 14:09:13.640388 systemd[1]: Reload requested from client PID 1117 ('systemd-sysext') (unit systemd-sysext.service)... Mar 21 14:09:13.640407 systemd[1]: Reloading... Mar 21 14:09:13.742087 zram_generator::config[1187]: No configuration found. Mar 21 14:09:13.964157 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 21 14:09:14.046171 systemd[1]: Reloading finished in 405 ms. Mar 21 14:09:14.065151 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 21 14:09:14.073759 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 21 14:09:14.079492 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 21 14:09:14.083383 systemd[1]: Starting ensure-sysext.service... Mar 21 14:09:14.087217 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 21 14:09:14.113533 systemd[1]: Reload requested from client PID 1247 ('systemctl') (unit ensure-sysext.service)... Mar 21 14:09:14.113548 systemd[1]: Reloading... 
Mar 21 14:09:14.130653 ldconfig[1113]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 21 14:09:14.134508 systemd-tmpfiles[1248]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 21 14:09:14.134980 systemd-tmpfiles[1248]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 21 14:09:14.135850 systemd-tmpfiles[1248]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 21 14:09:14.136332 systemd-tmpfiles[1248]: ACLs are not supported, ignoring. Mar 21 14:09:14.136460 systemd-tmpfiles[1248]: ACLs are not supported, ignoring. Mar 21 14:09:14.143640 systemd-tmpfiles[1248]: Detected autofs mount point /boot during canonicalization of boot. Mar 21 14:09:14.143648 systemd-tmpfiles[1248]: Skipping /boot Mar 21 14:09:14.146430 systemd-udevd[1245]: Using default interface naming scheme 'v255'. Mar 21 14:09:14.160395 systemd-tmpfiles[1248]: Detected autofs mount point /boot during canonicalization of boot. Mar 21 14:09:14.160546 systemd-tmpfiles[1248]: Skipping /boot Mar 21 14:09:14.207075 zram_generator::config[1278]: No configuration found. Mar 21 14:09:14.344100 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (1290) Mar 21 14:09:14.368067 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2 Mar 21 14:09:14.378136 kernel: ACPI: button: Power Button [PWRF] Mar 21 14:09:14.413076 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0 Mar 21 14:09:14.441381 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Mar 21 14:09:14.448302 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
Mar 21 14:09:14.470072 kernel: mousedev: PS/2 mouse device common for all mice Mar 21 14:09:14.491794 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0 Mar 21 14:09:14.491882 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console Mar 21 14:09:14.496164 kernel: Console: switching to colour dummy device 80x25 Mar 21 14:09:14.497833 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Mar 21 14:09:14.497869 kernel: [drm] features: -context_init Mar 21 14:09:14.499148 kernel: [drm] number of scanouts: 1 Mar 21 14:09:14.500090 kernel: [drm] number of cap sets: 0 Mar 21 14:09:14.504076 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:02.0 on minor 0 Mar 21 14:09:14.516900 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Mar 21 14:09:14.516973 kernel: Console: switching to colour frame buffer device 160x50 Mar 21 14:09:14.524104 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device Mar 21 14:09:14.555435 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 21 14:09:14.557437 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Mar 21 14:09:14.557631 systemd[1]: Reloading finished in 443 ms. Mar 21 14:09:14.570218 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 21 14:09:14.572505 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 21 14:09:14.578185 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 21 14:09:14.622578 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 21 14:09:14.623825 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 21 14:09:14.629158 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... 
Mar 21 14:09:14.631352 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 21 14:09:14.634251 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 21 14:09:14.636955 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 21 14:09:14.643185 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 21 14:09:14.648231 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 21 14:09:14.650428 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 21 14:09:14.652459 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 21 14:09:14.652548 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 21 14:09:14.656273 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 21 14:09:14.659364 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 21 14:09:14.664093 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 21 14:09:14.671328 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 21 14:09:14.685510 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 21 14:09:14.685618 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 21 14:09:14.689845 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. 
Mar 21 14:09:14.691708 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 21 14:09:14.692961 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 21 14:09:14.695341 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 21 14:09:14.695540 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 21 14:09:14.698188 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 21 14:09:14.699632 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 21 14:09:14.700395 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 21 14:09:14.700580 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 21 14:09:14.706655 systemd[1]: Finished ensure-sysext.service.
Mar 21 14:09:14.716262 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 21 14:09:14.723036 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 21 14:09:14.726395 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 21 14:09:14.726471 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 21 14:09:14.731228 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 21 14:09:14.735827 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 21 14:09:14.745186 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 21 14:09:14.752279 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 21 14:09:14.760335 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 21 14:09:14.784428 lvm[1396]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 21 14:09:14.789781 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 21 14:09:14.809084 augenrules[1416]: No rules
Mar 21 14:09:14.810697 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 21 14:09:14.811325 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 21 14:09:14.815672 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 21 14:09:14.819638 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 21 14:09:14.825488 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 21 14:09:14.830378 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 21 14:09:14.831476 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 21 14:09:14.835277 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 21 14:09:14.844586 lvm[1423]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 21 14:09:14.875031 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 21 14:09:14.895333 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 21 14:09:14.944473 systemd-networkd[1379]: lo: Link UP
Mar 21 14:09:14.944484 systemd-networkd[1379]: lo: Gained carrier
Mar 21 14:09:14.945758 systemd-networkd[1379]: Enumeration completed
Mar 21 14:09:14.945839 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 21 14:09:14.948996 systemd-networkd[1379]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 21 14:09:14.949008 systemd-networkd[1379]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 21 14:09:14.949596 systemd-networkd[1379]: eth0: Link UP
Mar 21 14:09:14.949605 systemd-networkd[1379]: eth0: Gained carrier
Mar 21 14:09:14.949619 systemd-networkd[1379]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 21 14:09:14.952202 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 21 14:09:14.956429 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 21 14:09:14.964443 systemd-resolved[1380]: Positive Trust Anchors:
Mar 21 14:09:14.964461 systemd-resolved[1380]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 21 14:09:14.964509 systemd-resolved[1380]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 21 14:09:14.966229 systemd-networkd[1379]: eth0: DHCPv4 address 172.24.4.61/24, gateway 172.24.4.1 acquired from 172.24.4.1
Mar 21 14:09:14.974168 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 21 14:09:14.976147 systemd[1]: Reached target time-set.target - System Time Set.
Mar 21 14:09:14.985801 systemd-resolved[1380]: Using system hostname 'ci-9999-0-3-a-8593155e6d.novalocal'.
Mar 21 14:09:14.988255 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 21 14:09:14.990375 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 21 14:09:14.992195 systemd[1]: Reached target network.target - Network.
Mar 21 14:09:14.993829 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 21 14:09:14.995318 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 21 14:09:14.996718 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 21 14:09:14.997686 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 21 14:09:14.998900 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 21 14:09:15.000451 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 21 14:09:15.001826 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 21 14:09:15.003312 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 21 14:09:15.003350 systemd[1]: Reached target paths.target - Path Units.
Mar 21 14:09:15.004677 systemd[1]: Reached target timers.target - Timer Units.
Mar 21 14:09:15.006623 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 21 14:09:15.009640 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 21 14:09:15.013938 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 21 14:09:15.018639 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 21 14:09:15.020189 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 21 14:09:15.032628 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 21 14:09:15.035349 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 21 14:09:15.039120 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 21 14:09:15.039915 systemd[1]: Reached target sockets.target - Socket Units.
Mar 21 14:09:15.041821 systemd[1]: Reached target basic.target - Basic System.
Mar 21 14:09:15.044291 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 21 14:09:15.044330 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 21 14:09:15.046189 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 21 14:09:15.052174 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 21 14:09:15.057558 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 21 14:09:15.062182 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 21 14:09:15.066825 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 21 14:09:15.067448 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 21 14:09:15.070733 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 21 14:09:15.075400 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 21 14:09:15.084238 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 21 14:09:15.089831 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 21 14:09:15.097187 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 21 14:09:15.098611 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 21 14:09:15.100495 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 21 14:09:15.104622 systemd[1]: Starting update-engine.service - Update Engine...
Mar 21 14:09:15.108608 extend-filesystems[1447]: Found loop4
Mar 21 14:09:15.111902 extend-filesystems[1447]: Found loop5
Mar 21 14:09:15.111902 extend-filesystems[1447]: Found loop6
Mar 21 14:09:15.111902 extend-filesystems[1447]: Found loop7
Mar 21 14:09:15.111902 extend-filesystems[1447]: Found vda
Mar 21 14:09:15.111902 extend-filesystems[1447]: Found vda1
Mar 21 14:09:15.111902 extend-filesystems[1447]: Found vda2
Mar 21 14:09:15.111902 extend-filesystems[1447]: Found vda3
Mar 21 14:09:15.111902 extend-filesystems[1447]: Found usr
Mar 21 14:09:15.111902 extend-filesystems[1447]: Found vda4
Mar 21 14:09:15.111902 extend-filesystems[1447]: Found vda6
Mar 21 14:09:15.111902 extend-filesystems[1447]: Found vda7
Mar 21 14:09:15.111902 extend-filesystems[1447]: Found vda9
Mar 21 14:09:15.111902 extend-filesystems[1447]: Checking size of /dev/vda9
Mar 21 14:09:15.152169 jq[1446]: false
Mar 21 14:09:15.128264 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 21 14:09:15.115085 dbus-daemon[1445]: [system] SELinux support is enabled
Mar 21 14:09:15.137678 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 21 14:09:15.157569 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 21 14:09:15.157766 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 21 14:09:15.158012 systemd[1]: motdgen.service: Deactivated successfully.
Mar 21 14:09:15.158197 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 21 14:09:15.161618 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 21 14:09:15.163107 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 21 14:09:15.171414 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 21 14:09:15.171449 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 21 14:09:15.176736 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 21 14:09:15.176770 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 21 14:09:15.731424 update_engine[1457]: I20250321 14:09:15.181933 1457 main.cc:92] Flatcar Update Engine starting
Mar 21 14:09:15.731429 systemd-timesyncd[1399]: Contacted time server 45.63.54.13:123 (0.flatcar.pool.ntp.org).
Mar 21 14:09:15.731482 systemd-timesyncd[1399]: Initial clock synchronization to Fri 2025-03-21 14:09:15.731286 UTC.
Mar 21 14:09:15.731924 systemd-resolved[1380]: Clock change detected. Flushing caches.
Mar 21 14:09:15.733490 update_engine[1457]: I20250321 14:09:15.733358 1457 update_check_scheduler.cc:74] Next update check in 11m24s
Mar 21 14:09:15.737729 extend-filesystems[1447]: Resized partition /dev/vda9
Mar 21 14:09:15.745308 systemd[1]: Started update-engine.service - Update Engine.
Mar 21 14:09:15.747819 extend-filesystems[1479]: resize2fs 1.47.2 (1-Jan-2025)
Mar 21 14:09:15.759419 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (1282)
Mar 21 14:09:15.759458 jq[1461]: true
Mar 21 14:09:15.766810 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 2014203 blocks
Mar 21 14:09:15.767764 (ntainerd)[1478]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 21 14:09:15.770254 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 21 14:09:15.797687 kernel: EXT4-fs (vda9): resized filesystem to 2014203
Mar 21 14:09:15.802692 jq[1483]: true
Mar 21 14:09:15.876523 tar[1470]: linux-amd64/helm
Mar 21 14:09:15.877281 extend-filesystems[1479]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Mar 21 14:09:15.877281 extend-filesystems[1479]: old_desc_blocks = 1, new_desc_blocks = 1
Mar 21 14:09:15.877281 extend-filesystems[1479]: The filesystem on /dev/vda9 is now 2014203 (4k) blocks long.
Mar 21 14:09:15.891417 extend-filesystems[1447]: Resized filesystem in /dev/vda9
Mar 21 14:09:15.878684 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 21 14:09:15.878881 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 21 14:09:15.910345 systemd-logind[1456]: New seat seat0.
Mar 21 14:09:15.913180 systemd-logind[1456]: Watching system buttons on /dev/input/event1 (Power Button)
Mar 21 14:09:15.913196 systemd-logind[1456]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 21 14:09:15.913360 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 21 14:09:15.927168 bash[1500]: Updated "/home/core/.ssh/authorized_keys"
Mar 21 14:09:15.928707 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 21 14:09:15.935499 systemd[1]: Starting sshkeys.service...
Mar 21 14:09:15.972154 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Mar 21 14:09:15.978580 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Mar 21 14:09:16.051416 locksmithd[1481]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 21 14:09:16.066391 sshd_keygen[1469]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 21 14:09:16.104300 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 21 14:09:16.114456 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 21 14:09:16.137815 systemd[1]: issuegen.service: Deactivated successfully.
Mar 21 14:09:16.138011 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 21 14:09:16.144904 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 21 14:09:16.168368 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 21 14:09:16.175543 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 21 14:09:16.182093 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Mar 21 14:09:16.183002 systemd[1]: Reached target getty.target - Login Prompts.
Mar 21 14:09:16.342170 containerd[1478]: time="2025-03-21T14:09:16Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 21 14:09:16.342929 containerd[1478]: time="2025-03-21T14:09:16.342614788Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1
Mar 21 14:09:16.367148 containerd[1478]: time="2025-03-21T14:09:16.366956411Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.833µs"
Mar 21 14:09:16.371170 containerd[1478]: time="2025-03-21T14:09:16.371135118Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 21 14:09:16.371211 containerd[1478]: time="2025-03-21T14:09:16.371170564Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 21 14:09:16.371361 containerd[1478]: time="2025-03-21T14:09:16.371329162Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 21 14:09:16.371361 containerd[1478]: time="2025-03-21T14:09:16.371355181Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 21 14:09:16.371434 containerd[1478]: time="2025-03-21T14:09:16.371388573Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 21 14:09:16.371483 containerd[1478]: time="2025-03-21T14:09:16.371450840Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 21 14:09:16.371483 containerd[1478]: time="2025-03-21T14:09:16.371470076Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 21 14:09:16.371753 containerd[1478]: time="2025-03-21T14:09:16.371715025Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 21 14:09:16.371753 containerd[1478]: time="2025-03-21T14:09:16.371740473Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 21 14:09:16.371810 containerd[1478]: time="2025-03-21T14:09:16.371753808Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 21 14:09:16.371810 containerd[1478]: time="2025-03-21T14:09:16.371765170Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 21 14:09:16.371858 containerd[1478]: time="2025-03-21T14:09:16.371845420Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 21 14:09:16.372071 containerd[1478]: time="2025-03-21T14:09:16.372036859Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 21 14:09:16.372102 containerd[1478]: time="2025-03-21T14:09:16.372074961Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 21 14:09:16.372102 containerd[1478]: time="2025-03-21T14:09:16.372088286Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 21 14:09:16.372177 containerd[1478]: time="2025-03-21T14:09:16.372140984Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 21 14:09:16.373207 containerd[1478]: time="2025-03-21T14:09:16.372766498Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 21 14:09:16.373207 containerd[1478]: time="2025-03-21T14:09:16.373003512Z" level=info msg="metadata content store policy set" policy=shared
Mar 21 14:09:16.386737 containerd[1478]: time="2025-03-21T14:09:16.386698687Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 21 14:09:16.387775 containerd[1478]: time="2025-03-21T14:09:16.386884596Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 21 14:09:16.387775 containerd[1478]: time="2025-03-21T14:09:16.386909843Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 21 14:09:16.387775 containerd[1478]: time="2025-03-21T14:09:16.386926915Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 21 14:09:16.387775 containerd[1478]: time="2025-03-21T14:09:16.386961941Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 21 14:09:16.387775 containerd[1478]: time="2025-03-21T14:09:16.386985195Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 21 14:09:16.387775 containerd[1478]: time="2025-03-21T14:09:16.387004361Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 21 14:09:16.387775 containerd[1478]: time="2025-03-21T14:09:16.387019449Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 21 14:09:16.387775 containerd[1478]: time="2025-03-21T14:09:16.387034086Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 21 14:09:16.387775 containerd[1478]: time="2025-03-21T14:09:16.387047922Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 21 14:09:16.387775 containerd[1478]: time="2025-03-21T14:09:16.387066317Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 21 14:09:16.387775 containerd[1478]: time="2025-03-21T14:09:16.387094890Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 21 14:09:16.387775 containerd[1478]: time="2025-03-21T14:09:16.387228792Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 21 14:09:16.387775 containerd[1478]: time="2025-03-21T14:09:16.387252767Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 21 14:09:16.387775 containerd[1478]: time="2025-03-21T14:09:16.387266552Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 21 14:09:16.388225 containerd[1478]: time="2025-03-21T14:09:16.387278355Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 21 14:09:16.388225 containerd[1478]: time="2025-03-21T14:09:16.387290577Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 21 14:09:16.388225 containerd[1478]: time="2025-03-21T14:09:16.387303241Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 21 14:09:16.388225 containerd[1478]: time="2025-03-21T14:09:16.387315574Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 21 14:09:16.388225 containerd[1478]: time="2025-03-21T14:09:16.387326795Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 21 14:09:16.388225 containerd[1478]: time="2025-03-21T14:09:16.387339820Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 21 14:09:16.388225 containerd[1478]: time="2025-03-21T14:09:16.387354177Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 21 14:09:16.388225 containerd[1478]: time="2025-03-21T14:09:16.387367041Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Mar 21 14:09:16.388225 containerd[1478]: time="2025-03-21T14:09:16.387426262Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Mar 21 14:09:16.388225 containerd[1478]: time="2025-03-21T14:09:16.387443244Z" level=info msg="Start snapshots syncer"
Mar 21 14:09:16.388225 containerd[1478]: time="2025-03-21T14:09:16.387469693Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Mar 21 14:09:16.388534 containerd[1478]: time="2025-03-21T14:09:16.387727016Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Mar 21 14:09:16.388534 containerd[1478]: time="2025-03-21T14:09:16.387784504Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Mar 21 14:09:16.388662 containerd[1478]: time="2025-03-21T14:09:16.387851620Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Mar 21 14:09:16.388662 containerd[1478]: time="2025-03-21T14:09:16.387937771Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Mar 21 14:09:16.388662 containerd[1478]: time="2025-03-21T14:09:16.387959783Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Mar 21 14:09:16.388662 containerd[1478]: time="2025-03-21T14:09:16.387971515Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Mar 21 14:09:16.388662 containerd[1478]: time="2025-03-21T14:09:16.387984168Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Mar 21 14:09:16.388662 containerd[1478]: time="2025-03-21T14:09:16.387997503Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Mar 21 14:09:16.388662 containerd[1478]: time="2025-03-21T14:09:16.388008754Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Mar 21 14:09:16.388662 containerd[1478]: time="2025-03-21T14:09:16.388029433Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Mar 21 14:09:16.388662 containerd[1478]: time="2025-03-21T14:09:16.388052687Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Mar 21 14:09:16.388662 containerd[1478]: time="2025-03-21T14:09:16.388066563Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Mar 21 14:09:16.388662 containerd[1478]: time="2025-03-21T14:09:16.388078225Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Mar 21 14:09:16.388662 containerd[1478]: time="2025-03-21T14:09:16.388102941Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 21 14:09:16.388662 containerd[1478]: time="2025-03-21T14:09:16.388137336Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 21 14:09:16.388662 containerd[1478]: time="2025-03-21T14:09:16.388149108Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 21 14:09:16.390227 containerd[1478]: time="2025-03-21T14:09:16.388162653Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 21 14:09:16.390227 containerd[1478]: time="2025-03-21T14:09:16.388172512Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Mar 21 14:09:16.390227 containerd[1478]: time="2025-03-21T14:09:16.388184213Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Mar 21 14:09:16.390227 containerd[1478]: time="2025-03-21T14:09:16.388195475Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Mar 21 14:09:16.390227 containerd[1478]: time="2025-03-21T14:09:16.388211675Z" level=info msg="runtime interface created"
Mar 21 14:09:16.390227 containerd[1478]: time="2025-03-21T14:09:16.388217285Z" level=info msg="created NRI interface"
Mar 21 14:09:16.390227 containerd[1478]: time="2025-03-21T14:09:16.388226883Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Mar 21 14:09:16.390227 containerd[1478]: time="2025-03-21T14:09:16.388242513Z" level=info msg="Connect containerd service"
Mar 21 14:09:16.390227 containerd[1478]: time="2025-03-21T14:09:16.388267440Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 21 14:09:16.390227 containerd[1478]: time="2025-03-21T14:09:16.389039297Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 21 14:09:16.479587 tar[1470]: linux-amd64/LICENSE
Mar 21 14:09:16.479587 tar[1470]: linux-amd64/README.md
Mar 21 14:09:16.492788 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 21 14:09:16.545394 containerd[1478]: time="2025-03-21T14:09:16.545325903Z" level=info msg="Start subscribing containerd event"
Mar 21 14:09:16.545489 containerd[1478]: time="2025-03-21T14:09:16.545405171Z" level=info msg="Start recovering state"
Mar 21 14:09:16.545546 containerd[1478]: time="2025-03-21T14:09:16.545517642Z" level=info msg="Start event monitor"
Mar 21 14:09:16.545546 containerd[1478]: time="2025-03-21T14:09:16.545541297Z" level=info msg="Start cni network conf syncer for default"
Mar 21 14:09:16.547171 containerd[1478]: time="2025-03-21T14:09:16.545552778Z" level=info msg="Start streaming server"
Mar 21 14:09:16.547171 containerd[1478]: time="2025-03-21T14:09:16.545562536Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Mar 21 14:09:16.547171 containerd[1478]: time="2025-03-21T14:09:16.545570091Z" level=info msg="runtime interface starting up..."
Mar 21 14:09:16.547171 containerd[1478]: time="2025-03-21T14:09:16.545576583Z" level=info msg="starting plugins..."
Mar 21 14:09:16.547171 containerd[1478]: time="2025-03-21T14:09:16.545589517Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Mar 21 14:09:16.547171 containerd[1478]: time="2025-03-21T14:09:16.545351020Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 21 14:09:16.547171 containerd[1478]: time="2025-03-21T14:09:16.545676620Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 21 14:09:16.547171 containerd[1478]: time="2025-03-21T14:09:16.545744878Z" level=info msg="containerd successfully booted in 0.203981s"
Mar 21 14:09:16.545907 systemd[1]: Started containerd.service - containerd container runtime.
Mar 21 14:09:17.103374 systemd-networkd[1379]: eth0: Gained IPv6LL
Mar 21 14:09:17.107622 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 21 14:09:17.112416 systemd[1]: Reached target network-online.target - Network is Online.
Mar 21 14:09:17.120791 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 21 14:09:17.129288 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 21 14:09:17.193494 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 21 14:09:19.135568 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 21 14:09:19.149400 (kubelet)[1573]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 21 14:09:19.854191 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 21 14:09:19.865890 systemd[1]: Started sshd@0-172.24.4.61:22-172.24.4.1:35606.service - OpenSSH per-connection server daemon (172.24.4.1:35606).
Mar 21 14:09:20.435035 kubelet[1573]: E0321 14:09:20.434887 1573 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 21 14:09:20.437879 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 21 14:09:20.438238 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 21 14:09:20.438790 systemd[1]: kubelet.service: Consumed 2.165s CPU time, 238.7M memory peak. Mar 21 14:09:21.243523 login[1533]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 21 14:09:21.256317 login[1534]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 21 14:09:21.277959 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 21 14:09:21.281003 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 21 14:09:21.287198 systemd-logind[1456]: New session 1 of user core. Mar 21 14:09:21.294893 systemd-logind[1456]: New session 2 of user core. Mar 21 14:09:21.331310 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 21 14:09:21.337498 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 21 14:09:21.362408 (systemd)[1589]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 21 14:09:21.365921 systemd-logind[1456]: New session c1 of user core. Mar 21 14:09:21.537968 sshd[1579]: Accepted publickey for core from 172.24.4.1 port 35606 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk Mar 21 14:09:21.539039 sshd-session[1579]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 14:09:21.552363 systemd-logind[1456]: New session 3 of user core. Mar 21 14:09:21.700590 systemd[1589]: Queued start job for default target default.target. Mar 21 14:09:21.709004 systemd[1589]: Created slice app.slice - User Application Slice. Mar 21 14:09:21.709030 systemd[1589]: Reached target paths.target - Paths. Mar 21 14:09:21.709070 systemd[1589]: Reached target timers.target - Timers. Mar 21 14:09:21.711306 systemd[1589]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 21 14:09:21.719946 systemd[1589]: Listening on dbus.socket - D-Bus User Message Bus Socket. 
Mar 21 14:09:21.720945 systemd[1589]: Reached target sockets.target - Sockets. Mar 21 14:09:21.720993 systemd[1589]: Reached target basic.target - Basic System. Mar 21 14:09:21.721030 systemd[1589]: Reached target default.target - Main User Target. Mar 21 14:09:21.721057 systemd[1589]: Startup finished in 348ms. Mar 21 14:09:21.721693 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 21 14:09:21.733504 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 21 14:09:21.735677 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 21 14:09:21.737611 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 21 14:09:22.260656 systemd[1]: Started sshd@1-172.24.4.61:22-172.24.4.1:35612.service - OpenSSH per-connection server daemon (172.24.4.1:35612). Mar 21 14:09:22.730581 coreos-metadata[1444]: Mar 21 14:09:22.730 WARN failed to locate config-drive, using the metadata service API instead Mar 21 14:09:22.784426 coreos-metadata[1444]: Mar 21 14:09:22.784 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Mar 21 14:09:23.096744 coreos-metadata[1507]: Mar 21 14:09:23.096 WARN failed to locate config-drive, using the metadata service API instead Mar 21 14:09:23.138914 coreos-metadata[1507]: Mar 21 14:09:23.138 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Mar 21 14:09:23.221368 coreos-metadata[1444]: Mar 21 14:09:23.221 INFO Fetch successful Mar 21 14:09:23.221368 coreos-metadata[1444]: Mar 21 14:09:23.221 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 21 14:09:23.235521 coreos-metadata[1444]: Mar 21 14:09:23.235 INFO Fetch successful Mar 21 14:09:23.235521 coreos-metadata[1444]: Mar 21 14:09:23.235 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Mar 21 14:09:23.250217 coreos-metadata[1444]: Mar 21 14:09:23.250 INFO Fetch successful Mar 21 14:09:23.250217 coreos-metadata[1444]: Mar 21 14:09:23.250 
INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Mar 21 14:09:23.264320 coreos-metadata[1444]: Mar 21 14:09:23.264 INFO Fetch successful Mar 21 14:09:23.264320 coreos-metadata[1444]: Mar 21 14:09:23.264 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Mar 21 14:09:23.277997 coreos-metadata[1444]: Mar 21 14:09:23.277 INFO Fetch successful Mar 21 14:09:23.277997 coreos-metadata[1444]: Mar 21 14:09:23.277 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Mar 21 14:09:23.291494 coreos-metadata[1444]: Mar 21 14:09:23.291 INFO Fetch successful Mar 21 14:09:23.342197 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 21 14:09:23.345199 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 21 14:09:23.416528 coreos-metadata[1507]: Mar 21 14:09:23.416 INFO Fetch successful Mar 21 14:09:23.416528 coreos-metadata[1507]: Mar 21 14:09:23.416 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 21 14:09:23.429772 coreos-metadata[1507]: Mar 21 14:09:23.429 INFO Fetch successful Mar 21 14:09:23.436699 unknown[1507]: wrote ssh authorized keys file for user: core Mar 21 14:09:23.476856 update-ssh-keys[1631]: Updated "/home/core/.ssh/authorized_keys" Mar 21 14:09:23.477984 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 21 14:09:23.481166 systemd[1]: Finished sshkeys.service. Mar 21 14:09:23.486649 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 21 14:09:23.486911 systemd[1]: Startup finished in 1.223s (kernel) + 17.442s (initrd) + 11.751s (userspace) = 30.417s. 
Mar 21 14:09:23.698104 sshd[1620]: Accepted publickey for core from 172.24.4.1 port 35612 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk Mar 21 14:09:23.700846 sshd-session[1620]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 14:09:23.713195 systemd-logind[1456]: New session 4 of user core. Mar 21 14:09:23.722428 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 21 14:09:24.315967 sshd[1635]: Connection closed by 172.24.4.1 port 35612 Mar 21 14:09:24.317048 sshd-session[1620]: pam_unix(sshd:session): session closed for user core Mar 21 14:09:24.333360 systemd[1]: sshd@1-172.24.4.61:22-172.24.4.1:35612.service: Deactivated successfully. Mar 21 14:09:24.336596 systemd[1]: session-4.scope: Deactivated successfully. Mar 21 14:09:24.340403 systemd-logind[1456]: Session 4 logged out. Waiting for processes to exit. Mar 21 14:09:24.343747 systemd[1]: Started sshd@2-172.24.4.61:22-172.24.4.1:33330.service - OpenSSH per-connection server daemon (172.24.4.1:33330). Mar 21 14:09:24.347266 systemd-logind[1456]: Removed session 4. Mar 21 14:09:25.553404 sshd[1640]: Accepted publickey for core from 172.24.4.1 port 33330 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk Mar 21 14:09:25.556187 sshd-session[1640]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 14:09:25.566955 systemd-logind[1456]: New session 5 of user core. Mar 21 14:09:25.576405 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 21 14:09:26.197194 sshd[1643]: Connection closed by 172.24.4.1 port 33330 Mar 21 14:09:26.197069 sshd-session[1640]: pam_unix(sshd:session): session closed for user core Mar 21 14:09:26.213049 systemd[1]: sshd@2-172.24.4.61:22-172.24.4.1:33330.service: Deactivated successfully. Mar 21 14:09:26.216299 systemd[1]: session-5.scope: Deactivated successfully. Mar 21 14:09:26.220403 systemd-logind[1456]: Session 5 logged out. Waiting for processes to exit. 
Mar 21 14:09:26.222767 systemd[1]: Started sshd@3-172.24.4.61:22-172.24.4.1:33340.service - OpenSSH per-connection server daemon (172.24.4.1:33340). Mar 21 14:09:26.226321 systemd-logind[1456]: Removed session 5. Mar 21 14:09:27.377476 sshd[1648]: Accepted publickey for core from 172.24.4.1 port 33340 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk Mar 21 14:09:27.380049 sshd-session[1648]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 14:09:27.392218 systemd-logind[1456]: New session 6 of user core. Mar 21 14:09:27.402418 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 21 14:09:27.996418 sshd[1651]: Connection closed by 172.24.4.1 port 33340 Mar 21 14:09:27.997512 sshd-session[1648]: pam_unix(sshd:session): session closed for user core Mar 21 14:09:28.015542 systemd[1]: sshd@3-172.24.4.61:22-172.24.4.1:33340.service: Deactivated successfully. Mar 21 14:09:28.018648 systemd[1]: session-6.scope: Deactivated successfully. Mar 21 14:09:28.020770 systemd-logind[1456]: Session 6 logged out. Waiting for processes to exit. Mar 21 14:09:28.025024 systemd[1]: Started sshd@4-172.24.4.61:22-172.24.4.1:33346.service - OpenSSH per-connection server daemon (172.24.4.1:33346). Mar 21 14:09:28.027366 systemd-logind[1456]: Removed session 6. Mar 21 14:09:29.162762 sshd[1656]: Accepted publickey for core from 172.24.4.1 port 33346 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk Mar 21 14:09:29.165424 sshd-session[1656]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 14:09:29.176684 systemd-logind[1456]: New session 7 of user core. Mar 21 14:09:29.187400 systemd[1]: Started session-7.scope - Session 7 of User core. 
Mar 21 14:09:29.661036 sudo[1660]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 21 14:09:29.661726 sudo[1660]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 21 14:09:29.682185 sudo[1660]: pam_unix(sudo:session): session closed for user root Mar 21 14:09:29.906169 sshd[1659]: Connection closed by 172.24.4.1 port 33346 Mar 21 14:09:29.906357 sshd-session[1656]: pam_unix(sshd:session): session closed for user core Mar 21 14:09:29.927915 systemd[1]: sshd@4-172.24.4.61:22-172.24.4.1:33346.service: Deactivated successfully. Mar 21 14:09:29.931617 systemd[1]: session-7.scope: Deactivated successfully. Mar 21 14:09:29.933921 systemd-logind[1456]: Session 7 logged out. Waiting for processes to exit. Mar 21 14:09:29.938755 systemd[1]: Started sshd@5-172.24.4.61:22-172.24.4.1:33352.service - OpenSSH per-connection server daemon (172.24.4.1:33352). Mar 21 14:09:29.942473 systemd-logind[1456]: Removed session 7. Mar 21 14:09:30.606684 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 21 14:09:30.610683 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 21 14:09:30.954024 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 21 14:09:30.970967 (kubelet)[1676]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 21 14:09:31.039645 kubelet[1676]: E0321 14:09:31.039605 1676 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 21 14:09:31.047382 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 21 14:09:31.047700 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 21 14:09:31.048660 systemd[1]: kubelet.service: Consumed 279ms CPU time, 97.5M memory peak. Mar 21 14:09:31.233397 sshd[1665]: Accepted publickey for core from 172.24.4.1 port 33352 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk Mar 21 14:09:31.235915 sshd-session[1665]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 14:09:31.247656 systemd-logind[1456]: New session 8 of user core. Mar 21 14:09:31.254460 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 21 14:09:31.720100 sudo[1685]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 21 14:09:31.720773 sudo[1685]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 21 14:09:31.728052 sudo[1685]: pam_unix(sudo:session): session closed for user root Mar 21 14:09:31.740376 sudo[1684]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 21 14:09:31.741229 sudo[1684]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 21 14:09:31.763080 systemd[1]: Starting audit-rules.service - Load Audit Rules... 
Mar 21 14:09:31.838567 augenrules[1707]: No rules Mar 21 14:09:31.839752 systemd[1]: audit-rules.service: Deactivated successfully. Mar 21 14:09:31.840203 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 21 14:09:31.842367 sudo[1684]: pam_unix(sudo:session): session closed for user root Mar 21 14:09:32.025232 sshd[1683]: Connection closed by 172.24.4.1 port 33352 Mar 21 14:09:32.026069 sshd-session[1665]: pam_unix(sshd:session): session closed for user core Mar 21 14:09:32.043374 systemd[1]: sshd@5-172.24.4.61:22-172.24.4.1:33352.service: Deactivated successfully. Mar 21 14:09:32.046992 systemd[1]: session-8.scope: Deactivated successfully. Mar 21 14:09:32.050563 systemd-logind[1456]: Session 8 logged out. Waiting for processes to exit. Mar 21 14:09:32.053645 systemd[1]: Started sshd@6-172.24.4.61:22-172.24.4.1:33364.service - OpenSSH per-connection server daemon (172.24.4.1:33364). Mar 21 14:09:32.057051 systemd-logind[1456]: Removed session 8. Mar 21 14:09:33.166157 sshd[1715]: Accepted publickey for core from 172.24.4.1 port 33364 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk Mar 21 14:09:33.168905 sshd-session[1715]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 14:09:33.181422 systemd-logind[1456]: New session 9 of user core. Mar 21 14:09:33.187424 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 21 14:09:33.655256 sudo[1719]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 21 14:09:33.655874 sudo[1719]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 21 14:09:34.356230 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Mar 21 14:09:34.373700 (dockerd)[1738]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 21 14:09:34.992004 dockerd[1738]: time="2025-03-21T14:09:34.991639619Z" level=info msg="Starting up" Mar 21 14:09:34.996855 dockerd[1738]: time="2025-03-21T14:09:34.996823882Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 21 14:09:35.085010 dockerd[1738]: time="2025-03-21T14:09:35.084972405Z" level=info msg="Loading containers: start." Mar 21 14:09:35.265227 kernel: Initializing XFRM netlink socket Mar 21 14:09:35.379679 systemd-networkd[1379]: docker0: Link UP Mar 21 14:09:35.428511 dockerd[1738]: time="2025-03-21T14:09:35.428431733Z" level=info msg="Loading containers: done." Mar 21 14:09:35.446562 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck481783525-merged.mount: Deactivated successfully. Mar 21 14:09:35.453503 dockerd[1738]: time="2025-03-21T14:09:35.453406413Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 21 14:09:35.453645 dockerd[1738]: time="2025-03-21T14:09:35.453583074Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 Mar 21 14:09:35.453864 dockerd[1738]: time="2025-03-21T14:09:35.453809759Z" level=info msg="Daemon has completed initialization" Mar 21 14:09:35.526804 dockerd[1738]: time="2025-03-21T14:09:35.526650622Z" level=info msg="API listen on /run/docker.sock" Mar 21 14:09:35.527479 systemd[1]: Started docker.service - Docker Application Container Engine. 
Mar 21 14:09:37.149555 containerd[1478]: time="2025-03-21T14:09:37.149446676Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.7\"" Mar 21 14:09:37.924471 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3198842317.mount: Deactivated successfully. Mar 21 14:09:39.963255 containerd[1478]: time="2025-03-21T14:09:39.963183388Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:09:39.964654 containerd[1478]: time="2025-03-21T14:09:39.964438482Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.7: active requests=0, bytes read=27959276" Mar 21 14:09:39.966021 containerd[1478]: time="2025-03-21T14:09:39.965967179Z" level=info msg="ImageCreate event name:\"sha256:f084bc047a8cf7c8484d47c51e70e646dde3977d916f282feb99207b7b9241af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:09:39.969421 containerd[1478]: time="2025-03-21T14:09:39.969377816Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:22c19cc70fe5806d0a2cb28a6b6b33fd34e6f9e50616bdf6d53649bcfafbc277\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:09:39.970456 containerd[1478]: time="2025-03-21T14:09:39.970307009Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.7\" with image id \"sha256:f084bc047a8cf7c8484d47c51e70e646dde3977d916f282feb99207b7b9241af\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:22c19cc70fe5806d0a2cb28a6b6b33fd34e6f9e50616bdf6d53649bcfafbc277\", size \"27956068\" in 2.820791855s" Mar 21 14:09:39.970456 containerd[1478]: time="2025-03-21T14:09:39.970343437Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.7\" returns image reference \"sha256:f084bc047a8cf7c8484d47c51e70e646dde3977d916f282feb99207b7b9241af\"" Mar 21 14:09:39.972511 containerd[1478]: 
time="2025-03-21T14:09:39.972414020Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.7\"" Mar 21 14:09:41.105380 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 21 14:09:41.107187 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 21 14:09:41.249011 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 21 14:09:41.252570 (kubelet)[1998]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 21 14:09:41.628035 kubelet[1998]: E0321 14:09:41.627946 1998 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 21 14:09:41.631303 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 21 14:09:41.631654 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 21 14:09:41.632275 systemd[1]: kubelet.service: Consumed 220ms CPU time, 97.7M memory peak. 
Mar 21 14:09:42.421356 containerd[1478]: time="2025-03-21T14:09:42.420682267Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:09:42.423405 containerd[1478]: time="2025-03-21T14:09:42.423267045Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.7: active requests=0, bytes read=24713784" Mar 21 14:09:42.425198 containerd[1478]: time="2025-03-21T14:09:42.425007860Z" level=info msg="ImageCreate event name:\"sha256:652dcad615a9a0c252c253860d5b5b7bfebd3efe159dc033a8555bc15a6d1985\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:09:42.431104 containerd[1478]: time="2025-03-21T14:09:42.431006911Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:6abe7a0accecf29db6ebab18a10f844678ffed693d79e2e51a18a6f2b4530cbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:09:42.433788 containerd[1478]: time="2025-03-21T14:09:42.433717354Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.7\" with image id \"sha256:652dcad615a9a0c252c253860d5b5b7bfebd3efe159dc033a8555bc15a6d1985\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:6abe7a0accecf29db6ebab18a10f844678ffed693d79e2e51a18a6f2b4530cbb\", size \"26201384\" in 2.461259342s" Mar 21 14:09:42.434169 containerd[1478]: time="2025-03-21T14:09:42.433967383Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.7\" returns image reference \"sha256:652dcad615a9a0c252c253860d5b5b7bfebd3efe159dc033a8555bc15a6d1985\"" Mar 21 14:09:42.435075 containerd[1478]: time="2025-03-21T14:09:42.434866159Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.7\"" Mar 21 14:09:44.508132 containerd[1478]: time="2025-03-21T14:09:44.508063470Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-scheduler:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:09:44.509480 containerd[1478]: time="2025-03-21T14:09:44.509424743Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.7: active requests=0, bytes read=18780376" Mar 21 14:09:44.511455 containerd[1478]: time="2025-03-21T14:09:44.511420516Z" level=info msg="ImageCreate event name:\"sha256:7f1f6a63d8aa14cf61d0045e912ad312b4ade24637cecccc933b163582eae68c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:09:44.514239 containerd[1478]: time="2025-03-21T14:09:44.514204327Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:fb80249bcb77ee72b1c9fa5b70bc28a83ed107c9ca71957841ad91db379963bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:09:44.515511 containerd[1478]: time="2025-03-21T14:09:44.514789865Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.7\" with image id \"sha256:7f1f6a63d8aa14cf61d0045e912ad312b4ade24637cecccc933b163582eae68c\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:fb80249bcb77ee72b1c9fa5b70bc28a83ed107c9ca71957841ad91db379963bf\", size \"20267994\" in 2.079867792s" Mar 21 14:09:44.515511 containerd[1478]: time="2025-03-21T14:09:44.514822747Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.7\" returns image reference \"sha256:7f1f6a63d8aa14cf61d0045e912ad312b4ade24637cecccc933b163582eae68c\"" Mar 21 14:09:44.515511 containerd[1478]: time="2025-03-21T14:09:44.515303078Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.7\"" Mar 21 14:09:45.936104 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1276075106.mount: Deactivated successfully. 
Mar 21 14:09:46.482054 containerd[1478]: time="2025-03-21T14:09:46.482012340Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:09:46.483174 containerd[1478]: time="2025-03-21T14:09:46.483138673Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.7: active requests=0, bytes read=30354638" Mar 21 14:09:46.484596 containerd[1478]: time="2025-03-21T14:09:46.484574355Z" level=info msg="ImageCreate event name:\"sha256:dcfc039c372ea285997a302d60e58a75b80905b4c4dba969993b9b22e8ac66d1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:09:46.486895 containerd[1478]: time="2025-03-21T14:09:46.486875151Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e5839270c96c3ad1bea1dce4935126d3281297527f3655408d2970aa4b5cf178\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:09:46.487486 containerd[1478]: time="2025-03-21T14:09:46.487443787Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.7\" with image id \"sha256:dcfc039c372ea285997a302d60e58a75b80905b4c4dba969993b9b22e8ac66d1\", repo tag \"registry.k8s.io/kube-proxy:v1.31.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:e5839270c96c3ad1bea1dce4935126d3281297527f3655408d2970aa4b5cf178\", size \"30353649\" in 1.972116093s" Mar 21 14:09:46.487536 containerd[1478]: time="2025-03-21T14:09:46.487486567Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.7\" returns image reference \"sha256:dcfc039c372ea285997a302d60e58a75b80905b4c4dba969993b9b22e8ac66d1\"" Mar 21 14:09:46.487887 containerd[1478]: time="2025-03-21T14:09:46.487862552Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Mar 21 14:09:47.153056 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3593069969.mount: Deactivated successfully. 
Mar 21 14:09:48.663377 containerd[1478]: time="2025-03-21T14:09:48.663253508Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:09:48.665697 containerd[1478]: time="2025-03-21T14:09:48.665185421Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769" Mar 21 14:09:48.667364 containerd[1478]: time="2025-03-21T14:09:48.667257577Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:09:48.673632 containerd[1478]: time="2025-03-21T14:09:48.673536994Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:09:48.677448 containerd[1478]: time="2025-03-21T14:09:48.677202469Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 2.18929883s" Mar 21 14:09:48.677448 containerd[1478]: time="2025-03-21T14:09:48.677277520Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Mar 21 14:09:48.678876 containerd[1478]: time="2025-03-21T14:09:48.678457713Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Mar 21 14:09:49.220446 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount891734008.mount: Deactivated successfully. 
Mar 21 14:09:49.230664 containerd[1478]: time="2025-03-21T14:09:49.230336139Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 21 14:09:49.232416 containerd[1478]: time="2025-03-21T14:09:49.232303015Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Mar 21 14:09:49.233975 containerd[1478]: time="2025-03-21T14:09:49.233798118Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 21 14:09:49.238535 containerd[1478]: time="2025-03-21T14:09:49.238400656Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 21 14:09:49.240652 containerd[1478]: time="2025-03-21T14:09:49.240391495Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 561.870917ms" Mar 21 14:09:49.240652 containerd[1478]: time="2025-03-21T14:09:49.240458100Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Mar 21 14:09:49.242240 containerd[1478]: time="2025-03-21T14:09:49.242100767Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Mar 21 14:09:49.896422 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3549338643.mount: 
Deactivated successfully. Mar 21 14:09:51.855765 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Mar 21 14:09:51.858267 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 21 14:09:51.976811 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 21 14:09:51.987212 (kubelet)[2129]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 21 14:09:52.210134 kubelet[2129]: E0321 14:09:52.210009 2129 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 21 14:09:52.213582 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 21 14:09:52.213726 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 21 14:09:52.214493 systemd[1]: kubelet.service: Consumed 130ms CPU time, 95.7M memory peak. 
Mar 21 14:09:52.723983 containerd[1478]: time="2025-03-21T14:09:52.722084592Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 14:09:52.726606 containerd[1478]: time="2025-03-21T14:09:52.726525117Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56779981"
Mar 21 14:09:52.793065 containerd[1478]: time="2025-03-21T14:09:52.792968925Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 14:09:52.803283 containerd[1478]: time="2025-03-21T14:09:52.803039640Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 14:09:52.807049 containerd[1478]: time="2025-03-21T14:09:52.806832460Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 3.564610823s"
Mar 21 14:09:52.807049 containerd[1478]: time="2025-03-21T14:09:52.806899096Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
Mar 21 14:09:56.874328 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 21 14:09:56.874509 systemd[1]: kubelet.service: Consumed 130ms CPU time, 95.7M memory peak.
Mar 21 14:09:56.877921 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 21 14:09:56.936671 systemd[1]: Reload requested from client PID 2165 ('systemctl') (unit session-9.scope)...
Mar 21 14:09:56.936707 systemd[1]: Reloading...
Mar 21 14:09:57.044137 zram_generator::config[2211]: No configuration found.
Mar 21 14:09:57.243415 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 21 14:09:57.361986 systemd[1]: Reloading finished in 424 ms.
Mar 21 14:09:57.412681 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 21 14:09:57.412761 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 21 14:09:57.413095 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 21 14:09:57.413150 systemd[1]: kubelet.service: Consumed 132ms CPU time, 83.5M memory peak.
Mar 21 14:09:57.414682 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 21 14:09:57.559967 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 21 14:09:57.575531 (kubelet)[2278]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 21 14:09:57.614196 kubelet[2278]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 21 14:09:57.614196 kubelet[2278]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 21 14:09:57.614196 kubelet[2278]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 21 14:09:57.614995 kubelet[2278]: I0321 14:09:57.614253 2278 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 21 14:09:58.052616 kubelet[2278]: I0321 14:09:58.052554 2278 server.go:486] "Kubelet version" kubeletVersion="v1.31.0"
Mar 21 14:09:58.052616 kubelet[2278]: I0321 14:09:58.052583 2278 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 21 14:09:58.056155 kubelet[2278]: I0321 14:09:58.054863 2278 server.go:929] "Client rotation is on, will bootstrap in background"
Mar 21 14:09:58.093946 kubelet[2278]: I0321 14:09:58.093644 2278 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 21 14:09:58.096553 kubelet[2278]: E0321 14:09:58.096473 2278 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://172.24.4.61:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.24.4.61:6443: connect: connection refused" logger="UnhandledError"
Mar 21 14:09:58.105151 kubelet[2278]: I0321 14:09:58.105059 2278 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 21 14:09:58.109440 kubelet[2278]: I0321 14:09:58.109295 2278 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 21 14:09:58.109440 kubelet[2278]: I0321 14:09:58.109387 2278 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 21 14:09:58.109662 kubelet[2278]: I0321 14:09:58.109502 2278 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 21 14:09:58.109747 kubelet[2278]: I0321 14:09:58.109527 2278 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-9999-0-3-a-8593155e6d.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 21 14:09:58.109747 kubelet[2278]: I0321 14:09:58.109703 2278 topology_manager.go:138] "Creating topology manager with none policy"
Mar 21 14:09:58.109747 kubelet[2278]: I0321 14:09:58.109712 2278 container_manager_linux.go:300] "Creating device plugin manager"
Mar 21 14:09:58.110075 kubelet[2278]: I0321 14:09:58.109801 2278 state_mem.go:36] "Initialized new in-memory state store"
Mar 21 14:09:58.113263 kubelet[2278]: I0321 14:09:58.113191 2278 kubelet.go:408] "Attempting to sync node with API server"
Mar 21 14:09:58.113263 kubelet[2278]: I0321 14:09:58.113209 2278 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 21 14:09:58.113263 kubelet[2278]: I0321 14:09:58.113233 2278 kubelet.go:314] "Adding apiserver pod source"
Mar 21 14:09:58.113263 kubelet[2278]: I0321 14:09:58.113250 2278 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 21 14:09:58.117047 kubelet[2278]: W0321 14:09:58.116956 2278 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.61:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-3-a-8593155e6d.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.61:6443: connect: connection refused
Mar 21 14:09:58.119155 kubelet[2278]: E0321 14:09:58.117312 2278 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.24.4.61:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-3-a-8593155e6d.novalocal&limit=500&resourceVersion=0\": dial tcp 172.24.4.61:6443: connect: connection refused" logger="UnhandledError"
Mar 21 14:09:58.120984 kubelet[2278]: W0321 14:09:58.120926 2278 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.61:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.24.4.61:6443: connect: connection refused
Mar 21 14:09:58.120984 kubelet[2278]: E0321 14:09:58.120980 2278 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.24.4.61:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.24.4.61:6443: connect: connection refused" logger="UnhandledError"
Mar 21 14:09:58.121382 kubelet[2278]: I0321 14:09:58.121349 2278 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
Mar 21 14:09:58.126573 kubelet[2278]: I0321 14:09:58.126463 2278 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 21 14:09:58.127954 kubelet[2278]: W0321 14:09:58.127648 2278 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 21 14:09:58.129668 kubelet[2278]: I0321 14:09:58.129502 2278 server.go:1269] "Started kubelet"
Mar 21 14:09:58.135701 kubelet[2278]: I0321 14:09:58.134825 2278 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 21 14:09:58.136883 kubelet[2278]: I0321 14:09:58.136833 2278 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 21 14:09:58.137219 kubelet[2278]: I0321 14:09:58.137206 2278 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 21 14:09:58.139251 kubelet[2278]: I0321 14:09:58.139216 2278 server.go:460] "Adding debug handlers to kubelet server"
Mar 21 14:09:58.139455 kubelet[2278]: I0321 14:09:58.139442 2278 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 21 14:09:58.150273 kubelet[2278]: I0321 14:09:58.150250 2278 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 21 14:09:58.150849 kubelet[2278]: E0321 14:09:58.144001 2278 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.24.4.61:6443/api/v1/namespaces/default/events\": dial tcp 172.24.4.61:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-9999-0-3-a-8593155e6d.novalocal.182ed6b9851c5bc6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-9999-0-3-a-8593155e6d.novalocal,UID:ci-9999-0-3-a-8593155e6d.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-9999-0-3-a-8593155e6d.novalocal,},FirstTimestamp:2025-03-21 14:09:58.129482694 +0000 UTC m=+0.550700083,LastTimestamp:2025-03-21 14:09:58.129482694 +0000 UTC m=+0.550700083,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-9999-0-3-a-8593155e6d.novalocal,}"
Mar 21 14:09:58.152177 kubelet[2278]: I0321 14:09:58.152163 2278 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 21 14:09:58.153381 kubelet[2278]: E0321 14:09:58.153361 2278 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-0-3-a-8593155e6d.novalocal\" not found"
Mar 21 14:09:58.156284 kubelet[2278]: E0321 14:09:58.156205 2278 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.61:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-3-a-8593155e6d.novalocal?timeout=10s\": dial tcp 172.24.4.61:6443: connect: connection refused" interval="200ms"
Mar 21 14:09:58.156663 kubelet[2278]: I0321 14:09:58.156647 2278 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 21 14:09:58.156789 kubelet[2278]: I0321 14:09:58.156778 2278 reconciler.go:26] "Reconciler: start to sync state"
Mar 21 14:09:58.159107 kubelet[2278]: W0321 14:09:58.159050 2278 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.61:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.61:6443: connect: connection refused
Mar 21 14:09:58.159201 kubelet[2278]: E0321 14:09:58.159176 2278 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.24.4.61:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.24.4.61:6443: connect: connection refused" logger="UnhandledError"
Mar 21 14:09:58.159605 kubelet[2278]: I0321 14:09:58.159561 2278 factory.go:221] Registration of the containerd container factory successfully
Mar 21 14:09:58.159605 kubelet[2278]: I0321 14:09:58.159601 2278 factory.go:221] Registration of the systemd container factory successfully
Mar 21 14:09:58.159809 kubelet[2278]: I0321 14:09:58.159750 2278 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 21 14:09:58.171561 kubelet[2278]: I0321 14:09:58.171423 2278 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 21 14:09:58.172781 kubelet[2278]: I0321 14:09:58.172517 2278 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 21 14:09:58.172781 kubelet[2278]: I0321 14:09:58.172537 2278 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 21 14:09:58.172781 kubelet[2278]: I0321 14:09:58.172558 2278 kubelet.go:2321] "Starting kubelet main sync loop"
Mar 21 14:09:58.172781 kubelet[2278]: E0321 14:09:58.172593 2278 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 21 14:09:58.182273 kubelet[2278]: W0321 14:09:58.181875 2278 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.61:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.61:6443: connect: connection refused
Mar 21 14:09:58.182273 kubelet[2278]: E0321 14:09:58.181925 2278 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.24.4.61:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.24.4.61:6443: connect: connection refused" logger="UnhandledError"
Mar 21 14:09:58.192230 kubelet[2278]: E0321 14:09:58.192214 2278 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 21 14:09:58.197055 kubelet[2278]: I0321 14:09:58.197027 2278 cpu_manager.go:214] "Starting CPU manager" policy="none"
Mar 21 14:09:58.197201 kubelet[2278]: I0321 14:09:58.197095 2278 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Mar 21 14:09:58.197201 kubelet[2278]: I0321 14:09:58.197159 2278 state_mem.go:36] "Initialized new in-memory state store"
Mar 21 14:09:58.202006 kubelet[2278]: I0321 14:09:58.201980 2278 policy_none.go:49] "None policy: Start"
Mar 21 14:09:58.202533 kubelet[2278]: I0321 14:09:58.202520 2278 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 21 14:09:58.202950 kubelet[2278]: I0321 14:09:58.202656 2278 state_mem.go:35] "Initializing new in-memory state store"
Mar 21 14:09:58.210010 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 21 14:09:58.221106 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 21 14:09:58.224617 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 21 14:09:58.234713 kubelet[2278]: I0321 14:09:58.234694 2278 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 21 14:09:58.235297 kubelet[2278]: I0321 14:09:58.234919 2278 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 21 14:09:58.235297 kubelet[2278]: I0321 14:09:58.234933 2278 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 21 14:09:58.235297 kubelet[2278]: I0321 14:09:58.235139 2278 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 21 14:09:58.236660 kubelet[2278]: E0321 14:09:58.236643 2278 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-9999-0-3-a-8593155e6d.novalocal\" not found"
Mar 21 14:09:58.285962 systemd[1]: Created slice kubepods-burstable-poda9b44657dfb543dac125ddc639996f18.slice - libcontainer container kubepods-burstable-poda9b44657dfb543dac125ddc639996f18.slice.
Mar 21 14:09:58.313938 systemd[1]: Created slice kubepods-burstable-pod6db87c0627711911dce972e3cd1df479.slice - libcontainer container kubepods-burstable-pod6db87c0627711911dce972e3cd1df479.slice.
Mar 21 14:09:58.324277 systemd[1]: Created slice kubepods-burstable-pod04724a29ffa5e1f6acd7d2bc11fcedad.slice - libcontainer container kubepods-burstable-pod04724a29ffa5e1f6acd7d2bc11fcedad.slice.
Mar 21 14:09:58.339327 kubelet[2278]: I0321 14:09:58.339287 2278 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:58.339973 kubelet[2278]: E0321 14:09:58.339943 2278 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.24.4.61:6443/api/v1/nodes\": dial tcp 172.24.4.61:6443: connect: connection refused" node="ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:58.357927 kubelet[2278]: I0321 14:09:58.357691 2278 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/04724a29ffa5e1f6acd7d2bc11fcedad-kubeconfig\") pod \"kube-scheduler-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"04724a29ffa5e1f6acd7d2bc11fcedad\") " pod="kube-system/kube-scheduler-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:58.357927 kubelet[2278]: I0321 14:09:58.357722 2278 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6db87c0627711911dce972e3cd1df479-ca-certs\") pod \"kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"6db87c0627711911dce972e3cd1df479\") " pod="kube-system/kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:58.357927 kubelet[2278]: I0321 14:09:58.357742 2278 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6db87c0627711911dce972e3cd1df479-k8s-certs\") pod \"kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"6db87c0627711911dce972e3cd1df479\") " pod="kube-system/kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:58.357927 kubelet[2278]: E0321 14:09:58.357732 2278 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.61:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-3-a-8593155e6d.novalocal?timeout=10s\": dial tcp 172.24.4.61:6443: connect: connection refused" interval="400ms"
Mar 21 14:09:58.357927 kubelet[2278]: I0321 14:09:58.357760 2278 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6db87c0627711911dce972e3cd1df479-usr-share-ca-certificates\") pod \"kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"6db87c0627711911dce972e3cd1df479\") " pod="kube-system/kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:58.358101 kubelet[2278]: I0321 14:09:58.357808 2278 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a9b44657dfb543dac125ddc639996f18-ca-certs\") pod \"kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"a9b44657dfb543dac125ddc639996f18\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:58.358101 kubelet[2278]: I0321 14:09:58.357829 2278 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a9b44657dfb543dac125ddc639996f18-flexvolume-dir\") pod \"kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"a9b44657dfb543dac125ddc639996f18\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:58.358101 kubelet[2278]: I0321 14:09:58.357847 2278 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a9b44657dfb543dac125ddc639996f18-k8s-certs\") pod \"kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"a9b44657dfb543dac125ddc639996f18\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:58.358101 kubelet[2278]: I0321 14:09:58.357865 2278 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9b44657dfb543dac125ddc639996f18-kubeconfig\") pod \"kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"a9b44657dfb543dac125ddc639996f18\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:58.358230 kubelet[2278]: I0321 14:09:58.357885 2278 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a9b44657dfb543dac125ddc639996f18-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"a9b44657dfb543dac125ddc639996f18\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:58.544608 kubelet[2278]: I0321 14:09:58.544422 2278 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:58.545220 kubelet[2278]: E0321 14:09:58.545085 2278 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.24.4.61:6443/api/v1/nodes\": dial tcp 172.24.4.61:6443: connect: connection refused" node="ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:58.612102 containerd[1478]: time="2025-03-21T14:09:58.611761222Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal,Uid:a9b44657dfb543dac125ddc639996f18,Namespace:kube-system,Attempt:0,}"
Mar 21 14:09:58.618457 containerd[1478]: time="2025-03-21T14:09:58.618304301Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal,Uid:6db87c0627711911dce972e3cd1df479,Namespace:kube-system,Attempt:0,}"
Mar 21 14:09:58.628095 containerd[1478]: time="2025-03-21T14:09:58.627951573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-9999-0-3-a-8593155e6d.novalocal,Uid:04724a29ffa5e1f6acd7d2bc11fcedad,Namespace:kube-system,Attempt:0,}"
Mar 21 14:09:58.759938 kubelet[2278]: E0321 14:09:58.759731 2278 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.61:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-3-a-8593155e6d.novalocal?timeout=10s\": dial tcp 172.24.4.61:6443: connect: connection refused" interval="800ms"
Mar 21 14:09:58.948546 kubelet[2278]: I0321 14:09:58.948379 2278 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:58.949766 kubelet[2278]: E0321 14:09:58.949699 2278 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.24.4.61:6443/api/v1/nodes\": dial tcp 172.24.4.61:6443: connect: connection refused" node="ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:59.368515 kubelet[2278]: W0321 14:09:59.368303 2278 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.61:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-3-a-8593155e6d.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.61:6443: connect: connection refused
Mar 21 14:09:59.368515 kubelet[2278]: E0321 14:09:59.368447 2278 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.24.4.61:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-3-a-8593155e6d.novalocal&limit=500&resourceVersion=0\": dial tcp 172.24.4.61:6443: connect: connection refused" logger="UnhandledError"
Mar 21 14:09:59.377966 containerd[1478]: time="2025-03-21T14:09:59.377627920Z" level=info msg="connecting to shim eb5fb4fea642e6c494ed7b92257f8dfcc6e5a884f1d587b308ee1e26add59e41" address="unix:///run/containerd/s/88c9d77dea06e93d8d96f15a2f63660fb59b7b50cd71525e4333efead644fe4d" namespace=k8s.io protocol=ttrpc version=3
Mar 21 14:09:59.421751 kubelet[2278]: W0321 14:09:59.421669 2278 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.61:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.61:6443: connect: connection refused
Mar 21 14:09:59.421751 kubelet[2278]: E0321 14:09:59.421719 2278 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.24.4.61:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.24.4.61:6443: connect: connection refused" logger="UnhandledError"
Mar 21 14:09:59.438713 systemd[1]: Started cri-containerd-eb5fb4fea642e6c494ed7b92257f8dfcc6e5a884f1d587b308ee1e26add59e41.scope - libcontainer container eb5fb4fea642e6c494ed7b92257f8dfcc6e5a884f1d587b308ee1e26add59e41.
Mar 21 14:09:59.440874 containerd[1478]: time="2025-03-21T14:09:59.440831338Z" level=info msg="connecting to shim b0223697fb2317e1170d054c009fe08cde1bfc2cc25ca5f08142cddf43a81ce4" address="unix:///run/containerd/s/7633cdb1f101e6c0fa6fa052c94593dc77dc8bfc4191fa56c35b8c90010c4e98" namespace=k8s.io protocol=ttrpc version=3
Mar 21 14:09:59.442158 containerd[1478]: time="2025-03-21T14:09:59.441241365Z" level=info msg="connecting to shim 625fe4d6ecf88593ce35fad55b936e4ba145d9e725a40e72ef0a3a477ca60713" address="unix:///run/containerd/s/9e604b5506aa9bc2111fbe169f7319e63a52c957017d5eb9f539a3cff3e62a0e" namespace=k8s.io protocol=ttrpc version=3
Mar 21 14:09:59.478238 systemd[1]: Started cri-containerd-b0223697fb2317e1170d054c009fe08cde1bfc2cc25ca5f08142cddf43a81ce4.scope - libcontainer container b0223697fb2317e1170d054c009fe08cde1bfc2cc25ca5f08142cddf43a81ce4.
Mar 21 14:09:59.500276 systemd[1]: Started cri-containerd-625fe4d6ecf88593ce35fad55b936e4ba145d9e725a40e72ef0a3a477ca60713.scope - libcontainer container 625fe4d6ecf88593ce35fad55b936e4ba145d9e725a40e72ef0a3a477ca60713.
Mar 21 14:09:59.516642 kubelet[2278]: W0321 14:09:59.516470 2278 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.61:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.61:6443: connect: connection refused
Mar 21 14:09:59.516793 kubelet[2278]: E0321 14:09:59.516770 2278 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.24.4.61:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.24.4.61:6443: connect: connection refused" logger="UnhandledError"
Mar 21 14:09:59.531854 containerd[1478]: time="2025-03-21T14:09:59.531812068Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal,Uid:a9b44657dfb543dac125ddc639996f18,Namespace:kube-system,Attempt:0,} returns sandbox id \"eb5fb4fea642e6c494ed7b92257f8dfcc6e5a884f1d587b308ee1e26add59e41\""
Mar 21 14:09:59.536233 containerd[1478]: time="2025-03-21T14:09:59.536188485Z" level=info msg="CreateContainer within sandbox \"eb5fb4fea642e6c494ed7b92257f8dfcc6e5a884f1d587b308ee1e26add59e41\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 21 14:09:59.551867 containerd[1478]: time="2025-03-21T14:09:59.551826366Z" level=info msg="Container 7930f9ee33bb80866962c723739e6d9b887f2a2c840f7edd1e3bbd440055888d: CDI devices from CRI Config.CDIDevices: []"
Mar 21 14:09:59.553226 kubelet[2278]: W0321 14:09:59.552585 2278 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.61:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.24.4.61:6443: connect: connection refused
Mar 21 14:09:59.553226 kubelet[2278]: E0321 14:09:59.552650 2278 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.24.4.61:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.24.4.61:6443: connect: connection refused" logger="UnhandledError"
Mar 21 14:09:59.560995 kubelet[2278]: E0321 14:09:59.560942 2278 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.61:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-3-a-8593155e6d.novalocal?timeout=10s\": dial tcp 172.24.4.61:6443: connect: connection refused" interval="1.6s"
Mar 21 14:09:59.564244 containerd[1478]: time="2025-03-21T14:09:59.564159126Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal,Uid:6db87c0627711911dce972e3cd1df479,Namespace:kube-system,Attempt:0,} returns sandbox id \"b0223697fb2317e1170d054c009fe08cde1bfc2cc25ca5f08142cddf43a81ce4\""
Mar 21 14:09:59.567406 containerd[1478]: time="2025-03-21T14:09:59.567230036Z" level=info msg="CreateContainer within sandbox \"eb5fb4fea642e6c494ed7b92257f8dfcc6e5a884f1d587b308ee1e26add59e41\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"7930f9ee33bb80866962c723739e6d9b887f2a2c840f7edd1e3bbd440055888d\""
Mar 21 14:09:59.567406 containerd[1478]: time="2025-03-21T14:09:59.567310350Z" level=info msg="CreateContainer within sandbox \"b0223697fb2317e1170d054c009fe08cde1bfc2cc25ca5f08142cddf43a81ce4\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 21 14:09:59.583027 containerd[1478]: time="2025-03-21T14:09:59.582479790Z" level=info msg="Container 7773f1a0d78b5f8b3c77d02df6e96962f9a845d32bced4a0430e7729f964bb8d: CDI devices from CRI Config.CDIDevices: []"
Mar 21 14:09:59.587003 containerd[1478]: time="2025-03-21T14:09:59.586975672Z" level=info msg="StartContainer for \"7930f9ee33bb80866962c723739e6d9b887f2a2c840f7edd1e3bbd440055888d\""
Mar 21 14:09:59.588725 containerd[1478]: time="2025-03-21T14:09:59.588638120Z" level=info msg="connecting to shim 7930f9ee33bb80866962c723739e6d9b887f2a2c840f7edd1e3bbd440055888d" address="unix:///run/containerd/s/88c9d77dea06e93d8d96f15a2f63660fb59b7b50cd71525e4333efead644fe4d" protocol=ttrpc version=3
Mar 21 14:09:59.593586 containerd[1478]: time="2025-03-21T14:09:59.593454478Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-9999-0-3-a-8593155e6d.novalocal,Uid:04724a29ffa5e1f6acd7d2bc11fcedad,Namespace:kube-system,Attempt:0,} returns sandbox id \"625fe4d6ecf88593ce35fad55b936e4ba145d9e725a40e72ef0a3a477ca60713\""
Mar 21 14:09:59.596150 containerd[1478]: time="2025-03-21T14:09:59.595855574Z" level=info msg="CreateContainer within sandbox \"b0223697fb2317e1170d054c009fe08cde1bfc2cc25ca5f08142cddf43a81ce4\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"7773f1a0d78b5f8b3c77d02df6e96962f9a845d32bced4a0430e7729f964bb8d\""
Mar 21 14:09:59.596482 containerd[1478]: time="2025-03-21T14:09:59.596454641Z" level=info msg="StartContainer for \"7773f1a0d78b5f8b3c77d02df6e96962f9a845d32bced4a0430e7729f964bb8d\""
Mar 21 14:09:59.598194 containerd[1478]: time="2025-03-21T14:09:59.598166148Z" level=info msg="CreateContainer within sandbox \"625fe4d6ecf88593ce35fad55b936e4ba145d9e725a40e72ef0a3a477ca60713\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 21 14:09:59.599797 containerd[1478]: time="2025-03-21T14:09:59.599743774Z" level=info msg="connecting to shim 7773f1a0d78b5f8b3c77d02df6e96962f9a845d32bced4a0430e7729f964bb8d" address="unix:///run/containerd/s/7633cdb1f101e6c0fa6fa052c94593dc77dc8bfc4191fa56c35b8c90010c4e98" protocol=ttrpc version=3
Mar 21 14:09:59.614480 systemd[1]: Started cri-containerd-7930f9ee33bb80866962c723739e6d9b887f2a2c840f7edd1e3bbd440055888d.scope - libcontainer container 7930f9ee33bb80866962c723739e6d9b887f2a2c840f7edd1e3bbd440055888d.
Mar 21 14:09:59.616393 containerd[1478]: time="2025-03-21T14:09:59.615641543Z" level=info msg="Container bfbc57092d08dc3985b07420ff29aa362fff7c30abdac564426d2450dfd04706: CDI devices from CRI Config.CDIDevices: []"
Mar 21 14:09:59.627296 systemd[1]: Started cri-containerd-7773f1a0d78b5f8b3c77d02df6e96962f9a845d32bced4a0430e7729f964bb8d.scope - libcontainer container 7773f1a0d78b5f8b3c77d02df6e96962f9a845d32bced4a0430e7729f964bb8d.
Mar 21 14:09:59.634811 containerd[1478]: time="2025-03-21T14:09:59.634767354Z" level=info msg="CreateContainer within sandbox \"625fe4d6ecf88593ce35fad55b936e4ba145d9e725a40e72ef0a3a477ca60713\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"bfbc57092d08dc3985b07420ff29aa362fff7c30abdac564426d2450dfd04706\""
Mar 21 14:09:59.636262 containerd[1478]: time="2025-03-21T14:09:59.635423643Z" level=info msg="StartContainer for \"bfbc57092d08dc3985b07420ff29aa362fff7c30abdac564426d2450dfd04706\""
Mar 21 14:09:59.636657 containerd[1478]: time="2025-03-21T14:09:59.636625574Z" level=info msg="connecting to shim bfbc57092d08dc3985b07420ff29aa362fff7c30abdac564426d2450dfd04706" address="unix:///run/containerd/s/9e604b5506aa9bc2111fbe169f7319e63a52c957017d5eb9f539a3cff3e62a0e" protocol=ttrpc version=3
Mar 21 14:09:59.668281 systemd[1]: Started cri-containerd-bfbc57092d08dc3985b07420ff29aa362fff7c30abdac564426d2450dfd04706.scope - libcontainer container bfbc57092d08dc3985b07420ff29aa362fff7c30abdac564426d2450dfd04706.
Mar 21 14:09:59.710381 containerd[1478]: time="2025-03-21T14:09:59.710328134Z" level=info msg="StartContainer for \"7930f9ee33bb80866962c723739e6d9b887f2a2c840f7edd1e3bbd440055888d\" returns successfully"
Mar 21 14:09:59.724888 containerd[1478]: time="2025-03-21T14:09:59.724848056Z" level=info msg="StartContainer for \"7773f1a0d78b5f8b3c77d02df6e96962f9a845d32bced4a0430e7729f964bb8d\" returns successfully"
Mar 21 14:09:59.756701 kubelet[2278]: I0321 14:09:59.753703    2278 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:59.756701 kubelet[2278]: E0321 14:09:59.754010    2278 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.24.4.61:6443/api/v1/nodes\": dial tcp 172.24.4.61:6443: connect: connection refused" node="ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:09:59.768606 containerd[1478]: time="2025-03-21T14:09:59.768572487Z" level=info msg="StartContainer for \"bfbc57092d08dc3985b07420ff29aa362fff7c30abdac564426d2450dfd04706\" returns successfully"
Mar 21 14:10:01.070245 update_engine[1457]: I20250321 14:10:01.070168  1457 update_attempter.cc:509] Updating boot flags...
Mar 21 14:10:01.111924 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (2552)
Mar 21 14:10:01.257825 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (2555)
Mar 21 14:10:01.357088 kubelet[2278]: I0321 14:10:01.356739    2278 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:01.993665 kubelet[2278]: E0321 14:10:01.993627    2278 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-9999-0-3-a-8593155e6d.novalocal\" not found" node="ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:02.037123 kubelet[2278]: I0321 14:10:02.037080    2278 kubelet_node_status.go:75] "Successfully registered node" node="ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:02.037267 kubelet[2278]: E0321 14:10:02.037138    2278 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ci-9999-0-3-a-8593155e6d.novalocal\": node \"ci-9999-0-3-a-8593155e6d.novalocal\" not found"
Mar 21 14:10:02.053405 kubelet[2278]: E0321 14:10:02.053365    2278 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-0-3-a-8593155e6d.novalocal\" not found"
Mar 21 14:10:02.154282 kubelet[2278]: E0321 14:10:02.154247    2278 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-0-3-a-8593155e6d.novalocal\" not found"
Mar 21 14:10:02.256102 kubelet[2278]: E0321 14:10:02.255233    2278 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-0-3-a-8593155e6d.novalocal\" not found"
Mar 21 14:10:03.120330 kubelet[2278]: I0321 14:10:03.119813    2278 apiserver.go:52] "Watching apiserver"
Mar 21 14:10:03.157722 kubelet[2278]: I0321 14:10:03.157560    2278 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 21 14:10:04.546283 kubelet[2278]: W0321 14:10:04.545610    2278 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 21 14:10:04.795897 systemd[1]: Reload requested from client PID 2560 ('systemctl') (unit session-9.scope)...
Mar 21 14:10:04.795935 systemd[1]: Reloading...
Mar 21 14:10:04.921149 zram_generator::config[2609]: No configuration found.
Mar 21 14:10:05.075711 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 21 14:10:05.214712 systemd[1]: Reloading finished in 418 ms.
Mar 21 14:10:05.237931 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 21 14:10:05.250431 systemd[1]: kubelet.service: Deactivated successfully.
Mar 21 14:10:05.250722 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 21 14:10:05.250847 systemd[1]: kubelet.service: Consumed 1.061s CPU time, 115.9M memory peak.
Mar 21 14:10:05.253268 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 21 14:10:05.370832 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 21 14:10:05.381442 (kubelet)[2670]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 21 14:10:05.426874 kubelet[2670]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 21 14:10:05.428833 kubelet[2670]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 21 14:10:05.428833 kubelet[2670]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 21 14:10:05.428833 kubelet[2670]: I0321 14:10:05.427290    2670 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 21 14:10:05.433983 kubelet[2670]: I0321 14:10:05.433853    2670 server.go:486] "Kubelet version" kubeletVersion="v1.31.0"
Mar 21 14:10:05.433983 kubelet[2670]: I0321 14:10:05.433874    2670 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 21 14:10:05.434418 kubelet[2670]: I0321 14:10:05.434336    2670 server.go:929] "Client rotation is on, will bootstrap in background"
Mar 21 14:10:05.436969 kubelet[2670]: I0321 14:10:05.436721    2670 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 21 14:10:05.440539 kubelet[2670]: I0321 14:10:05.440022    2670 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 21 14:10:05.447611 kubelet[2670]: I0321 14:10:05.447219    2670 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 21 14:10:05.450745 kubelet[2670]: I0321 14:10:05.450345    2670 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 21 14:10:05.450745 kubelet[2670]: I0321 14:10:05.450499    2670 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 21 14:10:05.450745 kubelet[2670]: I0321 14:10:05.450627    2670 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 21 14:10:05.450977 kubelet[2670]: I0321 14:10:05.450649    2670 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-9999-0-3-a-8593155e6d.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 21 14:10:05.451134 kubelet[2670]: I0321 14:10:05.451122    2670 topology_manager.go:138] "Creating topology manager with none policy"
Mar 21 14:10:05.451191 kubelet[2670]: I0321 14:10:05.451184    2670 container_manager_linux.go:300] "Creating device plugin manager"
Mar 21 14:10:05.451298 kubelet[2670]: I0321 14:10:05.451287    2670 state_mem.go:36] "Initialized new in-memory state store"
Mar 21 14:10:05.451449 kubelet[2670]: I0321 14:10:05.451438    2670 kubelet.go:408] "Attempting to sync node with API server"
Mar 21 14:10:05.451521 kubelet[2670]: I0321 14:10:05.451512    2670 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 21 14:10:05.451598 kubelet[2670]: I0321 14:10:05.451589    2670 kubelet.go:314] "Adding apiserver pod source"
Mar 21 14:10:05.451668 kubelet[2670]: I0321 14:10:05.451659    2670 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 21 14:10:05.452572 kubelet[2670]: I0321 14:10:05.452548    2670 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
Mar 21 14:10:05.452968 kubelet[2670]: I0321 14:10:05.452947    2670 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 21 14:10:05.453401 kubelet[2670]: I0321 14:10:05.453380    2670 server.go:1269] "Started kubelet"
Mar 21 14:10:05.455647 kubelet[2670]: I0321 14:10:05.455622    2670 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 21 14:10:05.462793 kubelet[2670]: I0321 14:10:05.462756    2670 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 21 14:10:05.466122 kubelet[2670]: I0321 14:10:05.464467    2670 server.go:460] "Adding debug handlers to kubelet server"
Mar 21 14:10:05.468814 kubelet[2670]: I0321 14:10:05.467325    2670 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 21 14:10:05.468814 kubelet[2670]: I0321 14:10:05.467623    2670 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 21 14:10:05.468814 kubelet[2670]: I0321 14:10:05.468006    2670 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 21 14:10:05.472126 kubelet[2670]: I0321 14:10:05.471055    2670 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 21 14:10:05.472126 kubelet[2670]: E0321 14:10:05.471305    2670 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-0-3-a-8593155e6d.novalocal\" not found"
Mar 21 14:10:05.475121 kubelet[2670]: I0321 14:10:05.474305    2670 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 21 14:10:05.475121 kubelet[2670]: I0321 14:10:05.474423    2670 reconciler.go:26] "Reconciler: start to sync state"
Mar 21 14:10:05.480143 kubelet[2670]: I0321 14:10:05.476884    2670 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 21 14:10:05.480143 kubelet[2670]: I0321 14:10:05.477742    2670 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 21 14:10:05.480143 kubelet[2670]: I0321 14:10:05.477761    2670 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 21 14:10:05.480143 kubelet[2670]: I0321 14:10:05.477780    2670 kubelet.go:2321] "Starting kubelet main sync loop"
Mar 21 14:10:05.480143 kubelet[2670]: E0321 14:10:05.477812    2670 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 21 14:10:05.493853 kubelet[2670]: I0321 14:10:05.493503    2670 factory.go:221] Registration of the systemd container factory successfully
Mar 21 14:10:05.494058 kubelet[2670]: I0321 14:10:05.494039    2670 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 21 14:10:05.495825 kubelet[2670]: I0321 14:10:05.495808    2670 factory.go:221] Registration of the containerd container factory successfully
Mar 21 14:10:05.497802 kubelet[2670]: E0321 14:10:05.497764    2670 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 21 14:10:05.547338 kubelet[2670]: I0321 14:10:05.547308    2670 cpu_manager.go:214] "Starting CPU manager" policy="none"
Mar 21 14:10:05.547338 kubelet[2670]: I0321 14:10:05.547326    2670 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Mar 21 14:10:05.547491 kubelet[2670]: I0321 14:10:05.547394    2670 state_mem.go:36] "Initialized new in-memory state store"
Mar 21 14:10:05.547583 kubelet[2670]: I0321 14:10:05.547553    2670 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 21 14:10:05.547621 kubelet[2670]: I0321 14:10:05.547574    2670 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 21 14:10:05.547621 kubelet[2670]: I0321 14:10:05.547600    2670 policy_none.go:49] "None policy: Start"
Mar 21 14:10:05.548698 kubelet[2670]: I0321 14:10:05.548678    2670 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 21 14:10:05.548698 kubelet[2670]: I0321 14:10:05.548700    2670 state_mem.go:35] "Initializing new in-memory state store"
Mar 21 14:10:05.548855 kubelet[2670]: I0321 14:10:05.548836    2670 state_mem.go:75] "Updated machine memory state"
Mar 21 14:10:05.557274 kubelet[2670]: I0321 14:10:05.557244    2670 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 21 14:10:05.557598 kubelet[2670]: I0321 14:10:05.557578    2670 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 21 14:10:05.557645 kubelet[2670]: I0321 14:10:05.557594    2670 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 21 14:10:05.557969 kubelet[2670]: I0321 14:10:05.557879    2670 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 21 14:10:05.585221 kubelet[2670]: W0321 14:10:05.585195    2670 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 21 14:10:05.606105 kubelet[2670]: W0321 14:10:05.605661    2670 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 21 14:10:05.606105 kubelet[2670]: W0321 14:10:05.605888    2670 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 21 14:10:05.606105 kubelet[2670]: E0321 14:10:05.605929    2670 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal\" already exists" pod="kube-system/kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:05.661647 kubelet[2670]: I0321 14:10:05.661443    2670 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:05.674276 kubelet[2670]: I0321 14:10:05.673489    2670 kubelet_node_status.go:111] "Node was previously registered" node="ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:05.674276 kubelet[2670]: I0321 14:10:05.673636    2670 kubelet_node_status.go:75] "Successfully registered node" node="ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:05.775975 kubelet[2670]: I0321 14:10:05.775690    2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a9b44657dfb543dac125ddc639996f18-ca-certs\") pod \"kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"a9b44657dfb543dac125ddc639996f18\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:05.775975 kubelet[2670]: I0321 14:10:05.775760    2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a9b44657dfb543dac125ddc639996f18-k8s-certs\") pod \"kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"a9b44657dfb543dac125ddc639996f18\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:05.775975 kubelet[2670]: I0321 14:10:05.775801    2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9b44657dfb543dac125ddc639996f18-kubeconfig\") pod \"kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"a9b44657dfb543dac125ddc639996f18\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:05.775975 kubelet[2670]: I0321 14:10:05.775836    2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/04724a29ffa5e1f6acd7d2bc11fcedad-kubeconfig\") pod \"kube-scheduler-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"04724a29ffa5e1f6acd7d2bc11fcedad\") " pod="kube-system/kube-scheduler-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:05.775975 kubelet[2670]: I0321 14:10:05.775870    2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6db87c0627711911dce972e3cd1df479-ca-certs\") pod \"kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"6db87c0627711911dce972e3cd1df479\") " pod="kube-system/kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:05.776499 kubelet[2670]: I0321 14:10:05.776108    2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6db87c0627711911dce972e3cd1df479-k8s-certs\") pod \"kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"6db87c0627711911dce972e3cd1df479\") " pod="kube-system/kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:05.776499 kubelet[2670]: I0321 14:10:05.776284    2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6db87c0627711911dce972e3cd1df479-usr-share-ca-certificates\") pod \"kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"6db87c0627711911dce972e3cd1df479\") " pod="kube-system/kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:05.776499 kubelet[2670]: I0321 14:10:05.776375    2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a9b44657dfb543dac125ddc639996f18-flexvolume-dir\") pod \"kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"a9b44657dfb543dac125ddc639996f18\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:05.776499 kubelet[2670]: I0321 14:10:05.776466    2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a9b44657dfb543dac125ddc639996f18-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal\" (UID: \"a9b44657dfb543dac125ddc639996f18\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:06.453458 kubelet[2670]: I0321 14:10:06.453207    2670 apiserver.go:52] "Watching apiserver"
Mar 21 14:10:06.475483 kubelet[2670]: I0321 14:10:06.475338    2670 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 21 14:10:06.544671 kubelet[2670]: W0321 14:10:06.544509    2670 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 21 14:10:06.546044 kubelet[2670]: E0321 14:10:06.545873    2670 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal\" already exists" pod="kube-system/kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal"
Mar 21 14:10:06.592037 kubelet[2670]: I0321 14:10:06.590750    2670 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-9999-0-3-a-8593155e6d.novalocal" podStartSLOduration=1.590715983 podStartE2EDuration="1.590715983s" podCreationTimestamp="2025-03-21 14:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 14:10:06.563481644 +0000 UTC m=+1.177656731" watchObservedRunningTime="2025-03-21 14:10:06.590715983 +0000 UTC m=+1.204891110"
Mar 21 14:10:06.603818 kubelet[2670]: I0321 14:10:06.603667    2670 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-9999-0-3-a-8593155e6d.novalocal" podStartSLOduration=1.603650359 podStartE2EDuration="1.603650359s" podCreationTimestamp="2025-03-21 14:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 14:10:06.59109782 +0000 UTC m=+1.205272947" watchObservedRunningTime="2025-03-21 14:10:06.603650359 +0000 UTC m=+1.217825447"
Mar 21 14:10:06.617703 kubelet[2670]: I0321 14:10:06.617463    2670 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-9999-0-3-a-8593155e6d.novalocal" podStartSLOduration=2.617450318 podStartE2EDuration="2.617450318s" podCreationTimestamp="2025-03-21 14:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 14:10:06.604451313 +0000 UTC m=+1.218626400" watchObservedRunningTime="2025-03-21 14:10:06.617450318 +0000 UTC m=+1.231625405"
Mar 21 14:10:09.450325 kubelet[2670]: I0321 14:10:09.450184    2670 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 21 14:10:09.450673 containerd[1478]: time="2025-03-21T14:10:09.450523053Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 21 14:10:09.451188 kubelet[2670]: I0321 14:10:09.450964    2670 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 21 14:10:09.630879 systemd[1]: Created slice kubepods-besteffort-pod6749d138_c116_43ab_b973_2155d5794d64.slice - libcontainer container kubepods-besteffort-pod6749d138_c116_43ab_b973_2155d5794d64.slice.
Mar 21 14:10:09.706443 kubelet[2670]: I0321 14:10:09.706161    2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6749d138-c116-43ab-b973-2155d5794d64-xtables-lock\") pod \"kube-proxy-qwvwh\" (UID: \"6749d138-c116-43ab-b973-2155d5794d64\") " pod="kube-system/kube-proxy-qwvwh"
Mar 21 14:10:09.706443 kubelet[2670]: I0321 14:10:09.706236    2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6749d138-c116-43ab-b973-2155d5794d64-lib-modules\") pod \"kube-proxy-qwvwh\" (UID: \"6749d138-c116-43ab-b973-2155d5794d64\") " pod="kube-system/kube-proxy-qwvwh"
Mar 21 14:10:09.706443 kubelet[2670]: I0321 14:10:09.706289    2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/6749d138-c116-43ab-b973-2155d5794d64-kube-proxy\") pod \"kube-proxy-qwvwh\" (UID: \"6749d138-c116-43ab-b973-2155d5794d64\") " pod="kube-system/kube-proxy-qwvwh"
Mar 21 14:10:09.706443 kubelet[2670]: I0321 14:10:09.706341    2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-548jk\" (UniqueName: \"kubernetes.io/projected/6749d138-c116-43ab-b973-2155d5794d64-kube-api-access-548jk\") pod \"kube-proxy-qwvwh\" (UID: \"6749d138-c116-43ab-b973-2155d5794d64\") " pod="kube-system/kube-proxy-qwvwh"
Mar 21 14:10:09.824033 kubelet[2670]: E0321 14:10:09.822468    2670 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found
Mar 21 14:10:09.824033 kubelet[2670]: E0321 14:10:09.822528    2670 projected.go:194] Error preparing data for projected volume kube-api-access-548jk for pod kube-system/kube-proxy-qwvwh: configmap "kube-root-ca.crt" not found
Mar 21 14:10:09.824033 kubelet[2670]: E0321 14:10:09.822656    2670 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6749d138-c116-43ab-b973-2155d5794d64-kube-api-access-548jk podName:6749d138-c116-43ab-b973-2155d5794d64 nodeName:}" failed. No retries permitted until 2025-03-21 14:10:10.322613404 +0000 UTC m=+4.936788531 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-548jk" (UniqueName: "kubernetes.io/projected/6749d138-c116-43ab-b973-2155d5794d64-kube-api-access-548jk") pod "kube-proxy-qwvwh" (UID: "6749d138-c116-43ab-b973-2155d5794d64") : configmap "kube-root-ca.crt" not found
Mar 21 14:10:10.547082 containerd[1478]: time="2025-03-21T14:10:10.546946416Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qwvwh,Uid:6749d138-c116-43ab-b973-2155d5794d64,Namespace:kube-system,Attempt:0,}"
Mar 21 14:10:10.789837 systemd[1]: Created slice kubepods-besteffort-pod4848e03a_d69d_48d5_a2e3_b3e1ec3fa8d3.slice - libcontainer container kubepods-besteffort-pod4848e03a_d69d_48d5_a2e3_b3e1ec3fa8d3.slice.
Mar 21 14:10:10.814456 kubelet[2670]: I0321 14:10:10.813966    2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4848e03a-d69d-48d5-a2e3-b3e1ec3fa8d3-var-lib-calico\") pod \"tigera-operator-64ff5465b7-s6wfq\" (UID: \"4848e03a-d69d-48d5-a2e3-b3e1ec3fa8d3\") " pod="tigera-operator/tigera-operator-64ff5465b7-s6wfq"
Mar 21 14:10:10.814456 kubelet[2670]: I0321 14:10:10.814071    2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m69zv\" (UniqueName: \"kubernetes.io/projected/4848e03a-d69d-48d5-a2e3-b3e1ec3fa8d3-kube-api-access-m69zv\") pod \"tigera-operator-64ff5465b7-s6wfq\" (UID: \"4848e03a-d69d-48d5-a2e3-b3e1ec3fa8d3\") " pod="tigera-operator/tigera-operator-64ff5465b7-s6wfq"
Mar 21 14:10:11.109381 containerd[1478]: time="2025-03-21T14:10:11.109260399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-64ff5465b7-s6wfq,Uid:4848e03a-d69d-48d5-a2e3-b3e1ec3fa8d3,Namespace:tigera-operator,Attempt:0,}"
Mar 21 14:10:11.167797 containerd[1478]: time="2025-03-21T14:10:11.166920280Z" level=info msg="connecting to shim cf69642f75d6a722580afb9c2505a1ec9537a9863454356a051508acf3a090d2" address="unix:///run/containerd/s/6a4256ae619c82fc4d89cc3fce4bd0d16e1cf7d31d01ea717b6bd18553f5b650" namespace=k8s.io protocol=ttrpc version=3
Mar 21 14:10:11.201010 containerd[1478]: time="2025-03-21T14:10:11.200917542Z" level=info msg="connecting to shim df58d80b643cff726614e4ee90f01ab7e12eafc194a5db7b8682c9977d701ba6" address="unix:///run/containerd/s/d17b36eb3702035cd80b5cbe2205f73f15981b715cd1f7c7341755d29887237f" namespace=k8s.io protocol=ttrpc version=3
Mar 21 14:10:11.230366 systemd[1]: Started cri-containerd-cf69642f75d6a722580afb9c2505a1ec9537a9863454356a051508acf3a090d2.scope - libcontainer container cf69642f75d6a722580afb9c2505a1ec9537a9863454356a051508acf3a090d2.
Mar 21 14:10:11.238484 systemd[1]: Started cri-containerd-df58d80b643cff726614e4ee90f01ab7e12eafc194a5db7b8682c9977d701ba6.scope - libcontainer container df58d80b643cff726614e4ee90f01ab7e12eafc194a5db7b8682c9977d701ba6.
Mar 21 14:10:11.266227 containerd[1478]: time="2025-03-21T14:10:11.266168980Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qwvwh,Uid:6749d138-c116-43ab-b973-2155d5794d64,Namespace:kube-system,Attempt:0,} returns sandbox id \"cf69642f75d6a722580afb9c2505a1ec9537a9863454356a051508acf3a090d2\""
Mar 21 14:10:11.271292 containerd[1478]: time="2025-03-21T14:10:11.271259324Z" level=info msg="CreateContainer within sandbox \"cf69642f75d6a722580afb9c2505a1ec9537a9863454356a051508acf3a090d2\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Mar 21 14:10:11.287643 containerd[1478]: time="2025-03-21T14:10:11.287000722Z" level=info msg="Container b0b6e7df110856422bafcdd03adddb39e4abaf654bd0866cfd441d2db229c4f3: CDI devices from CRI Config.CDIDevices: []"
Mar 21 14:10:11.307685 containerd[1478]: time="2025-03-21T14:10:11.307513968Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-64ff5465b7-s6wfq,Uid:4848e03a-d69d-48d5-a2e3-b3e1ec3fa8d3,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"df58d80b643cff726614e4ee90f01ab7e12eafc194a5db7b8682c9977d701ba6\""
Mar 21 14:10:11.311340 containerd[1478]: time="2025-03-21T14:10:11.310488159Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\""
Mar 21 14:10:11.311340 containerd[1478]: time="2025-03-21T14:10:11.311134288Z" level=info msg="CreateContainer within sandbox \"cf69642f75d6a722580afb9c2505a1ec9537a9863454356a051508acf3a090d2\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"b0b6e7df110856422bafcdd03adddb39e4abaf654bd0866cfd441d2db229c4f3\""
Mar 21 14:10:11.317846 containerd[1478]: time="2025-03-21T14:10:11.317170312Z" level=info msg="StartContainer for \"b0b6e7df110856422bafcdd03adddb39e4abaf654bd0866cfd441d2db229c4f3\""
Mar 21 14:10:11.321281 containerd[1478]: time="2025-03-21T14:10:11.321231713Z" level=info msg="connecting to shim b0b6e7df110856422bafcdd03adddb39e4abaf654bd0866cfd441d2db229c4f3" address="unix:///run/containerd/s/6a4256ae619c82fc4d89cc3fce4bd0d16e1cf7d31d01ea717b6bd18553f5b650" protocol=ttrpc version=3
Mar 21 14:10:11.344300 systemd[1]: Started cri-containerd-b0b6e7df110856422bafcdd03adddb39e4abaf654bd0866cfd441d2db229c4f3.scope - libcontainer container b0b6e7df110856422bafcdd03adddb39e4abaf654bd0866cfd441d2db229c4f3.
Mar 21 14:10:11.391479 containerd[1478]: time="2025-03-21T14:10:11.391388564Z" level=info msg="StartContainer for \"b0b6e7df110856422bafcdd03adddb39e4abaf654bd0866cfd441d2db229c4f3\" returns successfully"
Mar 21 14:10:11.566380 kubelet[2670]: I0321 14:10:11.564460 2670 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-qwvwh" podStartSLOduration=2.564442467 podStartE2EDuration="2.564442467s" podCreationTimestamp="2025-03-21 14:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 14:10:11.564360977 +0000 UTC m=+6.178536064" watchObservedRunningTime="2025-03-21 14:10:11.564442467 +0000 UTC m=+6.178617554"
Mar 21 14:10:11.651366 sudo[1719]: pam_unix(sudo:session): session closed for user root
Mar 21 14:10:11.799546 sshd[1718]: Connection closed by 172.24.4.1 port 33364
Mar 21 14:10:11.799369 sshd-session[1715]: pam_unix(sshd:session): session closed for user core
Mar 21 14:10:11.803011 systemd[1]: sshd@6-172.24.4.61:22-172.24.4.1:33364.service: Deactivated successfully.
Mar 21 14:10:11.805474 systemd[1]: session-9.scope: Deactivated successfully.
Mar 21 14:10:11.805677 systemd[1]: session-9.scope: Consumed 7.006s CPU time, 228.5M memory peak.
Mar 21 14:10:11.808130 systemd-logind[1456]: Session 9 logged out. Waiting for processes to exit.
Mar 21 14:10:11.809288 systemd-logind[1456]: Removed session 9.
Mar 21 14:10:14.353492 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1713015428.mount: Deactivated successfully.
Mar 21 14:10:18.861649 containerd[1478]: time="2025-03-21T14:10:18.861572740Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 14:10:18.862918 containerd[1478]: time="2025-03-21T14:10:18.862869172Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=21945008"
Mar 21 14:10:18.864195 containerd[1478]: time="2025-03-21T14:10:18.864147580Z" level=info msg="ImageCreate event name:\"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 14:10:18.867717 containerd[1478]: time="2025-03-21T14:10:18.867661090Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 14:10:18.869037 containerd[1478]: time="2025-03-21T14:10:18.868933968Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"21941003\" in 7.558269243s"
Mar 21 14:10:18.869037 containerd[1478]: time="2025-03-21T14:10:18.868963432Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\""
Mar 21 14:10:18.871808 containerd[1478]: time="2025-03-21T14:10:18.871284973Z" level=info msg="CreateContainer within sandbox \"df58d80b643cff726614e4ee90f01ab7e12eafc194a5db7b8682c9977d701ba6\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 21 14:10:18.888317 containerd[1478]: time="2025-03-21T14:10:18.887546561Z" level=info msg="Container 731fa2114d7e7fa302553f5ce3bd921a5715efd86d40a755e12d6f3f8ddc961a: CDI devices from CRI Config.CDIDevices: []"
Mar 21 14:10:18.900246 containerd[1478]: time="2025-03-21T14:10:18.900195945Z" level=info msg="CreateContainer within sandbox \"df58d80b643cff726614e4ee90f01ab7e12eafc194a5db7b8682c9977d701ba6\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"731fa2114d7e7fa302553f5ce3bd921a5715efd86d40a755e12d6f3f8ddc961a\""
Mar 21 14:10:18.900984 containerd[1478]: time="2025-03-21T14:10:18.900915549Z" level=info msg="StartContainer for \"731fa2114d7e7fa302553f5ce3bd921a5715efd86d40a755e12d6f3f8ddc961a\""
Mar 21 14:10:18.902279 containerd[1478]: time="2025-03-21T14:10:18.902198636Z" level=info msg="connecting to shim 731fa2114d7e7fa302553f5ce3bd921a5715efd86d40a755e12d6f3f8ddc961a" address="unix:///run/containerd/s/d17b36eb3702035cd80b5cbe2205f73f15981b715cd1f7c7341755d29887237f" protocol=ttrpc version=3
Mar 21 14:10:18.936297 systemd[1]: Started cri-containerd-731fa2114d7e7fa302553f5ce3bd921a5715efd86d40a755e12d6f3f8ddc961a.scope - libcontainer container 731fa2114d7e7fa302553f5ce3bd921a5715efd86d40a755e12d6f3f8ddc961a.
Mar 21 14:10:18.977037 containerd[1478]: time="2025-03-21T14:10:18.976839560Z" level=info msg="StartContainer for \"731fa2114d7e7fa302553f5ce3bd921a5715efd86d40a755e12d6f3f8ddc961a\" returns successfully"
Mar 21 14:10:22.178041 kubelet[2670]: I0321 14:10:22.177974 2670 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-64ff5465b7-s6wfq" podStartSLOduration=4.617638789 podStartE2EDuration="12.177955995s" podCreationTimestamp="2025-03-21 14:10:10 +0000 UTC" firstStartedPulling="2025-03-21 14:10:11.309739001 +0000 UTC m=+5.923914078" lastFinishedPulling="2025-03-21 14:10:18.870056197 +0000 UTC m=+13.484231284" observedRunningTime="2025-03-21 14:10:19.591092302 +0000 UTC m=+14.205267429" watchObservedRunningTime="2025-03-21 14:10:22.177955995 +0000 UTC m=+16.792131072"
Mar 21 14:10:22.190024 systemd[1]: Created slice kubepods-besteffort-pod43e1468e_aaaf_4789_ad4f_b0ebdc20a9c7.slice - libcontainer container kubepods-besteffort-pod43e1468e_aaaf_4789_ad4f_b0ebdc20a9c7.slice.
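Editorial note: the pod_startup_latency_tracker entry above reports podStartE2EDuration as observedRunningTime minus podCreationTimestamp, and podStartSLOduration as that figure minus the time spent pulling images (firstStartedPulling to lastFinishedPulling). A small sketch reproducing the tigera-operator numbers from this log, with timestamps truncated to microseconds since Python's datetime does not carry nanoseconds:

```python
from datetime import datetime, timezone

def ts(s):  # journal timestamps, truncated from ns to µs precision
    return datetime.fromisoformat(s).replace(tzinfo=timezone.utc)

created          = ts("2025-03-21 14:10:10")
first_pulling    = ts("2025-03-21 14:10:11.309739")
last_pulled      = ts("2025-03-21 14:10:18.870056")
observed_running = ts("2025-03-21 14:10:22.177955")

e2e  = observed_running - created  # log: podStartE2EDuration="12.177955995s"
pull = last_pulled - first_pulling # time spent pulling the operator image
slo  = e2e - pull                  # log: podStartSLOduration=4.617638789
print(round(e2e.total_seconds(), 6), round(slo.total_seconds(), 6))
```

The kube-proxy entry earlier in the log has firstStartedPulling/lastFinishedPulling at the zero time, so its SLO and E2E durations coincide.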
Mar 21 14:10:22.288405 kubelet[2670]: I0321 14:10:22.288369 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/43e1468e-aaaf-4789-ad4f-b0ebdc20a9c7-typha-certs\") pod \"calico-typha-667fb46888-64l6w\" (UID: \"43e1468e-aaaf-4789-ad4f-b0ebdc20a9c7\") " pod="calico-system/calico-typha-667fb46888-64l6w"
Mar 21 14:10:22.289355 kubelet[2670]: I0321 14:10:22.289316 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43e1468e-aaaf-4789-ad4f-b0ebdc20a9c7-tigera-ca-bundle\") pod \"calico-typha-667fb46888-64l6w\" (UID: \"43e1468e-aaaf-4789-ad4f-b0ebdc20a9c7\") " pod="calico-system/calico-typha-667fb46888-64l6w"
Mar 21 14:10:22.289429 kubelet[2670]: I0321 14:10:22.289366 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvl2d\" (UniqueName: \"kubernetes.io/projected/43e1468e-aaaf-4789-ad4f-b0ebdc20a9c7-kube-api-access-jvl2d\") pod \"calico-typha-667fb46888-64l6w\" (UID: \"43e1468e-aaaf-4789-ad4f-b0ebdc20a9c7\") " pod="calico-system/calico-typha-667fb46888-64l6w"
Mar 21 14:10:22.350464 systemd[1]: Created slice kubepods-besteffort-pod13e3aaa7_256a_40ad_bf4b_d03398e27191.slice - libcontainer container kubepods-besteffort-pod13e3aaa7_256a_40ad_bf4b_d03398e27191.slice.
Mar 21 14:10:22.390058 kubelet[2670]: I0321 14:10:22.390017 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/13e3aaa7-256a-40ad-bf4b-d03398e27191-lib-modules\") pod \"calico-node-pbxct\" (UID: \"13e3aaa7-256a-40ad-bf4b-d03398e27191\") " pod="calico-system/calico-node-pbxct"
Mar 21 14:10:22.390058 kubelet[2670]: I0321 14:10:22.390061 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/13e3aaa7-256a-40ad-bf4b-d03398e27191-node-certs\") pod \"calico-node-pbxct\" (UID: \"13e3aaa7-256a-40ad-bf4b-d03398e27191\") " pod="calico-system/calico-node-pbxct"
Mar 21 14:10:22.390296 kubelet[2670]: I0321 14:10:22.390082 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/13e3aaa7-256a-40ad-bf4b-d03398e27191-var-run-calico\") pod \"calico-node-pbxct\" (UID: \"13e3aaa7-256a-40ad-bf4b-d03398e27191\") " pod="calico-system/calico-node-pbxct"
Mar 21 14:10:22.390296 kubelet[2670]: I0321 14:10:22.390144 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/13e3aaa7-256a-40ad-bf4b-d03398e27191-var-lib-calico\") pod \"calico-node-pbxct\" (UID: \"13e3aaa7-256a-40ad-bf4b-d03398e27191\") " pod="calico-system/calico-node-pbxct"
Mar 21 14:10:22.390296 kubelet[2670]: I0321 14:10:22.390166 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/13e3aaa7-256a-40ad-bf4b-d03398e27191-cni-net-dir\") pod \"calico-node-pbxct\" (UID: \"13e3aaa7-256a-40ad-bf4b-d03398e27191\") " pod="calico-system/calico-node-pbxct"
Mar 21 14:10:22.390296 kubelet[2670]: I0321 14:10:22.390197 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13e3aaa7-256a-40ad-bf4b-d03398e27191-tigera-ca-bundle\") pod \"calico-node-pbxct\" (UID: \"13e3aaa7-256a-40ad-bf4b-d03398e27191\") " pod="calico-system/calico-node-pbxct"
Mar 21 14:10:22.390296 kubelet[2670]: I0321 14:10:22.390220 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/13e3aaa7-256a-40ad-bf4b-d03398e27191-flexvol-driver-host\") pod \"calico-node-pbxct\" (UID: \"13e3aaa7-256a-40ad-bf4b-d03398e27191\") " pod="calico-system/calico-node-pbxct"
Mar 21 14:10:22.390437 kubelet[2670]: I0321 14:10:22.390238 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/13e3aaa7-256a-40ad-bf4b-d03398e27191-xtables-lock\") pod \"calico-node-pbxct\" (UID: \"13e3aaa7-256a-40ad-bf4b-d03398e27191\") " pod="calico-system/calico-node-pbxct"
Mar 21 14:10:22.390437 kubelet[2670]: I0321 14:10:22.390258 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfk46\" (UniqueName: \"kubernetes.io/projected/13e3aaa7-256a-40ad-bf4b-d03398e27191-kube-api-access-xfk46\") pod \"calico-node-pbxct\" (UID: \"13e3aaa7-256a-40ad-bf4b-d03398e27191\") " pod="calico-system/calico-node-pbxct"
Mar 21 14:10:22.390437 kubelet[2670]: I0321 14:10:22.390276 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/13e3aaa7-256a-40ad-bf4b-d03398e27191-cni-bin-dir\") pod \"calico-node-pbxct\" (UID: \"13e3aaa7-256a-40ad-bf4b-d03398e27191\") " pod="calico-system/calico-node-pbxct"
Mar 21 14:10:22.390437 kubelet[2670]: I0321 14:10:22.390293 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/13e3aaa7-256a-40ad-bf4b-d03398e27191-policysync\") pod \"calico-node-pbxct\" (UID: \"13e3aaa7-256a-40ad-bf4b-d03398e27191\") " pod="calico-system/calico-node-pbxct"
Mar 21 14:10:22.390437 kubelet[2670]: I0321 14:10:22.390323 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/13e3aaa7-256a-40ad-bf4b-d03398e27191-cni-log-dir\") pod \"calico-node-pbxct\" (UID: \"13e3aaa7-256a-40ad-bf4b-d03398e27191\") " pod="calico-system/calico-node-pbxct"
Mar 21 14:10:22.498528 containerd[1478]: time="2025-03-21T14:10:22.496406730Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-667fb46888-64l6w,Uid:43e1468e-aaaf-4789-ad4f-b0ebdc20a9c7,Namespace:calico-system,Attempt:0,}"
Mar 21 14:10:22.498895 kubelet[2670]: E0321 14:10:22.498305 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 21 14:10:22.498895 kubelet[2670]: W0321 14:10:22.498328 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 21 14:10:22.498895 kubelet[2670]: E0321 14:10:22.498369 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 21 14:10:22.499835 kubelet[2670]: E0321 14:10:22.499413 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 21 14:10:22.499835 kubelet[2670]: W0321 14:10:22.499431 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 21 14:10:22.499835 kubelet[2670]: E0321 14:10:22.499712 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 21 14:10:22.502158 kubelet[2670]: E0321 14:10:22.502071 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 21 14:10:22.502158 kubelet[2670]: W0321 14:10:22.502089 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 21 14:10:22.503863 kubelet[2670]: E0321 14:10:22.503838 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 21 14:10:22.534955 kubelet[2670]: E0321 14:10:22.534676 2670 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hjpgs" podUID="d174cc7d-f7aa-43cf-981d-eab1c74e7f73"
Mar 21 14:10:22.567375 containerd[1478]: time="2025-03-21T14:10:22.567276545Z" level=info msg="connecting to shim 16ce464286cf0e4896b5710a1eb9bf1e65940f774a8fba0b07ddc09f1cbb25a7" address="unix:///run/containerd/s/7b6f4451e0a788497d06cf5e8281e66abd756a20def2ad254985b9cc79df6ff9" namespace=k8s.io protocol=ttrpc version=3
Mar 21 14:10:22.576601 kubelet[2670]: E0321 14:10:22.576287 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 21 14:10:22.576601 kubelet[2670]: W0321 14:10:22.576308 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 21 14:10:22.576601 kubelet[2670]: E0321 14:10:22.576327 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 21 14:10:22.581847 kubelet[2670]: E0321 14:10:22.581822 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.582003 kubelet[2670]: W0321 14:10:22.581901 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.582003 kubelet[2670]: E0321 14:10:22.581915 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.582215 kubelet[2670]: E0321 14:10:22.582203 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.582349 kubelet[2670]: W0321 14:10:22.582274 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.582349 kubelet[2670]: E0321 14:10:22.582289 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.582717 kubelet[2670]: E0321 14:10:22.582618 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.582717 kubelet[2670]: W0321 14:10:22.582630 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.582717 kubelet[2670]: E0321 14:10:22.582640 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.582885 kubelet[2670]: E0321 14:10:22.582873 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.582953 kubelet[2670]: W0321 14:10:22.582942 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.583077 kubelet[2670]: E0321 14:10:22.583004 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.583233 kubelet[2670]: E0321 14:10:22.583221 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.583376 kubelet[2670]: W0321 14:10:22.583289 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.583376 kubelet[2670]: E0321 14:10:22.583303 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.583520 kubelet[2670]: E0321 14:10:22.583508 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.583585 kubelet[2670]: W0321 14:10:22.583574 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.583712 kubelet[2670]: E0321 14:10:22.583635 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.583945 kubelet[2670]: E0321 14:10:22.583933 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.584161 kubelet[2670]: W0321 14:10:22.584017 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.584161 kubelet[2670]: E0321 14:10:22.584033 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.592353 kubelet[2670]: E0321 14:10:22.592267 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.592353 kubelet[2670]: W0321 14:10:22.592290 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.592353 kubelet[2670]: E0321 14:10:22.592310 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.592353 kubelet[2670]: I0321 14:10:22.592342 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/d174cc7d-f7aa-43cf-981d-eab1c74e7f73-varrun\") pod \"csi-node-driver-hjpgs\" (UID: \"d174cc7d-f7aa-43cf-981d-eab1c74e7f73\") " pod="calico-system/csi-node-driver-hjpgs" Mar 21 14:10:22.593217 kubelet[2670]: E0321 14:10:22.592524 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.593217 kubelet[2670]: W0321 14:10:22.592535 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.593217 kubelet[2670]: E0321 14:10:22.592548 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.593217 kubelet[2670]: I0321 14:10:22.592564 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d174cc7d-f7aa-43cf-981d-eab1c74e7f73-kubelet-dir\") pod \"csi-node-driver-hjpgs\" (UID: \"d174cc7d-f7aa-43cf-981d-eab1c74e7f73\") " pod="calico-system/csi-node-driver-hjpgs" Mar 21 14:10:22.593217 kubelet[2670]: E0321 14:10:22.592722 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.593217 kubelet[2670]: W0321 14:10:22.592734 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.593217 kubelet[2670]: E0321 14:10:22.592743 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.593217 kubelet[2670]: I0321 14:10:22.592760 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g686k\" (UniqueName: \"kubernetes.io/projected/d174cc7d-f7aa-43cf-981d-eab1c74e7f73-kube-api-access-g686k\") pod \"csi-node-driver-hjpgs\" (UID: \"d174cc7d-f7aa-43cf-981d-eab1c74e7f73\") " pod="calico-system/csi-node-driver-hjpgs" Mar 21 14:10:22.593217 kubelet[2670]: E0321 14:10:22.593152 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.593449 kubelet[2670]: W0321 14:10:22.593166 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.593449 kubelet[2670]: E0321 14:10:22.593178 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.593449 kubelet[2670]: I0321 14:10:22.593199 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d174cc7d-f7aa-43cf-981d-eab1c74e7f73-registration-dir\") pod \"csi-node-driver-hjpgs\" (UID: \"d174cc7d-f7aa-43cf-981d-eab1c74e7f73\") " pod="calico-system/csi-node-driver-hjpgs" Mar 21 14:10:22.593449 kubelet[2670]: E0321 14:10:22.593350 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.593449 kubelet[2670]: W0321 14:10:22.593360 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.593449 kubelet[2670]: E0321 14:10:22.593369 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.593449 kubelet[2670]: I0321 14:10:22.593385 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d174cc7d-f7aa-43cf-981d-eab1c74e7f73-socket-dir\") pod \"csi-node-driver-hjpgs\" (UID: \"d174cc7d-f7aa-43cf-981d-eab1c74e7f73\") " pod="calico-system/csi-node-driver-hjpgs" Mar 21 14:10:22.593679 kubelet[2670]: E0321 14:10:22.593542 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.593679 kubelet[2670]: W0321 14:10:22.593552 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.593679 kubelet[2670]: E0321 14:10:22.593562 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.593758 kubelet[2670]: E0321 14:10:22.593693 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.593758 kubelet[2670]: W0321 14:10:22.593702 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.593758 kubelet[2670]: E0321 14:10:22.593711 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.594957 kubelet[2670]: E0321 14:10:22.593843 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.594957 kubelet[2670]: W0321 14:10:22.593858 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.594957 kubelet[2670]: E0321 14:10:22.593870 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.594957 kubelet[2670]: E0321 14:10:22.593995 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.594957 kubelet[2670]: W0321 14:10:22.594004 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.594957 kubelet[2670]: E0321 14:10:22.594012 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.594957 kubelet[2670]: E0321 14:10:22.594887 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.594957 kubelet[2670]: W0321 14:10:22.594898 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.594957 kubelet[2670]: E0321 14:10:22.594912 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.596581 kubelet[2670]: E0321 14:10:22.595198 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.596581 kubelet[2670]: W0321 14:10:22.595209 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.596581 kubelet[2670]: E0321 14:10:22.595306 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.596965 kubelet[2670]: E0321 14:10:22.596750 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.596965 kubelet[2670]: W0321 14:10:22.596765 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.596965 kubelet[2670]: E0321 14:10:22.596795 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.597386 kubelet[2670]: E0321 14:10:22.597321 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.597386 kubelet[2670]: W0321 14:10:22.597334 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.597386 kubelet[2670]: E0321 14:10:22.597352 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.599159 kubelet[2670]: E0321 14:10:22.599090 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.599159 kubelet[2670]: W0321 14:10:22.599136 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.599159 kubelet[2670]: E0321 14:10:22.599155 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.599676 kubelet[2670]: E0321 14:10:22.599403 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.599676 kubelet[2670]: W0321 14:10:22.599413 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.599676 kubelet[2670]: E0321 14:10:22.599422 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.610917 systemd[1]: Started cri-containerd-16ce464286cf0e4896b5710a1eb9bf1e65940f774a8fba0b07ddc09f1cbb25a7.scope - libcontainer container 16ce464286cf0e4896b5710a1eb9bf1e65940f774a8fba0b07ddc09f1cbb25a7. 
Mar 21 14:10:22.656847 containerd[1478]: time="2025-03-21T14:10:22.656570600Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-pbxct,Uid:13e3aaa7-256a-40ad-bf4b-d03398e27191,Namespace:calico-system,Attempt:0,}" Mar 21 14:10:22.693524 containerd[1478]: time="2025-03-21T14:10:22.693347627Z" level=info msg="connecting to shim a6c44d1275d2c8a9f54a025eec4b45d418e0115fcec4445b55be87811ea008ad" address="unix:///run/containerd/s/3e567712a613c8f48cf4002c1f837d661f1ae2e820307991d4ceee2cfc671e3a" namespace=k8s.io protocol=ttrpc version=3 Mar 21 14:10:22.695408 kubelet[2670]: E0321 14:10:22.694924 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.695408 kubelet[2670]: W0321 14:10:22.695348 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.695408 kubelet[2670]: E0321 14:10:22.695374 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.695933 kubelet[2670]: E0321 14:10:22.695882 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.695933 kubelet[2670]: W0321 14:10:22.695913 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.696319 kubelet[2670]: E0321 14:10:22.696060 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.697193 kubelet[2670]: E0321 14:10:22.696667 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.697193 kubelet[2670]: W0321 14:10:22.696696 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.697193 kubelet[2670]: E0321 14:10:22.696721 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.697967 kubelet[2670]: E0321 14:10:22.697466 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.698135 kubelet[2670]: W0321 14:10:22.697478 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.698135 kubelet[2670]: E0321 14:10:22.698065 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.698818 kubelet[2670]: E0321 14:10:22.698293 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.698818 kubelet[2670]: W0321 14:10:22.698306 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.699038 kubelet[2670]: E0321 14:10:22.698938 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.699646 kubelet[2670]: E0321 14:10:22.699458 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.699646 kubelet[2670]: W0321 14:10:22.699470 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.700226 kubelet[2670]: E0321 14:10:22.700037 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.700481 kubelet[2670]: E0321 14:10:22.700107 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.700481 kubelet[2670]: W0321 14:10:22.700414 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.700481 kubelet[2670]: E0321 14:10:22.700456 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.702353 kubelet[2670]: E0321 14:10:22.701918 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.702353 kubelet[2670]: W0321 14:10:22.701931 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.702488 kubelet[2670]: E0321 14:10:22.702476 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.702551 kubelet[2670]: W0321 14:10:22.702540 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.704058 kubelet[2670]: E0321 14:10:22.703774 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.704434 kubelet[2670]: E0321 14:10:22.704211 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.704543 kubelet[2670]: W0321 14:10:22.704529 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.704967 kubelet[2670]: E0321 14:10:22.704956 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.705623 kubelet[2670]: W0321 14:10:22.705566 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.705833 kubelet[2670]: E0321 14:10:22.705133 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.705833 kubelet[2670]: E0321 14:10:22.704327 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.705833 kubelet[2670]: E0321 14:10:22.705715 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.706622 kubelet[2670]: E0321 14:10:22.706477 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.706622 kubelet[2670]: W0321 14:10:22.706488 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.706622 kubelet[2670]: E0321 14:10:22.706524 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.706814 kubelet[2670]: E0321 14:10:22.706726 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.706814 kubelet[2670]: W0321 14:10:22.706736 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.707108 kubelet[2670]: E0321 14:10:22.707061 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.708212 kubelet[2670]: E0321 14:10:22.708094 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.708212 kubelet[2670]: W0321 14:10:22.708158 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.708212 kubelet[2670]: E0321 14:10:22.708213 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.708802 kubelet[2670]: E0321 14:10:22.708674 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.708802 kubelet[2670]: W0321 14:10:22.708727 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.708802 kubelet[2670]: E0321 14:10:22.708760 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.709213 kubelet[2670]: E0321 14:10:22.709178 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.709213 kubelet[2670]: W0321 14:10:22.709189 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.709477 kubelet[2670]: E0321 14:10:22.709396 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.709773 kubelet[2670]: E0321 14:10:22.709738 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.709773 kubelet[2670]: W0321 14:10:22.709750 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.710071 kubelet[2670]: E0321 14:10:22.709967 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.710324 kubelet[2670]: E0321 14:10:22.710277 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.710324 kubelet[2670]: W0321 14:10:22.710292 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.711285 kubelet[2670]: E0321 14:10:22.710501 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.711419 kubelet[2670]: E0321 14:10:22.711406 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.711556 kubelet[2670]: W0321 14:10:22.711473 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.711646 kubelet[2670]: E0321 14:10:22.711621 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.711824 kubelet[2670]: E0321 14:10:22.711762 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.711824 kubelet[2670]: W0321 14:10:22.711773 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.711896 kubelet[2670]: E0321 14:10:22.711816 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.712793 kubelet[2670]: E0321 14:10:22.712580 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.712793 kubelet[2670]: W0321 14:10:22.712593 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.712793 kubelet[2670]: E0321 14:10:22.712624 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.713250 kubelet[2670]: E0321 14:10:22.713181 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.713250 kubelet[2670]: W0321 14:10:22.713194 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.713250 kubelet[2670]: E0321 14:10:22.713211 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.714221 kubelet[2670]: E0321 14:10:22.714185 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.714221 kubelet[2670]: W0321 14:10:22.714209 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.714965 kubelet[2670]: E0321 14:10:22.714236 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.716619 kubelet[2670]: E0321 14:10:22.716595 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.716684 kubelet[2670]: W0321 14:10:22.716620 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.716824 kubelet[2670]: E0321 14:10:22.716741 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.717474 kubelet[2670]: E0321 14:10:22.717437 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.717474 kubelet[2670]: W0321 14:10:22.717448 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.717474 kubelet[2670]: E0321 14:10:22.717465 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:22.727653 kubelet[2670]: E0321 14:10:22.727481 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:22.727653 kubelet[2670]: W0321 14:10:22.727503 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:22.727653 kubelet[2670]: E0321 14:10:22.727577 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:22.735727 containerd[1478]: time="2025-03-21T14:10:22.735634119Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-667fb46888-64l6w,Uid:43e1468e-aaaf-4789-ad4f-b0ebdc20a9c7,Namespace:calico-system,Attempt:0,} returns sandbox id \"16ce464286cf0e4896b5710a1eb9bf1e65940f774a8fba0b07ddc09f1cbb25a7\"" Mar 21 14:10:22.738516 containerd[1478]: time="2025-03-21T14:10:22.738427749Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 21 14:10:22.763449 systemd[1]: Started cri-containerd-a6c44d1275d2c8a9f54a025eec4b45d418e0115fcec4445b55be87811ea008ad.scope - libcontainer container a6c44d1275d2c8a9f54a025eec4b45d418e0115fcec4445b55be87811ea008ad. 
Mar 21 14:10:22.810074 containerd[1478]: time="2025-03-21T14:10:22.809775193Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-pbxct,Uid:13e3aaa7-256a-40ad-bf4b-d03398e27191,Namespace:calico-system,Attempt:0,} returns sandbox id \"a6c44d1275d2c8a9f54a025eec4b45d418e0115fcec4445b55be87811ea008ad\"" Mar 21 14:10:24.479650 kubelet[2670]: E0321 14:10:24.479397 2670 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hjpgs" podUID="d174cc7d-f7aa-43cf-981d-eab1c74e7f73" Mar 21 14:10:25.972954 containerd[1478]: time="2025-03-21T14:10:25.972784329Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:10:25.974142 containerd[1478]: time="2025-03-21T14:10:25.973984033Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=30414075" Mar 21 14:10:25.975512 containerd[1478]: time="2025-03-21T14:10:25.975468485Z" level=info msg="ImageCreate event name:\"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:10:25.978695 containerd[1478]: time="2025-03-21T14:10:25.978613397Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:10:25.979174 containerd[1478]: time="2025-03-21T14:10:25.979146760Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest 
\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"31907171\" in 3.240666374s" Mar 21 14:10:25.979478 containerd[1478]: time="2025-03-21T14:10:25.979175874Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\"" Mar 21 14:10:25.986183 containerd[1478]: time="2025-03-21T14:10:25.986142168Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 21 14:10:25.996733 containerd[1478]: time="2025-03-21T14:10:25.995803918Z" level=info msg="CreateContainer within sandbox \"16ce464286cf0e4896b5710a1eb9bf1e65940f774a8fba0b07ddc09f1cbb25a7\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 21 14:10:26.008401 containerd[1478]: time="2025-03-21T14:10:26.007327108Z" level=info msg="Container e2fc6d00ea0b6b6d65d48198c467ec7182455a6339f4058b4a186d0d7d2974a3: CDI devices from CRI Config.CDIDevices: []" Mar 21 14:10:26.011906 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4210279871.mount: Deactivated successfully. 
Mar 21 14:10:26.022347 containerd[1478]: time="2025-03-21T14:10:26.022215127Z" level=info msg="CreateContainer within sandbox \"16ce464286cf0e4896b5710a1eb9bf1e65940f774a8fba0b07ddc09f1cbb25a7\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"e2fc6d00ea0b6b6d65d48198c467ec7182455a6339f4058b4a186d0d7d2974a3\"" Mar 21 14:10:26.024435 containerd[1478]: time="2025-03-21T14:10:26.024375630Z" level=info msg="StartContainer for \"e2fc6d00ea0b6b6d65d48198c467ec7182455a6339f4058b4a186d0d7d2974a3\"" Mar 21 14:10:26.025710 containerd[1478]: time="2025-03-21T14:10:26.025651115Z" level=info msg="connecting to shim e2fc6d00ea0b6b6d65d48198c467ec7182455a6339f4058b4a186d0d7d2974a3" address="unix:///run/containerd/s/7b6f4451e0a788497d06cf5e8281e66abd756a20def2ad254985b9cc79df6ff9" protocol=ttrpc version=3 Mar 21 14:10:26.051249 systemd[1]: Started cri-containerd-e2fc6d00ea0b6b6d65d48198c467ec7182455a6339f4058b4a186d0d7d2974a3.scope - libcontainer container e2fc6d00ea0b6b6d65d48198c467ec7182455a6339f4058b4a186d0d7d2974a3. 
Mar 21 14:10:26.118917 containerd[1478]: time="2025-03-21T14:10:26.118880636Z" level=info msg="StartContainer for \"e2fc6d00ea0b6b6d65d48198c467ec7182455a6339f4058b4a186d0d7d2974a3\" returns successfully" Mar 21 14:10:26.479166 kubelet[2670]: E0321 14:10:26.479083 2670 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hjpgs" podUID="d174cc7d-f7aa-43cf-981d-eab1c74e7f73" Mar 21 14:10:26.614642 kubelet[2670]: E0321 14:10:26.614207 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.614642 kubelet[2670]: W0321 14:10:26.614226 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.614642 kubelet[2670]: E0321 14:10:26.614242 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:26.614642 kubelet[2670]: E0321 14:10:26.614386 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.614642 kubelet[2670]: W0321 14:10:26.614395 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.614642 kubelet[2670]: E0321 14:10:26.614404 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:26.614642 kubelet[2670]: E0321 14:10:26.614558 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.614642 kubelet[2670]: W0321 14:10:26.614566 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.614642 kubelet[2670]: E0321 14:10:26.614576 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:26.615347 kubelet[2670]: E0321 14:10:26.615059 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.615347 kubelet[2670]: W0321 14:10:26.615070 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.615347 kubelet[2670]: E0321 14:10:26.615080 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:26.615347 kubelet[2670]: E0321 14:10:26.615258 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.615347 kubelet[2670]: W0321 14:10:26.615267 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.615347 kubelet[2670]: E0321 14:10:26.615276 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:26.615821 kubelet[2670]: E0321 14:10:26.615633 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.615821 kubelet[2670]: W0321 14:10:26.615645 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.615821 kubelet[2670]: E0321 14:10:26.615655 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:26.616097 kubelet[2670]: E0321 14:10:26.615982 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.616097 kubelet[2670]: W0321 14:10:26.615993 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.616097 kubelet[2670]: E0321 14:10:26.616004 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:26.616514 kubelet[2670]: E0321 14:10:26.616346 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.616514 kubelet[2670]: W0321 14:10:26.616360 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.616514 kubelet[2670]: E0321 14:10:26.616372 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:26.616723 kubelet[2670]: E0321 14:10:26.616637 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.616723 kubelet[2670]: W0321 14:10:26.616650 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.616723 kubelet[2670]: E0321 14:10:26.616659 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:26.617067 kubelet[2670]: E0321 14:10:26.616915 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.617067 kubelet[2670]: W0321 14:10:26.616926 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.617067 kubelet[2670]: E0321 14:10:26.616935 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:26.617330 kubelet[2670]: E0321 14:10:26.617232 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.617330 kubelet[2670]: W0321 14:10:26.617243 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.617330 kubelet[2670]: E0321 14:10:26.617254 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:26.617708 kubelet[2670]: E0321 14:10:26.617532 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.617708 kubelet[2670]: W0321 14:10:26.617562 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.617708 kubelet[2670]: E0321 14:10:26.617571 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:26.617941 kubelet[2670]: E0321 14:10:26.617865 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.617941 kubelet[2670]: W0321 14:10:26.617876 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.617941 kubelet[2670]: E0321 14:10:26.617885 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:26.618265 kubelet[2670]: E0321 14:10:26.618141 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.618265 kubelet[2670]: W0321 14:10:26.618152 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.618265 kubelet[2670]: E0321 14:10:26.618161 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:26.618446 kubelet[2670]: E0321 14:10:26.618387 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.618446 kubelet[2670]: W0321 14:10:26.618399 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.618446 kubelet[2670]: E0321 14:10:26.618409 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:26.629716 kubelet[2670]: I0321 14:10:26.629657 2670 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-667fb46888-64l6w" podStartSLOduration=1.387002625 podStartE2EDuration="4.629642298s" podCreationTimestamp="2025-03-21 14:10:22 +0000 UTC" firstStartedPulling="2025-03-21 14:10:22.737911941 +0000 UTC m=+17.352087018" lastFinishedPulling="2025-03-21 14:10:25.980551614 +0000 UTC m=+20.594726691" observedRunningTime="2025-03-21 14:10:26.629138671 +0000 UTC m=+21.243313758" watchObservedRunningTime="2025-03-21 14:10:26.629642298 +0000 UTC m=+21.243817385" Mar 21 14:10:26.637274 kubelet[2670]: E0321 14:10:26.636888 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.637274 kubelet[2670]: W0321 14:10:26.636908 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.637274 kubelet[2670]: E0321 14:10:26.637028 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:26.637804 kubelet[2670]: E0321 14:10:26.637624 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.637804 kubelet[2670]: W0321 14:10:26.637635 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.637804 kubelet[2670]: E0321 14:10:26.637651 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:26.638339 kubelet[2670]: E0321 14:10:26.637916 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.638339 kubelet[2670]: W0321 14:10:26.637935 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.638339 kubelet[2670]: E0321 14:10:26.637960 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:26.638339 kubelet[2670]: E0321 14:10:26.638102 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.638339 kubelet[2670]: W0321 14:10:26.638141 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.638339 kubelet[2670]: E0321 14:10:26.638152 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:26.638339 kubelet[2670]: E0321 14:10:26.638275 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.638339 kubelet[2670]: W0321 14:10:26.638283 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.638339 kubelet[2670]: E0321 14:10:26.638291 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:26.638836 kubelet[2670]: E0321 14:10:26.638445 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.638836 kubelet[2670]: W0321 14:10:26.638454 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.638836 kubelet[2670]: E0321 14:10:26.638470 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:26.640133 kubelet[2670]: E0321 14:10:26.639780 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.640133 kubelet[2670]: W0321 14:10:26.639796 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.640133 kubelet[2670]: E0321 14:10:26.639810 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:26.641419 kubelet[2670]: E0321 14:10:26.641239 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.641419 kubelet[2670]: W0321 14:10:26.641262 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.641419 kubelet[2670]: E0321 14:10:26.641285 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:26.642331 kubelet[2670]: E0321 14:10:26.642243 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.642331 kubelet[2670]: W0321 14:10:26.642255 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.642331 kubelet[2670]: E0321 14:10:26.642265 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:26.642736 kubelet[2670]: E0321 14:10:26.642680 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.642736 kubelet[2670]: W0321 14:10:26.642692 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.642736 kubelet[2670]: E0321 14:10:26.642705 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:26.643198 kubelet[2670]: E0321 14:10:26.643081 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.643198 kubelet[2670]: W0321 14:10:26.643137 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.643198 kubelet[2670]: E0321 14:10:26.643151 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:26.645141 kubelet[2670]: E0321 14:10:26.643706 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.645141 kubelet[2670]: W0321 14:10:26.643718 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.645141 kubelet[2670]: E0321 14:10:26.643735 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:26.645141 kubelet[2670]: E0321 14:10:26.644357 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.645141 kubelet[2670]: W0321 14:10:26.644374 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.645141 kubelet[2670]: E0321 14:10:26.644389 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:26.645141 kubelet[2670]: E0321 14:10:26.644527 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.645141 kubelet[2670]: W0321 14:10:26.644537 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.645141 kubelet[2670]: E0321 14:10:26.644547 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:26.645141 kubelet[2670]: E0321 14:10:26.644673 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.645409 kubelet[2670]: W0321 14:10:26.644682 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.645409 kubelet[2670]: E0321 14:10:26.644690 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:26.645409 kubelet[2670]: E0321 14:10:26.644838 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.645409 kubelet[2670]: W0321 14:10:26.644847 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.645409 kubelet[2670]: E0321 14:10:26.644970 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.645409 kubelet[2670]: W0321 14:10:26.644978 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.645409 kubelet[2670]: E0321 14:10:26.644987 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:26.645409 kubelet[2670]: E0321 14:10:26.645009 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:26.645409 kubelet[2670]: E0321 14:10:26.645332 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:26.645409 kubelet[2670]: W0321 14:10:26.645341 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:26.646020 kubelet[2670]: E0321 14:10:26.645350 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.619452 kubelet[2670]: I0321 14:10:27.618483 2670 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 14:10:27.628633 kubelet[2670]: E0321 14:10:27.627554 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.628633 kubelet[2670]: W0321 14:10:27.628169 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.628633 kubelet[2670]: E0321 14:10:27.628224 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:27.629006 kubelet[2670]: E0321 14:10:27.628961 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.629094 kubelet[2670]: W0321 14:10:27.629045 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.629216 kubelet[2670]: E0321 14:10:27.629075 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.629975 kubelet[2670]: E0321 14:10:27.629900 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.629975 kubelet[2670]: W0321 14:10:27.629933 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.630201 kubelet[2670]: E0321 14:10:27.629995 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:27.631021 kubelet[2670]: E0321 14:10:27.630764 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.631021 kubelet[2670]: W0321 14:10:27.630818 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.631021 kubelet[2670]: E0321 14:10:27.630844 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.632177 kubelet[2670]: E0321 14:10:27.631511 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.632177 kubelet[2670]: W0321 14:10:27.631564 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.632177 kubelet[2670]: E0321 14:10:27.631589 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:27.632721 kubelet[2670]: E0321 14:10:27.632202 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.632721 kubelet[2670]: W0321 14:10:27.632226 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.632721 kubelet[2670]: E0321 14:10:27.632249 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.632964 kubelet[2670]: E0321 14:10:27.632776 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.632964 kubelet[2670]: W0321 14:10:27.632799 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.632964 kubelet[2670]: E0321 14:10:27.632821 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:27.634197 kubelet[2670]: E0321 14:10:27.633362 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.634197 kubelet[2670]: W0321 14:10:27.633397 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.634197 kubelet[2670]: E0321 14:10:27.633421 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.634197 kubelet[2670]: E0321 14:10:27.633970 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.634197 kubelet[2670]: W0321 14:10:27.634019 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.634197 kubelet[2670]: E0321 14:10:27.634044 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:27.634676 kubelet[2670]: E0321 14:10:27.634613 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.634676 kubelet[2670]: W0321 14:10:27.634635 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.634676 kubelet[2670]: E0321 14:10:27.634657 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.635978 kubelet[2670]: E0321 14:10:27.635165 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.635978 kubelet[2670]: W0321 14:10:27.635209 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.635978 kubelet[2670]: E0321 14:10:27.635233 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:27.635978 kubelet[2670]: E0321 14:10:27.635675 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.635978 kubelet[2670]: W0321 14:10:27.635697 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.635978 kubelet[2670]: E0321 14:10:27.635718 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.636422 kubelet[2670]: E0321 14:10:27.636266 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.636422 kubelet[2670]: W0321 14:10:27.636288 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.636422 kubelet[2670]: E0321 14:10:27.636311 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:27.637009 kubelet[2670]: E0321 14:10:27.636811 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.637009 kubelet[2670]: W0321 14:10:27.636843 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.637009 kubelet[2670]: E0321 14:10:27.636865 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.638401 kubelet[2670]: E0321 14:10:27.637472 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.638401 kubelet[2670]: W0321 14:10:27.637494 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.638401 kubelet[2670]: E0321 14:10:27.637517 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:27.648913 kubelet[2670]: E0321 14:10:27.648857 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.648913 kubelet[2670]: W0321 14:10:27.648882 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.648913 kubelet[2670]: E0321 14:10:27.648899 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.649252 kubelet[2670]: E0321 14:10:27.649186 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.649252 kubelet[2670]: W0321 14:10:27.649195 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.649252 kubelet[2670]: E0321 14:10:27.649216 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:27.649477 kubelet[2670]: E0321 14:10:27.649432 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.649477 kubelet[2670]: W0321 14:10:27.649442 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.649477 kubelet[2670]: E0321 14:10:27.649463 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.649729 kubelet[2670]: E0321 14:10:27.649659 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.649729 kubelet[2670]: W0321 14:10:27.649668 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.649729 kubelet[2670]: E0321 14:10:27.649680 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:27.650170 kubelet[2670]: E0321 14:10:27.649918 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.650170 kubelet[2670]: W0321 14:10:27.649927 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.650170 kubelet[2670]: E0321 14:10:27.649939 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.650170 kubelet[2670]: E0321 14:10:27.650128 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.650170 kubelet[2670]: W0321 14:10:27.650137 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.650670 kubelet[2670]: E0321 14:10:27.650236 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:27.650670 kubelet[2670]: E0321 14:10:27.650359 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.650670 kubelet[2670]: W0321 14:10:27.650367 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.650670 kubelet[2670]: E0321 14:10:27.650455 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.650670 kubelet[2670]: E0321 14:10:27.650657 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.650670 kubelet[2670]: W0321 14:10:27.650665 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.651047 kubelet[2670]: E0321 14:10:27.650751 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:27.651047 kubelet[2670]: E0321 14:10:27.650874 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.651047 kubelet[2670]: W0321 14:10:27.650883 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.651047 kubelet[2670]: E0321 14:10:27.650895 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.651408 kubelet[2670]: E0321 14:10:27.651362 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.651408 kubelet[2670]: W0321 14:10:27.651379 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.651408 kubelet[2670]: E0321 14:10:27.651392 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:27.651716 kubelet[2670]: E0321 14:10:27.651580 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.651716 kubelet[2670]: W0321 14:10:27.651589 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.651716 kubelet[2670]: E0321 14:10:27.651612 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.651897 kubelet[2670]: E0321 14:10:27.651798 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.651897 kubelet[2670]: W0321 14:10:27.651807 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.651897 kubelet[2670]: E0321 14:10:27.651829 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:27.652082 kubelet[2670]: E0321 14:10:27.652022 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.652082 kubelet[2670]: W0321 14:10:27.652033 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.652300 kubelet[2670]: E0321 14:10:27.652124 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.652427 kubelet[2670]: E0321 14:10:27.652395 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.652427 kubelet[2670]: W0321 14:10:27.652410 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.652596 kubelet[2670]: E0321 14:10:27.652437 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:27.652672 kubelet[2670]: E0321 14:10:27.652619 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.652672 kubelet[2670]: W0321 14:10:27.652629 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.652672 kubelet[2670]: E0321 14:10:27.652651 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.652872 kubelet[2670]: E0321 14:10:27.652853 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.652872 kubelet[2670]: W0321 14:10:27.652863 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.653026 kubelet[2670]: E0321 14:10:27.652885 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:27.653352 kubelet[2670]: E0321 14:10:27.653304 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.653352 kubelet[2670]: W0321 14:10:27.653322 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.653352 kubelet[2670]: E0321 14:10:27.653344 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 14:10:27.653578 kubelet[2670]: E0321 14:10:27.653523 2670 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 14:10:27.653578 kubelet[2670]: W0321 14:10:27.653533 2670 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 14:10:27.653578 kubelet[2670]: E0321 14:10:27.653542 2670 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 14:10:28.045038 containerd[1478]: time="2025-03-21T14:10:28.044836380Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:10:28.047507 containerd[1478]: time="2025-03-21T14:10:28.047210425Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5364011" Mar 21 14:10:28.048375 containerd[1478]: time="2025-03-21T14:10:28.048287692Z" level=info msg="ImageCreate event name:\"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:10:28.050598 containerd[1478]: time="2025-03-21T14:10:28.050550529Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:10:28.051600 containerd[1478]: time="2025-03-21T14:10:28.051147611Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6857075\" in 2.064964066s" Mar 21 14:10:28.051600 containerd[1478]: time="2025-03-21T14:10:28.051182977Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\"" Mar 21 14:10:28.053300 containerd[1478]: time="2025-03-21T14:10:28.053254708Z" level=info msg="CreateContainer within sandbox \"a6c44d1275d2c8a9f54a025eec4b45d418e0115fcec4445b55be87811ea008ad\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 21 14:10:28.072193 containerd[1478]: time="2025-03-21T14:10:28.071659886Z" level=info msg="Container b82b85837cc5cb31b80fa24e1690426e798204dc9f11abaa350ea6a7edcade19: CDI devices from CRI Config.CDIDevices: []" Mar 21 14:10:28.088420 containerd[1478]: time="2025-03-21T14:10:28.088377200Z" level=info msg="CreateContainer within sandbox \"a6c44d1275d2c8a9f54a025eec4b45d418e0115fcec4445b55be87811ea008ad\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"b82b85837cc5cb31b80fa24e1690426e798204dc9f11abaa350ea6a7edcade19\"" Mar 21 14:10:28.093230 containerd[1478]: time="2025-03-21T14:10:28.091178960Z" level=info msg="StartContainer for \"b82b85837cc5cb31b80fa24e1690426e798204dc9f11abaa350ea6a7edcade19\"" Mar 21 14:10:28.094671 containerd[1478]: time="2025-03-21T14:10:28.094626646Z" level=info msg="connecting to shim b82b85837cc5cb31b80fa24e1690426e798204dc9f11abaa350ea6a7edcade19" address="unix:///run/containerd/s/3e567712a613c8f48cf4002c1f837d661f1ae2e820307991d4ceee2cfc671e3a" protocol=ttrpc version=3 Mar 21 14:10:28.121280 systemd[1]: Started cri-containerd-b82b85837cc5cb31b80fa24e1690426e798204dc9f11abaa350ea6a7edcade19.scope - libcontainer container b82b85837cc5cb31b80fa24e1690426e798204dc9f11abaa350ea6a7edcade19. Mar 21 14:10:28.165992 containerd[1478]: time="2025-03-21T14:10:28.165868667Z" level=info msg="StartContainer for \"b82b85837cc5cb31b80fa24e1690426e798204dc9f11abaa350ea6a7edcade19\" returns successfully" Mar 21 14:10:28.173390 systemd[1]: cri-containerd-b82b85837cc5cb31b80fa24e1690426e798204dc9f11abaa350ea6a7edcade19.scope: Deactivated successfully. 
Mar 21 14:10:28.177432 containerd[1478]: time="2025-03-21T14:10:28.177389185Z" level=info msg="received exit event container_id:\"b82b85837cc5cb31b80fa24e1690426e798204dc9f11abaa350ea6a7edcade19\" id:\"b82b85837cc5cb31b80fa24e1690426e798204dc9f11abaa350ea6a7edcade19\" pid:3356 exited_at:{seconds:1742566228 nanos:176565219}" Mar 21 14:10:28.177559 containerd[1478]: time="2025-03-21T14:10:28.177470596Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b82b85837cc5cb31b80fa24e1690426e798204dc9f11abaa350ea6a7edcade19\" id:\"b82b85837cc5cb31b80fa24e1690426e798204dc9f11abaa350ea6a7edcade19\" pid:3356 exited_at:{seconds:1742566228 nanos:176565219}" Mar 21 14:10:28.200738 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b82b85837cc5cb31b80fa24e1690426e798204dc9f11abaa350ea6a7edcade19-rootfs.mount: Deactivated successfully. Mar 21 14:10:28.478755 kubelet[2670]: E0321 14:10:28.478657 2670 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hjpgs" podUID="d174cc7d-f7aa-43cf-981d-eab1c74e7f73" Mar 21 14:10:29.639510 containerd[1478]: time="2025-03-21T14:10:29.639434323Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 21 14:10:30.479296 kubelet[2670]: E0321 14:10:30.479214 2670 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hjpgs" podUID="d174cc7d-f7aa-43cf-981d-eab1c74e7f73" Mar 21 14:10:32.479335 kubelet[2670]: E0321 14:10:32.478507 2670 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hjpgs" podUID="d174cc7d-f7aa-43cf-981d-eab1c74e7f73" Mar 21 14:10:34.478359 kubelet[2670]: E0321 14:10:34.478305 2670 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hjpgs" podUID="d174cc7d-f7aa-43cf-981d-eab1c74e7f73" Mar 21 14:10:35.794465 containerd[1478]: time="2025-03-21T14:10:35.794053759Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:10:35.796687 containerd[1478]: time="2025-03-21T14:10:35.796235164Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=97781477" Mar 21 14:10:35.798676 containerd[1478]: time="2025-03-21T14:10:35.798461302Z" level=info msg="ImageCreate event name:\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:10:35.804292 containerd[1478]: time="2025-03-21T14:10:35.804237591Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:10:35.806163 containerd[1478]: time="2025-03-21T14:10:35.805849482Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"99274581\" in 6.164139226s" Mar 21 14:10:35.806163 containerd[1478]: time="2025-03-21T14:10:35.805914874Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\"" Mar 21 14:10:35.811149 containerd[1478]: time="2025-03-21T14:10:35.810921145Z" level=info msg="CreateContainer within sandbox \"a6c44d1275d2c8a9f54a025eec4b45d418e0115fcec4445b55be87811ea008ad\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 21 14:10:35.831210 containerd[1478]: time="2025-03-21T14:10:35.830756646Z" level=info msg="Container 704eab2b197d03f97cce01206b55940aa7370b288351c2253d3a60da7bb8f717: CDI devices from CRI Config.CDIDevices: []" Mar 21 14:10:35.860734 containerd[1478]: time="2025-03-21T14:10:35.860471921Z" level=info msg="CreateContainer within sandbox \"a6c44d1275d2c8a9f54a025eec4b45d418e0115fcec4445b55be87811ea008ad\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"704eab2b197d03f97cce01206b55940aa7370b288351c2253d3a60da7bb8f717\"" Mar 21 14:10:35.863056 containerd[1478]: time="2025-03-21T14:10:35.863017745Z" level=info msg="StartContainer for \"704eab2b197d03f97cce01206b55940aa7370b288351c2253d3a60da7bb8f717\"" Mar 21 14:10:35.869159 containerd[1478]: time="2025-03-21T14:10:35.868243857Z" level=info msg="connecting to shim 704eab2b197d03f97cce01206b55940aa7370b288351c2253d3a60da7bb8f717" address="unix:///run/containerd/s/3e567712a613c8f48cf4002c1f837d661f1ae2e820307991d4ceee2cfc671e3a" protocol=ttrpc version=3 Mar 21 14:10:35.901429 systemd[1]: Started cri-containerd-704eab2b197d03f97cce01206b55940aa7370b288351c2253d3a60da7bb8f717.scope - libcontainer container 704eab2b197d03f97cce01206b55940aa7370b288351c2253d3a60da7bb8f717. 
Mar 21 14:10:35.947458 containerd[1478]: time="2025-03-21T14:10:35.947421801Z" level=info msg="StartContainer for \"704eab2b197d03f97cce01206b55940aa7370b288351c2253d3a60da7bb8f717\" returns successfully" Mar 21 14:10:36.478681 kubelet[2670]: E0321 14:10:36.478630 2670 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hjpgs" podUID="d174cc7d-f7aa-43cf-981d-eab1c74e7f73" Mar 21 14:10:37.191482 kubelet[2670]: I0321 14:10:37.191322 2670 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 14:10:37.339204 containerd[1478]: time="2025-03-21T14:10:37.338874640Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 21 14:10:37.341226 systemd[1]: cri-containerd-704eab2b197d03f97cce01206b55940aa7370b288351c2253d3a60da7bb8f717.scope: Deactivated successfully. Mar 21 14:10:37.341642 systemd[1]: cri-containerd-704eab2b197d03f97cce01206b55940aa7370b288351c2253d3a60da7bb8f717.scope: Consumed 676ms CPU time, 172.1M memory peak, 154M written to disk. 
Mar 21 14:10:37.343810 containerd[1478]: time="2025-03-21T14:10:37.343731536Z" level=info msg="TaskExit event in podsandbox handler container_id:\"704eab2b197d03f97cce01206b55940aa7370b288351c2253d3a60da7bb8f717\" id:\"704eab2b197d03f97cce01206b55940aa7370b288351c2253d3a60da7bb8f717\" pid:3413 exited_at:{seconds:1742566237 nanos:342975414}"
Mar 21 14:10:37.343886 containerd[1478]: time="2025-03-21T14:10:37.343803031Z" level=info msg="received exit event container_id:\"704eab2b197d03f97cce01206b55940aa7370b288351c2253d3a60da7bb8f717\" id:\"704eab2b197d03f97cce01206b55940aa7370b288351c2253d3a60da7bb8f717\" pid:3413 exited_at:{seconds:1742566237 nanos:342975414}"
Mar 21 14:10:37.346205 kubelet[2670]: I0321 14:10:37.346185 2670 kubelet_node_status.go:488] "Fast updating node status as it just became ready"
Mar 21 14:10:37.377932 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-704eab2b197d03f97cce01206b55940aa7370b288351c2253d3a60da7bb8f717-rootfs.mount: Deactivated successfully.
Mar 21 14:10:37.701680 kubelet[2670]: W0321 14:10:37.561947 2670 reflector.go:561] object-"kube-system"/"coredns": failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:ci-9999-0-3-a-8593155e6d.novalocal" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'ci-9999-0-3-a-8593155e6d.novalocal' and this object
Mar 21 14:10:37.701680 kubelet[2670]: E0321 14:10:37.562036 2670 reflector.go:158] "Unhandled Error" err="object-\"kube-system\"/\"coredns\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"coredns\" is forbidden: User \"system:node:ci-9999-0-3-a-8593155e6d.novalocal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ci-9999-0-3-a-8593155e6d.novalocal' and this object" logger="UnhandledError"
Mar 21 14:10:37.701680 kubelet[2670]: W0321 14:10:37.562853 2670 reflector.go:561] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-9999-0-3-a-8593155e6d.novalocal" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-9999-0-3-a-8593155e6d.novalocal' and this object
Mar 21 14:10:37.701680 kubelet[2670]: W0321 14:10:37.562986 2670 reflector.go:561] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ci-9999-0-3-a-8593155e6d.novalocal" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-9999-0-3-a-8593155e6d.novalocal' and this object
Mar 21 14:10:37.564240 systemd[1]: Created slice kubepods-burstable-pod435c7f1f_c4ad_4ef2_b6d4_f14a405e9acb.slice - libcontainer container kubepods-burstable-pod435c7f1f_c4ad_4ef2_b6d4_f14a405e9acb.slice.
Mar 21 14:10:37.702896 kubelet[2670]: E0321 14:10:37.563050 2670 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-9999-0-3-a-8593155e6d.novalocal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-9999-0-3-a-8593155e6d.novalocal' and this object" logger="UnhandledError"
Mar 21 14:10:37.702896 kubelet[2670]: E0321 14:10:37.563075 2670 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:ci-9999-0-3-a-8593155e6d.novalocal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-9999-0-3-a-8593155e6d.novalocal' and this object" logger="UnhandledError"
Mar 21 14:10:37.702896 kubelet[2670]: I0321 14:10:37.618902 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/435c7f1f-c4ad-4ef2-b6d4-f14a405e9acb-config-volume\") pod \"coredns-6f6b679f8f-r67xt\" (UID: \"435c7f1f-c4ad-4ef2-b6d4-f14a405e9acb\") " pod="kube-system/coredns-6f6b679f8f-r67xt"
Mar 21 14:10:37.702896 kubelet[2670]: I0321 14:10:37.619044 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvpzc\" (UniqueName: \"kubernetes.io/projected/435c7f1f-c4ad-4ef2-b6d4-f14a405e9acb-kube-api-access-bvpzc\") pod \"coredns-6f6b679f8f-r67xt\" (UID: \"435c7f1f-c4ad-4ef2-b6d4-f14a405e9acb\") " pod="kube-system/coredns-6f6b679f8f-r67xt"
Mar 21 14:10:37.590530 systemd[1]: Created slice kubepods-besteffort-pod07ea2064_12f5_4627_841e_9de9d478ac15.slice - libcontainer container kubepods-besteffort-pod07ea2064_12f5_4627_841e_9de9d478ac15.slice.
Mar 21 14:10:37.598513 systemd[1]: Created slice kubepods-besteffort-pode17ef89e_c55d_492b_9a49_b9627b7fae60.slice - libcontainer container kubepods-besteffort-pode17ef89e_c55d_492b_9a49_b9627b7fae60.slice.
Mar 21 14:10:37.607189 systemd[1]: Created slice kubepods-besteffort-pod9b64e4a4_35f0_4d4f_a067_2ba9c47f8c1c.slice - libcontainer container kubepods-besteffort-pod9b64e4a4_35f0_4d4f_a067_2ba9c47f8c1c.slice.
Mar 21 14:10:37.612567 systemd[1]: Created slice kubepods-burstable-podc70ee33c_9a91_4431_b069_c6bd0541ec37.slice - libcontainer container kubepods-burstable-podc70ee33c_9a91_4431_b069_c6bd0541ec37.slice.
Mar 21 14:10:37.723171 kubelet[2670]: I0321 14:10:37.720365 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfrhk\" (UniqueName: \"kubernetes.io/projected/e17ef89e-c55d-492b-9a49-b9627b7fae60-kube-api-access-sfrhk\") pod \"calico-kube-controllers-86bc8bd8b8-x9w8m\" (UID: \"e17ef89e-c55d-492b-9a49-b9627b7fae60\") " pod="calico-system/calico-kube-controllers-86bc8bd8b8-x9w8m"
Mar 21 14:10:37.723171 kubelet[2670]: I0321 14:10:37.720508 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7tfg\" (UniqueName: \"kubernetes.io/projected/c70ee33c-9a91-4431-b069-c6bd0541ec37-kube-api-access-s7tfg\") pod \"coredns-6f6b679f8f-c855z\" (UID: \"c70ee33c-9a91-4431-b069-c6bd0541ec37\") " pod="kube-system/coredns-6f6b679f8f-c855z"
Mar 21 14:10:37.723171 kubelet[2670]: I0321 14:10:37.720559 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xglq9\" (UniqueName: \"kubernetes.io/projected/07ea2064-12f5-4627-841e-9de9d478ac15-kube-api-access-xglq9\") pod \"calico-apiserver-6575f8d7d9-l7xjk\" (UID: \"07ea2064-12f5-4627-841e-9de9d478ac15\") " pod="calico-apiserver/calico-apiserver-6575f8d7d9-l7xjk"
Mar 21 14:10:37.723171 kubelet[2670]: I0321 14:10:37.720666 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e17ef89e-c55d-492b-9a49-b9627b7fae60-tigera-ca-bundle\") pod \"calico-kube-controllers-86bc8bd8b8-x9w8m\" (UID: \"e17ef89e-c55d-492b-9a49-b9627b7fae60\") " pod="calico-system/calico-kube-controllers-86bc8bd8b8-x9w8m"
Mar 21 14:10:37.723171 kubelet[2670]: I0321 14:10:37.720717 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx7br\" (UniqueName: \"kubernetes.io/projected/9b64e4a4-35f0-4d4f-a067-2ba9c47f8c1c-kube-api-access-sx7br\") pod \"calico-apiserver-6575f8d7d9-qrm8s\" (UID: \"9b64e4a4-35f0-4d4f-a067-2ba9c47f8c1c\") " pod="calico-apiserver/calico-apiserver-6575f8d7d9-qrm8s"
Mar 21 14:10:37.723643 kubelet[2670]: I0321 14:10:37.720760 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9b64e4a4-35f0-4d4f-a067-2ba9c47f8c1c-calico-apiserver-certs\") pod \"calico-apiserver-6575f8d7d9-qrm8s\" (UID: \"9b64e4a4-35f0-4d4f-a067-2ba9c47f8c1c\") " pod="calico-apiserver/calico-apiserver-6575f8d7d9-qrm8s"
Mar 21 14:10:37.723643 kubelet[2670]: I0321 14:10:37.720828 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/07ea2064-12f5-4627-841e-9de9d478ac15-calico-apiserver-certs\") pod \"calico-apiserver-6575f8d7d9-l7xjk\" (UID: \"07ea2064-12f5-4627-841e-9de9d478ac15\") " pod="calico-apiserver/calico-apiserver-6575f8d7d9-l7xjk"
Mar 21 14:10:37.723643 kubelet[2670]: I0321 14:10:37.720876 2670 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c70ee33c-9a91-4431-b069-c6bd0541ec37-config-volume\") pod \"coredns-6f6b679f8f-c855z\" (UID: \"c70ee33c-9a91-4431-b069-c6bd0541ec37\") " pod="kube-system/coredns-6f6b679f8f-c855z"
Mar 21 14:10:38.307801 containerd[1478]: time="2025-03-21T14:10:38.306516125Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86bc8bd8b8-x9w8m,Uid:e17ef89e-c55d-492b-9a49-b9627b7fae60,Namespace:calico-system,Attempt:0,}"
Mar 21 14:10:38.391717 containerd[1478]: time="2025-03-21T14:10:38.391675165Z" level=error msg="Failed to destroy network for sandbox \"498346ee2220ea608c9ebcb427da65db355f98b4b5a40ef0b35f6a53f159d620\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:38.394654 containerd[1478]: time="2025-03-21T14:10:38.394557102Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86bc8bd8b8-x9w8m,Uid:e17ef89e-c55d-492b-9a49-b9627b7fae60,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"498346ee2220ea608c9ebcb427da65db355f98b4b5a40ef0b35f6a53f159d620\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:38.394853 kubelet[2670]: E0321 14:10:38.394779 2670 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"498346ee2220ea608c9ebcb427da65db355f98b4b5a40ef0b35f6a53f159d620\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:38.394911 kubelet[2670]: E0321 14:10:38.394874 2670 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"498346ee2220ea608c9ebcb427da65db355f98b4b5a40ef0b35f6a53f159d620\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86bc8bd8b8-x9w8m"
Mar 21 14:10:38.394911 kubelet[2670]: E0321 14:10:38.394897 2670 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"498346ee2220ea608c9ebcb427da65db355f98b4b5a40ef0b35f6a53f159d620\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86bc8bd8b8-x9w8m"
Mar 21 14:10:38.394975 kubelet[2670]: E0321 14:10:38.394939 2670 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-86bc8bd8b8-x9w8m_calico-system(e17ef89e-c55d-492b-9a49-b9627b7fae60)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-86bc8bd8b8-x9w8m_calico-system(e17ef89e-c55d-492b-9a49-b9627b7fae60)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"498346ee2220ea608c9ebcb427da65db355f98b4b5a40ef0b35f6a53f159d620\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86bc8bd8b8-x9w8m" podUID="e17ef89e-c55d-492b-9a49-b9627b7fae60"
Mar 21 14:10:38.395836 systemd[1]: run-netns-cni\x2dc16f114a\x2da05c\x2dba83\x2d3e2f\x2d904974ad6d81.mount: Deactivated successfully.
Mar 21 14:10:38.485345 systemd[1]: Created slice kubepods-besteffort-podd174cc7d_f7aa_43cf_981d_eab1c74e7f73.slice - libcontainer container kubepods-besteffort-podd174cc7d_f7aa_43cf_981d_eab1c74e7f73.slice.
Mar 21 14:10:38.489528 containerd[1478]: time="2025-03-21T14:10:38.489469605Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hjpgs,Uid:d174cc7d-f7aa-43cf-981d-eab1c74e7f73,Namespace:calico-system,Attempt:0,}"
Mar 21 14:10:38.567398 containerd[1478]: time="2025-03-21T14:10:38.567256585Z" level=error msg="Failed to destroy network for sandbox \"51fe51f550cf9f91f336c24d6127fcb2176bb922bf035f7e60d6baa9df6e31d3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:38.571907 containerd[1478]: time="2025-03-21T14:10:38.571807752Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hjpgs,Uid:d174cc7d-f7aa-43cf-981d-eab1c74e7f73,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"51fe51f550cf9f91f336c24d6127fcb2176bb922bf035f7e60d6baa9df6e31d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:38.573033 kubelet[2670]: E0321 14:10:38.572135 2670 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"51fe51f550cf9f91f336c24d6127fcb2176bb922bf035f7e60d6baa9df6e31d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:38.573033 kubelet[2670]: E0321 14:10:38.572235 2670 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"51fe51f550cf9f91f336c24d6127fcb2176bb922bf035f7e60d6baa9df6e31d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hjpgs"
Mar 21 14:10:38.573033 kubelet[2670]: E0321 14:10:38.572259 2670 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"51fe51f550cf9f91f336c24d6127fcb2176bb922bf035f7e60d6baa9df6e31d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hjpgs"
Mar 21 14:10:38.573235 kubelet[2670]: E0321 14:10:38.572306 2670 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-hjpgs_calico-system(d174cc7d-f7aa-43cf-981d-eab1c74e7f73)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-hjpgs_calico-system(d174cc7d-f7aa-43cf-981d-eab1c74e7f73)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"51fe51f550cf9f91f336c24d6127fcb2176bb922bf035f7e60d6baa9df6e31d3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-hjpgs" podUID="d174cc7d-f7aa-43cf-981d-eab1c74e7f73"
Mar 21 14:10:38.573391 systemd[1]: run-netns-cni\x2d7c2276ed\x2dd727\x2dd227\x2de009\x2d635627733bc5.mount: Deactivated successfully.
Mar 21 14:10:38.690303 containerd[1478]: time="2025-03-21T14:10:38.689076924Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\""
Mar 21 14:10:38.905575 containerd[1478]: time="2025-03-21T14:10:38.905481812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6575f8d7d9-qrm8s,Uid:9b64e4a4-35f0-4d4f-a067-2ba9c47f8c1c,Namespace:calico-apiserver,Attempt:0,}"
Mar 21 14:10:38.911222 containerd[1478]: time="2025-03-21T14:10:38.910678065Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6575f8d7d9-l7xjk,Uid:07ea2064-12f5-4627-841e-9de9d478ac15,Namespace:calico-apiserver,Attempt:0,}"
Mar 21 14:10:38.912174 containerd[1478]: time="2025-03-21T14:10:38.911588076Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-r67xt,Uid:435c7f1f-c4ad-4ef2-b6d4-f14a405e9acb,Namespace:kube-system,Attempt:0,}"
Mar 21 14:10:38.915791 containerd[1478]: time="2025-03-21T14:10:38.915268586Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-c855z,Uid:c70ee33c-9a91-4431-b069-c6bd0541ec37,Namespace:kube-system,Attempt:0,}"
Mar 21 14:10:39.062733 containerd[1478]: time="2025-03-21T14:10:39.062682098Z" level=error msg="Failed to destroy network for sandbox \"101cb15eda15f14811d3f4ac67d8e005a907628f2af341b9143592863ca80cf0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:39.065233 containerd[1478]: time="2025-03-21T14:10:39.065185078Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6575f8d7d9-l7xjk,Uid:07ea2064-12f5-4627-841e-9de9d478ac15,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"101cb15eda15f14811d3f4ac67d8e005a907628f2af341b9143592863ca80cf0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:39.066000 kubelet[2670]: E0321 14:10:39.065945 2670 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"101cb15eda15f14811d3f4ac67d8e005a907628f2af341b9143592863ca80cf0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:39.066914 kubelet[2670]: E0321 14:10:39.066005 2670 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"101cb15eda15f14811d3f4ac67d8e005a907628f2af341b9143592863ca80cf0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6575f8d7d9-l7xjk"
Mar 21 14:10:39.066914 kubelet[2670]: E0321 14:10:39.066029 2670 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"101cb15eda15f14811d3f4ac67d8e005a907628f2af341b9143592863ca80cf0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6575f8d7d9-l7xjk"
Mar 21 14:10:39.066914 kubelet[2670]: E0321 14:10:39.066076 2670 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6575f8d7d9-l7xjk_calico-apiserver(07ea2064-12f5-4627-841e-9de9d478ac15)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6575f8d7d9-l7xjk_calico-apiserver(07ea2064-12f5-4627-841e-9de9d478ac15)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"101cb15eda15f14811d3f4ac67d8e005a907628f2af341b9143592863ca80cf0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6575f8d7d9-l7xjk" podUID="07ea2064-12f5-4627-841e-9de9d478ac15"
Mar 21 14:10:39.078859 containerd[1478]: time="2025-03-21T14:10:39.078803228Z" level=error msg="Failed to destroy network for sandbox \"a00723832d6485d2d62e2a0176525efd9c8761906054a675b55386b02f4bbf7c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:39.080803 containerd[1478]: time="2025-03-21T14:10:39.080756921Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-c855z,Uid:c70ee33c-9a91-4431-b069-c6bd0541ec37,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a00723832d6485d2d62e2a0176525efd9c8761906054a675b55386b02f4bbf7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:39.081201 kubelet[2670]: E0321 14:10:39.081168 2670 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a00723832d6485d2d62e2a0176525efd9c8761906054a675b55386b02f4bbf7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:39.081342 kubelet[2670]: E0321 14:10:39.081324 2670 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a00723832d6485d2d62e2a0176525efd9c8761906054a675b55386b02f4bbf7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-c855z"
Mar 21 14:10:39.081456 kubelet[2670]: E0321 14:10:39.081437 2670 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a00723832d6485d2d62e2a0176525efd9c8761906054a675b55386b02f4bbf7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-c855z"
Mar 21 14:10:39.081594 kubelet[2670]: E0321 14:10:39.081560 2670 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-c855z_kube-system(c70ee33c-9a91-4431-b069-c6bd0541ec37)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-c855z_kube-system(c70ee33c-9a91-4431-b069-c6bd0541ec37)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a00723832d6485d2d62e2a0176525efd9c8761906054a675b55386b02f4bbf7c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-c855z" podUID="c70ee33c-9a91-4431-b069-c6bd0541ec37"
Mar 21 14:10:39.082284 containerd[1478]: time="2025-03-21T14:10:39.082232289Z" level=error msg="Failed to destroy network for sandbox \"bb573a909f4b85f985343dec83607c4fabdb0c00d4dae796edbd858a10c1fc35\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:39.084332 containerd[1478]: time="2025-03-21T14:10:39.084254891Z" level=error msg="Failed to destroy network for sandbox \"aeb99650bb10eacf717e94ba02155bb0faf44345bf1136c669e80687201696a2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:39.084499 containerd[1478]: time="2025-03-21T14:10:39.084455135Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-r67xt,Uid:435c7f1f-c4ad-4ef2-b6d4-f14a405e9acb,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb573a909f4b85f985343dec83607c4fabdb0c00d4dae796edbd858a10c1fc35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:39.084838 kubelet[2670]: E0321 14:10:39.084814 2670 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb573a909f4b85f985343dec83607c4fabdb0c00d4dae796edbd858a10c1fc35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:39.084980 kubelet[2670]: E0321 14:10:39.084958 2670 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb573a909f4b85f985343dec83607c4fabdb0c00d4dae796edbd858a10c1fc35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-r67xt"
Mar 21 14:10:39.085079 kubelet[2670]: E0321 14:10:39.085061 2670 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb573a909f4b85f985343dec83607c4fabdb0c00d4dae796edbd858a10c1fc35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-r67xt"
Mar 21 14:10:39.085307 kubelet[2670]: E0321 14:10:39.085231 2670 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-r67xt_kube-system(435c7f1f-c4ad-4ef2-b6d4-f14a405e9acb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-r67xt_kube-system(435c7f1f-c4ad-4ef2-b6d4-f14a405e9acb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bb573a909f4b85f985343dec83607c4fabdb0c00d4dae796edbd858a10c1fc35\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-r67xt" podUID="435c7f1f-c4ad-4ef2-b6d4-f14a405e9acb"
Mar 21 14:10:39.085597 containerd[1478]: time="2025-03-21T14:10:39.085524233Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6575f8d7d9-qrm8s,Uid:9b64e4a4-35f0-4d4f-a067-2ba9c47f8c1c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"aeb99650bb10eacf717e94ba02155bb0faf44345bf1136c669e80687201696a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:39.085919 kubelet[2670]: E0321 14:10:39.085705 2670 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aeb99650bb10eacf717e94ba02155bb0faf44345bf1136c669e80687201696a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 14:10:39.085919 kubelet[2670]: E0321 14:10:39.085747 2670 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aeb99650bb10eacf717e94ba02155bb0faf44345bf1136c669e80687201696a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6575f8d7d9-qrm8s"
Mar 21 14:10:39.085919 kubelet[2670]: E0321 14:10:39.085764 2670 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aeb99650bb10eacf717e94ba02155bb0faf44345bf1136c669e80687201696a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6575f8d7d9-qrm8s"
Mar 21 14:10:39.086030 kubelet[2670]: E0321 14:10:39.085796 2670 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6575f8d7d9-qrm8s_calico-apiserver(9b64e4a4-35f0-4d4f-a067-2ba9c47f8c1c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6575f8d7d9-qrm8s_calico-apiserver(9b64e4a4-35f0-4d4f-a067-2ba9c47f8c1c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aeb99650bb10eacf717e94ba02155bb0faf44345bf1136c669e80687201696a2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6575f8d7d9-qrm8s" podUID="9b64e4a4-35f0-4d4f-a067-2ba9c47f8c1c"
Mar 21 14:10:47.035066 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2570869081.mount: Deactivated successfully.
Mar 21 14:10:47.080087 containerd[1478]: time="2025-03-21T14:10:47.079973514Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 14:10:47.081206 containerd[1478]: time="2025-03-21T14:10:47.081134627Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=142241445"
Mar 21 14:10:47.082529 containerd[1478]: time="2025-03-21T14:10:47.082372073Z" level=info msg="ImageCreate event name:\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 14:10:47.085063 containerd[1478]: time="2025-03-21T14:10:47.084975557Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 14:10:47.085589 containerd[1478]: time="2025-03-21T14:10:47.085493917Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"142241307\" in 8.396305655s"
Mar 21 14:10:47.085589 containerd[1478]: time="2025-03-21T14:10:47.085543791Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\""
Mar 21 14:10:47.106156 containerd[1478]: time="2025-03-21T14:10:47.103677617Z" level=info msg="CreateContainer within sandbox \"a6c44d1275d2c8a9f54a025eec4b45d418e0115fcec4445b55be87811ea008ad\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Mar 21 14:10:47.119992 containerd[1478]: time="2025-03-21T14:10:47.119945961Z" level=info msg="Container 4ab7305f631276ca9e5b88c5ff2cf46bb6f6b1e67999517333e120229865f169: CDI devices from CRI Config.CDIDevices: []"
Mar 21 14:10:47.139711 containerd[1478]: time="2025-03-21T14:10:47.139547094Z" level=info msg="CreateContainer within sandbox \"a6c44d1275d2c8a9f54a025eec4b45d418e0115fcec4445b55be87811ea008ad\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"4ab7305f631276ca9e5b88c5ff2cf46bb6f6b1e67999517333e120229865f169\""
Mar 21 14:10:47.141200 containerd[1478]: time="2025-03-21T14:10:47.141159422Z" level=info msg="StartContainer for \"4ab7305f631276ca9e5b88c5ff2cf46bb6f6b1e67999517333e120229865f169\""
Mar 21 14:10:47.142983 containerd[1478]: time="2025-03-21T14:10:47.142948762Z" level=info msg="connecting to shim 4ab7305f631276ca9e5b88c5ff2cf46bb6f6b1e67999517333e120229865f169" address="unix:///run/containerd/s/3e567712a613c8f48cf4002c1f837d661f1ae2e820307991d4ceee2cfc671e3a" protocol=ttrpc version=3
Mar 21 14:10:47.169285 systemd[1]: Started cri-containerd-4ab7305f631276ca9e5b88c5ff2cf46bb6f6b1e67999517333e120229865f169.scope - libcontainer container 4ab7305f631276ca9e5b88c5ff2cf46bb6f6b1e67999517333e120229865f169.
Mar 21 14:10:47.221620 containerd[1478]: time="2025-03-21T14:10:47.221584096Z" level=info msg="StartContainer for \"4ab7305f631276ca9e5b88c5ff2cf46bb6f6b1e67999517333e120229865f169\" returns successfully"
Mar 21 14:10:47.288318 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information.
Mar 21 14:10:47.288624 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved.
Mar 21 14:10:48.987149 kernel: bpftool[3828]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 21 14:10:49.258090 systemd-networkd[1379]: vxlan.calico: Link UP Mar 21 14:10:49.258099 systemd-networkd[1379]: vxlan.calico: Gained carrier Mar 21 14:10:49.480799 containerd[1478]: time="2025-03-21T14:10:49.480719456Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6575f8d7d9-qrm8s,Uid:9b64e4a4-35f0-4d4f-a067-2ba9c47f8c1c,Namespace:calico-apiserver,Attempt:0,}" Mar 21 14:10:49.482183 containerd[1478]: time="2025-03-21T14:10:49.481287229Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86bc8bd8b8-x9w8m,Uid:e17ef89e-c55d-492b-9a49-b9627b7fae60,Namespace:calico-system,Attempt:0,}" Mar 21 14:10:50.360359 systemd-networkd[1379]: calid1a95c753a1: Link UP Mar 21 14:10:50.360834 systemd-networkd[1379]: calid1a95c753a1: Gained carrier Mar 21 14:10:50.372420 kubelet[2670]: I0321 14:10:50.372247 2670 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-pbxct" podStartSLOduration=4.09720702 podStartE2EDuration="28.372227697s" podCreationTimestamp="2025-03-21 14:10:22 +0000 UTC" firstStartedPulling="2025-03-21 14:10:22.811879793 +0000 UTC m=+17.426054870" lastFinishedPulling="2025-03-21 14:10:47.08690046 +0000 UTC m=+41.701075547" observedRunningTime="2025-03-21 14:10:47.754959868 +0000 UTC m=+42.369135075" watchObservedRunningTime="2025-03-21 14:10:50.372227697 +0000 UTC m=+44.986402784" Mar 21 14:10:50.374861 containerd[1478]: 2025-03-21 14:10:50.259 [INFO][3898] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--qrm8s-eth0 calico-apiserver-6575f8d7d9- calico-apiserver 9b64e4a4-35f0-4d4f-a067-2ba9c47f8c1c 691 0 2025-03-21 14:10:22 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver 
pod-template-hash:6575f8d7d9 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-9999-0-3-a-8593155e6d.novalocal calico-apiserver-6575f8d7d9-qrm8s eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid1a95c753a1 [] []}} ContainerID="e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" Namespace="calico-apiserver" Pod="calico-apiserver-6575f8d7d9-qrm8s" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--qrm8s-" Mar 21 14:10:50.374861 containerd[1478]: 2025-03-21 14:10:50.260 [INFO][3898] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" Namespace="calico-apiserver" Pod="calico-apiserver-6575f8d7d9-qrm8s" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--qrm8s-eth0" Mar 21 14:10:50.374861 containerd[1478]: 2025-03-21 14:10:50.309 [INFO][3922] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" HandleID="k8s-pod-network.e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--qrm8s-eth0" Mar 21 14:10:50.375057 containerd[1478]: 2025-03-21 14:10:50.321 [INFO][3922] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" HandleID="k8s-pod-network.e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--qrm8s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003bb340), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-9999-0-3-a-8593155e6d.novalocal", 
"pod":"calico-apiserver-6575f8d7d9-qrm8s", "timestamp":"2025-03-21 14:10:50.309763196 +0000 UTC"}, Hostname:"ci-9999-0-3-a-8593155e6d.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 14:10:50.375057 containerd[1478]: 2025-03-21 14:10:50.321 [INFO][3922] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 21 14:10:50.375057 containerd[1478]: 2025-03-21 14:10:50.321 [INFO][3922] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 21 14:10:50.375057 containerd[1478]: 2025-03-21 14:10:50.321 [INFO][3922] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-3-a-8593155e6d.novalocal' Mar 21 14:10:50.375057 containerd[1478]: 2025-03-21 14:10:50.323 [INFO][3922] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.375057 containerd[1478]: 2025-03-21 14:10:50.326 [INFO][3922] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.375057 containerd[1478]: 2025-03-21 14:10:50.331 [INFO][3922] ipam/ipam.go 489: Trying affinity for 192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.375057 containerd[1478]: 2025-03-21 14:10:50.333 [INFO][3922] ipam/ipam.go 155: Attempting to load block cidr=192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.375057 containerd[1478]: 2025-03-21 14:10:50.335 [INFO][3922] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.376474 containerd[1478]: 2025-03-21 14:10:50.335 [INFO][3922] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.95.128/26 
handle="k8s-pod-network.e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.376474 containerd[1478]: 2025-03-21 14:10:50.337 [INFO][3922] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa Mar 21 14:10:50.376474 containerd[1478]: 2025-03-21 14:10:50.341 [INFO][3922] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.95.128/26 handle="k8s-pod-network.e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.376474 containerd[1478]: 2025-03-21 14:10:50.349 [INFO][3922] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.95.129/26] block=192.168.95.128/26 handle="k8s-pod-network.e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.376474 containerd[1478]: 2025-03-21 14:10:50.349 [INFO][3922] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.95.129/26] handle="k8s-pod-network.e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.376474 containerd[1478]: 2025-03-21 14:10:50.349 [INFO][3922] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 21 14:10:50.376474 containerd[1478]: 2025-03-21 14:10:50.349 [INFO][3922] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.95.129/26] IPv6=[] ContainerID="e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" HandleID="k8s-pod-network.e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--qrm8s-eth0" Mar 21 14:10:50.376639 containerd[1478]: 2025-03-21 14:10:50.353 [INFO][3898] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" Namespace="calico-apiserver" Pod="calico-apiserver-6575f8d7d9-qrm8s" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--qrm8s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--qrm8s-eth0", GenerateName:"calico-apiserver-6575f8d7d9-", Namespace:"calico-apiserver", SelfLink:"", UID:"9b64e4a4-35f0-4d4f-a067-2ba9c47f8c1c", ResourceVersion:"691", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 14, 10, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6575f8d7d9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-a-8593155e6d.novalocal", ContainerID:"", Pod:"calico-apiserver-6575f8d7d9-qrm8s", Endpoint:"eth0", 
ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid1a95c753a1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 14:10:50.376718 containerd[1478]: 2025-03-21 14:10:50.353 [INFO][3898] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.95.129/32] ContainerID="e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" Namespace="calico-apiserver" Pod="calico-apiserver-6575f8d7d9-qrm8s" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--qrm8s-eth0" Mar 21 14:10:50.376718 containerd[1478]: 2025-03-21 14:10:50.353 [INFO][3898] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid1a95c753a1 ContainerID="e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" Namespace="calico-apiserver" Pod="calico-apiserver-6575f8d7d9-qrm8s" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--qrm8s-eth0" Mar 21 14:10:50.376718 containerd[1478]: 2025-03-21 14:10:50.361 [INFO][3898] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" Namespace="calico-apiserver" Pod="calico-apiserver-6575f8d7d9-qrm8s" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--qrm8s-eth0" Mar 21 14:10:50.376794 containerd[1478]: 2025-03-21 14:10:50.362 [INFO][3898] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" Namespace="calico-apiserver" Pod="calico-apiserver-6575f8d7d9-qrm8s" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--qrm8s-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--qrm8s-eth0", GenerateName:"calico-apiserver-6575f8d7d9-", Namespace:"calico-apiserver", SelfLink:"", UID:"9b64e4a4-35f0-4d4f-a067-2ba9c47f8c1c", ResourceVersion:"691", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 14, 10, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6575f8d7d9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-a-8593155e6d.novalocal", ContainerID:"e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa", Pod:"calico-apiserver-6575f8d7d9-qrm8s", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid1a95c753a1", MAC:"ee:b1:90:e1:4d:86", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 14:10:50.376862 containerd[1478]: 2025-03-21 14:10:50.373 [INFO][3898] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" Namespace="calico-apiserver" Pod="calico-apiserver-6575f8d7d9-qrm8s" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--qrm8s-eth0" Mar 21 
14:10:50.456953 containerd[1478]: time="2025-03-21T14:10:50.455886918Z" level=info msg="connecting to shim e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa" address="unix:///run/containerd/s/b57ebba6997417564badcc759112ed765db88ff8da511c56e6ec6d4aa1f20da0" namespace=k8s.io protocol=ttrpc version=3 Mar 21 14:10:50.481178 systemd-networkd[1379]: caliaa600beba15: Link UP Mar 21 14:10:50.484194 containerd[1478]: time="2025-03-21T14:10:50.484144333Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hjpgs,Uid:d174cc7d-f7aa-43cf-981d-eab1c74e7f73,Namespace:calico-system,Attempt:0,}" Mar 21 14:10:50.487410 systemd-networkd[1379]: caliaa600beba15: Gained carrier Mar 21 14:10:50.512973 containerd[1478]: 2025-03-21 14:10:50.259 [INFO][3900] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--kube--controllers--86bc8bd8b8--x9w8m-eth0 calico-kube-controllers-86bc8bd8b8- calico-system e17ef89e-c55d-492b-9a49-b9627b7fae60 688 0 2025-03-21 14:10:22 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:86bc8bd8b8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-9999-0-3-a-8593155e6d.novalocal calico-kube-controllers-86bc8bd8b8-x9w8m eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] caliaa600beba15 [] []}} ContainerID="d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" Namespace="calico-system" Pod="calico-kube-controllers-86bc8bd8b8-x9w8m" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--kube--controllers--86bc8bd8b8--x9w8m-" Mar 21 14:10:50.512973 containerd[1478]: 2025-03-21 14:10:50.260 [INFO][3900] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" Namespace="calico-system" Pod="calico-kube-controllers-86bc8bd8b8-x9w8m" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--kube--controllers--86bc8bd8b8--x9w8m-eth0" Mar 21 14:10:50.512973 containerd[1478]: 2025-03-21 14:10:50.312 [INFO][3924] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" HandleID="k8s-pod-network.d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--kube--controllers--86bc8bd8b8--x9w8m-eth0" Mar 21 14:10:50.513499 containerd[1478]: 2025-03-21 14:10:50.322 [INFO][3924] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" HandleID="k8s-pod-network.d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--kube--controllers--86bc8bd8b8--x9w8m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00042cb30), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-0-3-a-8593155e6d.novalocal", "pod":"calico-kube-controllers-86bc8bd8b8-x9w8m", "timestamp":"2025-03-21 14:10:50.312311558 +0000 UTC"}, Hostname:"ci-9999-0-3-a-8593155e6d.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 14:10:50.513499 containerd[1478]: 2025-03-21 14:10:50.322 [INFO][3924] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 21 14:10:50.513499 containerd[1478]: 2025-03-21 14:10:50.349 [INFO][3924] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 21 14:10:50.513499 containerd[1478]: 2025-03-21 14:10:50.349 [INFO][3924] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-3-a-8593155e6d.novalocal' Mar 21 14:10:50.513499 containerd[1478]: 2025-03-21 14:10:50.425 [INFO][3924] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.513499 containerd[1478]: 2025-03-21 14:10:50.430 [INFO][3924] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.513499 containerd[1478]: 2025-03-21 14:10:50.435 [INFO][3924] ipam/ipam.go 489: Trying affinity for 192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.513499 containerd[1478]: 2025-03-21 14:10:50.438 [INFO][3924] ipam/ipam.go 155: Attempting to load block cidr=192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.513499 containerd[1478]: 2025-03-21 14:10:50.441 [INFO][3924] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.514402 containerd[1478]: 2025-03-21 14:10:50.441 [INFO][3924] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.95.128/26 handle="k8s-pod-network.d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.514402 containerd[1478]: 2025-03-21 14:10:50.443 [INFO][3924] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302 Mar 21 14:10:50.514402 containerd[1478]: 2025-03-21 14:10:50.453 [INFO][3924] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.95.128/26 handle="k8s-pod-network.d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.514402 
containerd[1478]: 2025-03-21 14:10:50.470 [INFO][3924] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.95.130/26] block=192.168.95.128/26 handle="k8s-pod-network.d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.514402 containerd[1478]: 2025-03-21 14:10:50.470 [INFO][3924] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.95.130/26] handle="k8s-pod-network.d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.514402 containerd[1478]: 2025-03-21 14:10:50.470 [INFO][3924] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 21 14:10:50.514402 containerd[1478]: 2025-03-21 14:10:50.470 [INFO][3924] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.95.130/26] IPv6=[] ContainerID="d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" HandleID="k8s-pod-network.d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--kube--controllers--86bc8bd8b8--x9w8m-eth0" Mar 21 14:10:50.514566 containerd[1478]: 2025-03-21 14:10:50.472 [INFO][3900] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" Namespace="calico-system" Pod="calico-kube-controllers-86bc8bd8b8-x9w8m" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--kube--controllers--86bc8bd8b8--x9w8m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--kube--controllers--86bc8bd8b8--x9w8m-eth0", GenerateName:"calico-kube-controllers-86bc8bd8b8-", Namespace:"calico-system", SelfLink:"", UID:"e17ef89e-c55d-492b-9a49-b9627b7fae60", ResourceVersion:"688", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 14, 10, 22, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"86bc8bd8b8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-a-8593155e6d.novalocal", ContainerID:"", Pod:"calico-kube-controllers-86bc8bd8b8-x9w8m", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.95.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"caliaa600beba15", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 14:10:50.514636 containerd[1478]: 2025-03-21 14:10:50.472 [INFO][3900] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.95.130/32] ContainerID="d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" Namespace="calico-system" Pod="calico-kube-controllers-86bc8bd8b8-x9w8m" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--kube--controllers--86bc8bd8b8--x9w8m-eth0" Mar 21 14:10:50.514636 containerd[1478]: 2025-03-21 14:10:50.472 [INFO][3900] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaa600beba15 ContainerID="d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" Namespace="calico-system" Pod="calico-kube-controllers-86bc8bd8b8-x9w8m" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--kube--controllers--86bc8bd8b8--x9w8m-eth0" Mar 21 14:10:50.514636 containerd[1478]: 2025-03-21 14:10:50.489 [INFO][3900] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" Namespace="calico-system" Pod="calico-kube-controllers-86bc8bd8b8-x9w8m" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--kube--controllers--86bc8bd8b8--x9w8m-eth0" Mar 21 14:10:50.514711 containerd[1478]: 2025-03-21 14:10:50.491 [INFO][3900] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" Namespace="calico-system" Pod="calico-kube-controllers-86bc8bd8b8-x9w8m" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--kube--controllers--86bc8bd8b8--x9w8m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--kube--controllers--86bc8bd8b8--x9w8m-eth0", GenerateName:"calico-kube-controllers-86bc8bd8b8-", Namespace:"calico-system", SelfLink:"", UID:"e17ef89e-c55d-492b-9a49-b9627b7fae60", ResourceVersion:"688", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 14, 10, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"86bc8bd8b8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-a-8593155e6d.novalocal", ContainerID:"d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302", Pod:"calico-kube-controllers-86bc8bd8b8-x9w8m", 
Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.95.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"caliaa600beba15", MAC:"2e:67:bf:5c:d4:0d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 14:10:50.514779 containerd[1478]: 2025-03-21 14:10:50.511 [INFO][3900] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" Namespace="calico-system" Pod="calico-kube-controllers-86bc8bd8b8-x9w8m" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--kube--controllers--86bc8bd8b8--x9w8m-eth0" Mar 21 14:10:50.547371 systemd[1]: Started cri-containerd-e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa.scope - libcontainer container e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa. Mar 21 14:10:50.586814 containerd[1478]: time="2025-03-21T14:10:50.586771743Z" level=info msg="connecting to shim d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302" address="unix:///run/containerd/s/327cdaa72a6e73961965361b74c7e513702edb0fae9d6d0b4317b30a55957625" namespace=k8s.io protocol=ttrpc version=3 Mar 21 14:10:50.630533 systemd[1]: Started cri-containerd-d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302.scope - libcontainer container d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302. 
Mar 21 14:10:50.656065 containerd[1478]: time="2025-03-21T14:10:50.656034154Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6575f8d7d9-qrm8s,Uid:9b64e4a4-35f0-4d4f-a067-2ba9c47f8c1c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa\"" Mar 21 14:10:50.661036 containerd[1478]: time="2025-03-21T14:10:50.660348393Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 21 14:10:50.724852 systemd-networkd[1379]: cali427b1be9bbb: Link UP Mar 21 14:10:50.725712 systemd-networkd[1379]: cali427b1be9bbb: Gained carrier Mar 21 14:10:50.739035 containerd[1478]: time="2025-03-21T14:10:50.738996990Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86bc8bd8b8-x9w8m,Uid:e17ef89e-c55d-492b-9a49-b9627b7fae60,Namespace:calico-system,Attempt:0,} returns sandbox id \"d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302\"" Mar 21 14:10:50.753333 containerd[1478]: 2025-03-21 14:10:50.582 [INFO][3973] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--3--a--8593155e6d.novalocal-k8s-csi--node--driver--hjpgs-eth0 csi-node-driver- calico-system d174cc7d-f7aa-43cf-981d-eab1c74e7f73 582 0 2025-03-21 14:10:22 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:568c96974f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-9999-0-3-a-8593155e6d.novalocal csi-node-driver-hjpgs eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali427b1be9bbb [] []}} ContainerID="14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" Namespace="calico-system" Pod="csi-node-driver-hjpgs" 
WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-csi--node--driver--hjpgs-" Mar 21 14:10:50.753333 containerd[1478]: 2025-03-21 14:10:50.583 [INFO][3973] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" Namespace="calico-system" Pod="csi-node-driver-hjpgs" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-csi--node--driver--hjpgs-eth0" Mar 21 14:10:50.753333 containerd[1478]: 2025-03-21 14:10:50.647 [INFO][4035] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" HandleID="k8s-pod-network.14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-csi--node--driver--hjpgs-eth0" Mar 21 14:10:50.753902 containerd[1478]: 2025-03-21 14:10:50.675 [INFO][4035] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" HandleID="k8s-pod-network.14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-csi--node--driver--hjpgs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002907e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-0-3-a-8593155e6d.novalocal", "pod":"csi-node-driver-hjpgs", "timestamp":"2025-03-21 14:10:50.647440883 +0000 UTC"}, Hostname:"ci-9999-0-3-a-8593155e6d.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 14:10:50.753902 containerd[1478]: 2025-03-21 14:10:50.675 [INFO][4035] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Mar 21 14:10:50.753902 containerd[1478]: 2025-03-21 14:10:50.675 [INFO][4035] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 21 14:10:50.753902 containerd[1478]: 2025-03-21 14:10:50.675 [INFO][4035] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-3-a-8593155e6d.novalocal' Mar 21 14:10:50.753902 containerd[1478]: 2025-03-21 14:10:50.677 [INFO][4035] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.753902 containerd[1478]: 2025-03-21 14:10:50.682 [INFO][4035] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.753902 containerd[1478]: 2025-03-21 14:10:50.688 [INFO][4035] ipam/ipam.go 489: Trying affinity for 192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.753902 containerd[1478]: 2025-03-21 14:10:50.690 [INFO][4035] ipam/ipam.go 155: Attempting to load block cidr=192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.753902 containerd[1478]: 2025-03-21 14:10:50.693 [INFO][4035] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.754583 containerd[1478]: 2025-03-21 14:10:50.693 [INFO][4035] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.95.128/26 handle="k8s-pod-network.14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.754583 containerd[1478]: 2025-03-21 14:10:50.694 [INFO][4035] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13 Mar 21 14:10:50.754583 containerd[1478]: 2025-03-21 14:10:50.703 [INFO][4035] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.95.128/26 
handle="k8s-pod-network.14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.754583 containerd[1478]: 2025-03-21 14:10:50.716 [INFO][4035] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.95.131/26] block=192.168.95.128/26 handle="k8s-pod-network.14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.754583 containerd[1478]: 2025-03-21 14:10:50.716 [INFO][4035] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.95.131/26] handle="k8s-pod-network.14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:50.754583 containerd[1478]: 2025-03-21 14:10:50.716 [INFO][4035] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 21 14:10:50.754583 containerd[1478]: 2025-03-21 14:10:50.716 [INFO][4035] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.95.131/26] IPv6=[] ContainerID="14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" HandleID="k8s-pod-network.14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-csi--node--driver--hjpgs-eth0" Mar 21 14:10:50.755261 containerd[1478]: 2025-03-21 14:10:50.719 [INFO][3973] cni-plugin/k8s.go 386: Populated endpoint ContainerID="14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" Namespace="calico-system" Pod="csi-node-driver-hjpgs" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-csi--node--driver--hjpgs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--a--8593155e6d.novalocal-k8s-csi--node--driver--hjpgs-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d174cc7d-f7aa-43cf-981d-eab1c74e7f73", ResourceVersion:"582", Generation:0, 
CreationTimestamp:time.Date(2025, time.March, 21, 14, 10, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"568c96974f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-a-8593155e6d.novalocal", ContainerID:"", Pod:"csi-node-driver-hjpgs", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.95.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali427b1be9bbb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 14:10:50.755394 containerd[1478]: 2025-03-21 14:10:50.720 [INFO][3973] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.95.131/32] ContainerID="14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" Namespace="calico-system" Pod="csi-node-driver-hjpgs" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-csi--node--driver--hjpgs-eth0" Mar 21 14:10:50.755394 containerd[1478]: 2025-03-21 14:10:50.720 [INFO][3973] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali427b1be9bbb ContainerID="14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" Namespace="calico-system" Pod="csi-node-driver-hjpgs" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-csi--node--driver--hjpgs-eth0" Mar 21 14:10:50.755394 containerd[1478]: 2025-03-21 14:10:50.728 [INFO][3973] cni-plugin/dataplane_linux.go 508: 
Disabling IPv4 forwarding ContainerID="14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" Namespace="calico-system" Pod="csi-node-driver-hjpgs" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-csi--node--driver--hjpgs-eth0" Mar 21 14:10:50.755671 containerd[1478]: 2025-03-21 14:10:50.729 [INFO][3973] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" Namespace="calico-system" Pod="csi-node-driver-hjpgs" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-csi--node--driver--hjpgs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--a--8593155e6d.novalocal-k8s-csi--node--driver--hjpgs-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d174cc7d-f7aa-43cf-981d-eab1c74e7f73", ResourceVersion:"582", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 14, 10, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"568c96974f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-a-8593155e6d.novalocal", ContainerID:"14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13", Pod:"csi-node-driver-hjpgs", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.95.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali427b1be9bbb", MAC:"ce:8d:f8:b5:57:36", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 14:10:50.755882 containerd[1478]: 2025-03-21 14:10:50.749 [INFO][3973] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" Namespace="calico-system" Pod="csi-node-driver-hjpgs" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-csi--node--driver--hjpgs-eth0" Mar 21 14:10:50.793749 containerd[1478]: time="2025-03-21T14:10:50.793697773Z" level=info msg="connecting to shim 14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13" address="unix:///run/containerd/s/20a656d27e3ab959dba4ee84fdf8fe07aa9f236578b67de912c4ebfaf89ab72b" namespace=k8s.io protocol=ttrpc version=3 Mar 21 14:10:50.820254 systemd[1]: Started cri-containerd-14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13.scope - libcontainer container 14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13. 
Mar 21 14:10:50.847428 containerd[1478]: time="2025-03-21T14:10:50.847379749Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hjpgs,Uid:d174cc7d-f7aa-43cf-981d-eab1c74e7f73,Namespace:calico-system,Attempt:0,} returns sandbox id \"14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13\"" Mar 21 14:10:50.993621 systemd-networkd[1379]: vxlan.calico: Gained IPv6LL Mar 21 14:10:51.256329 kubelet[2670]: I0321 14:10:51.255560 2670 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 14:10:51.386668 containerd[1478]: time="2025-03-21T14:10:51.386360826Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4ab7305f631276ca9e5b88c5ff2cf46bb6f6b1e67999517333e120229865f169\" id:\"676a39dd090798bf31604a0b6e706cc5df480a65f47da17f8b8c80e4ad76d80c\" pid:4143 exited_at:{seconds:1742566251 nanos:384616519}" Mar 21 14:10:51.473552 containerd[1478]: time="2025-03-21T14:10:51.473446418Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4ab7305f631276ca9e5b88c5ff2cf46bb6f6b1e67999517333e120229865f169\" id:\"8ad801e5a967fecb3eb5105333f876ac34e121fa7a31565969da732c9f587cc1\" pid:4169 exited_at:{seconds:1742566251 nanos:473026552}" Mar 21 14:10:51.479179 containerd[1478]: time="2025-03-21T14:10:51.479076031Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6575f8d7d9-l7xjk,Uid:07ea2064-12f5-4627-841e-9de9d478ac15,Namespace:calico-apiserver,Attempt:0,}" Mar 21 14:10:51.602785 systemd-networkd[1379]: cali6d569bf6ba4: Link UP Mar 21 14:10:51.604155 systemd-networkd[1379]: cali6d569bf6ba4: Gained carrier Mar 21 14:10:51.621058 containerd[1478]: 2025-03-21 14:10:51.520 [INFO][4181] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--l7xjk-eth0 calico-apiserver-6575f8d7d9- calico-apiserver 07ea2064-12f5-4627-841e-9de9d478ac15 693 0 2025-03-21 
14:10:22 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6575f8d7d9 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-9999-0-3-a-8593155e6d.novalocal calico-apiserver-6575f8d7d9-l7xjk eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali6d569bf6ba4 [] []}} ContainerID="587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" Namespace="calico-apiserver" Pod="calico-apiserver-6575f8d7d9-l7xjk" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--l7xjk-" Mar 21 14:10:51.621058 containerd[1478]: 2025-03-21 14:10:51.520 [INFO][4181] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" Namespace="calico-apiserver" Pod="calico-apiserver-6575f8d7d9-l7xjk" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--l7xjk-eth0" Mar 21 14:10:51.621058 containerd[1478]: 2025-03-21 14:10:51.552 [INFO][4194] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" HandleID="k8s-pod-network.587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--l7xjk-eth0" Mar 21 14:10:51.621556 containerd[1478]: 2025-03-21 14:10:51.567 [INFO][4194] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" HandleID="k8s-pod-network.587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--l7xjk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031d390), 
Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-9999-0-3-a-8593155e6d.novalocal", "pod":"calico-apiserver-6575f8d7d9-l7xjk", "timestamp":"2025-03-21 14:10:51.552507492 +0000 UTC"}, Hostname:"ci-9999-0-3-a-8593155e6d.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 14:10:51.621556 containerd[1478]: 2025-03-21 14:10:51.567 [INFO][4194] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 21 14:10:51.621556 containerd[1478]: 2025-03-21 14:10:51.567 [INFO][4194] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 21 14:10:51.621556 containerd[1478]: 2025-03-21 14:10:51.567 [INFO][4194] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-3-a-8593155e6d.novalocal' Mar 21 14:10:51.621556 containerd[1478]: 2025-03-21 14:10:51.570 [INFO][4194] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:51.621556 containerd[1478]: 2025-03-21 14:10:51.574 [INFO][4194] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:51.621556 containerd[1478]: 2025-03-21 14:10:51.579 [INFO][4194] ipam/ipam.go 489: Trying affinity for 192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:51.621556 containerd[1478]: 2025-03-21 14:10:51.581 [INFO][4194] ipam/ipam.go 155: Attempting to load block cidr=192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:51.621556 containerd[1478]: 2025-03-21 14:10:51.584 [INFO][4194] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:51.621810 containerd[1478]: 2025-03-21 14:10:51.584 [INFO][4194] 
ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.95.128/26 handle="k8s-pod-network.587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:51.621810 containerd[1478]: 2025-03-21 14:10:51.585 [INFO][4194] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9 Mar 21 14:10:51.621810 containerd[1478]: 2025-03-21 14:10:51.589 [INFO][4194] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.95.128/26 handle="k8s-pod-network.587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:51.621810 containerd[1478]: 2025-03-21 14:10:51.597 [INFO][4194] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.95.132/26] block=192.168.95.128/26 handle="k8s-pod-network.587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:51.621810 containerd[1478]: 2025-03-21 14:10:51.597 [INFO][4194] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.95.132/26] handle="k8s-pod-network.587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:51.621810 containerd[1478]: 2025-03-21 14:10:51.597 [INFO][4194] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 21 14:10:51.621810 containerd[1478]: 2025-03-21 14:10:51.598 [INFO][4194] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.95.132/26] IPv6=[] ContainerID="587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" HandleID="k8s-pod-network.587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--l7xjk-eth0" Mar 21 14:10:51.621984 containerd[1478]: 2025-03-21 14:10:51.600 [INFO][4181] cni-plugin/k8s.go 386: Populated endpoint ContainerID="587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" Namespace="calico-apiserver" Pod="calico-apiserver-6575f8d7d9-l7xjk" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--l7xjk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--l7xjk-eth0", GenerateName:"calico-apiserver-6575f8d7d9-", Namespace:"calico-apiserver", SelfLink:"", UID:"07ea2064-12f5-4627-841e-9de9d478ac15", ResourceVersion:"693", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 14, 10, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6575f8d7d9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-a-8593155e6d.novalocal", ContainerID:"", Pod:"calico-apiserver-6575f8d7d9-l7xjk", Endpoint:"eth0", 
ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6d569bf6ba4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 14:10:51.622058 containerd[1478]: 2025-03-21 14:10:51.600 [INFO][4181] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.95.132/32] ContainerID="587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" Namespace="calico-apiserver" Pod="calico-apiserver-6575f8d7d9-l7xjk" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--l7xjk-eth0" Mar 21 14:10:51.622058 containerd[1478]: 2025-03-21 14:10:51.600 [INFO][4181] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6d569bf6ba4 ContainerID="587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" Namespace="calico-apiserver" Pod="calico-apiserver-6575f8d7d9-l7xjk" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--l7xjk-eth0" Mar 21 14:10:51.622058 containerd[1478]: 2025-03-21 14:10:51.603 [INFO][4181] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" Namespace="calico-apiserver" Pod="calico-apiserver-6575f8d7d9-l7xjk" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--l7xjk-eth0" Mar 21 14:10:51.622173 containerd[1478]: 2025-03-21 14:10:51.604 [INFO][4181] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" Namespace="calico-apiserver" Pod="calico-apiserver-6575f8d7d9-l7xjk" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--l7xjk-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--l7xjk-eth0", GenerateName:"calico-apiserver-6575f8d7d9-", Namespace:"calico-apiserver", SelfLink:"", UID:"07ea2064-12f5-4627-841e-9de9d478ac15", ResourceVersion:"693", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 14, 10, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6575f8d7d9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-a-8593155e6d.novalocal", ContainerID:"587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9", Pod:"calico-apiserver-6575f8d7d9-l7xjk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6d569bf6ba4", MAC:"d6:ac:0d:60:21:20", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 14:10:51.622248 containerd[1478]: 2025-03-21 14:10:51.619 [INFO][4181] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" Namespace="calico-apiserver" Pod="calico-apiserver-6575f8d7d9-l7xjk" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-calico--apiserver--6575f8d7d9--l7xjk-eth0" Mar 21 
14:10:51.658649 containerd[1478]: time="2025-03-21T14:10:51.657702370Z" level=info msg="connecting to shim 587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9" address="unix:///run/containerd/s/30d3173048f3c15687e7642ac77b2bd11d8352dd008d498cb7ac9299de376e10" namespace=k8s.io protocol=ttrpc version=3 Mar 21 14:10:51.681255 systemd[1]: Started cri-containerd-587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9.scope - libcontainer container 587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9. Mar 21 14:10:51.739750 containerd[1478]: time="2025-03-21T14:10:51.739713820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6575f8d7d9-l7xjk,Uid:07ea2064-12f5-4627-841e-9de9d478ac15,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9\"" Mar 21 14:10:52.271401 systemd-networkd[1379]: calid1a95c753a1: Gained IPv6LL Mar 21 14:10:52.463305 systemd-networkd[1379]: caliaa600beba15: Gained IPv6LL Mar 21 14:10:52.490856 containerd[1478]: time="2025-03-21T14:10:52.490779864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-r67xt,Uid:435c7f1f-c4ad-4ef2-b6d4-f14a405e9acb,Namespace:kube-system,Attempt:0,}" Mar 21 14:10:52.654121 systemd-networkd[1379]: calie8e2884c6dc: Link UP Mar 21 14:10:52.655520 systemd-networkd[1379]: calie8e2884c6dc: Gained carrier Mar 21 14:10:52.655978 systemd-networkd[1379]: cali427b1be9bbb: Gained IPv6LL Mar 21 14:10:52.681356 containerd[1478]: 2025-03-21 14:10:52.555 [INFO][4257] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--r67xt-eth0 coredns-6f6b679f8f- kube-system 435c7f1f-c4ad-4ef2-b6d4-f14a405e9acb 684 0 2025-03-21 14:10:10 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-9999-0-3-a-8593155e6d.novalocal coredns-6f6b679f8f-r67xt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie8e2884c6dc [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" Namespace="kube-system" Pod="coredns-6f6b679f8f-r67xt" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--r67xt-" Mar 21 14:10:52.681356 containerd[1478]: 2025-03-21 14:10:52.555 [INFO][4257] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" Namespace="kube-system" Pod="coredns-6f6b679f8f-r67xt" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--r67xt-eth0" Mar 21 14:10:52.681356 containerd[1478]: 2025-03-21 14:10:52.589 [INFO][4270] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" HandleID="k8s-pod-network.4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--r67xt-eth0" Mar 21 14:10:52.682491 containerd[1478]: 2025-03-21 14:10:52.599 [INFO][4270] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" HandleID="k8s-pod-network.4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--r67xt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000290b40), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-9999-0-3-a-8593155e6d.novalocal", "pod":"coredns-6f6b679f8f-r67xt", "timestamp":"2025-03-21 14:10:52.589474809 +0000 UTC"}, Hostname:"ci-9999-0-3-a-8593155e6d.novalocal", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 14:10:52.682491 containerd[1478]: 2025-03-21 14:10:52.599 [INFO][4270] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 21 14:10:52.682491 containerd[1478]: 2025-03-21 14:10:52.599 [INFO][4270] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 21 14:10:52.682491 containerd[1478]: 2025-03-21 14:10:52.599 [INFO][4270] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-3-a-8593155e6d.novalocal' Mar 21 14:10:52.682491 containerd[1478]: 2025-03-21 14:10:52.602 [INFO][4270] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:52.682491 containerd[1478]: 2025-03-21 14:10:52.607 [INFO][4270] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:52.682491 containerd[1478]: 2025-03-21 14:10:52.612 [INFO][4270] ipam/ipam.go 489: Trying affinity for 192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:52.682491 containerd[1478]: 2025-03-21 14:10:52.614 [INFO][4270] ipam/ipam.go 155: Attempting to load block cidr=192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:52.682491 containerd[1478]: 2025-03-21 14:10:52.617 [INFO][4270] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:52.682777 containerd[1478]: 2025-03-21 14:10:52.617 [INFO][4270] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.95.128/26 handle="k8s-pod-network.4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:52.682777 containerd[1478]: 2025-03-21 
14:10:52.619 [INFO][4270] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9 Mar 21 14:10:52.682777 containerd[1478]: 2025-03-21 14:10:52.633 [INFO][4270] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.95.128/26 handle="k8s-pod-network.4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:52.682777 containerd[1478]: 2025-03-21 14:10:52.642 [INFO][4270] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.95.133/26] block=192.168.95.128/26 handle="k8s-pod-network.4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:52.682777 containerd[1478]: 2025-03-21 14:10:52.643 [INFO][4270] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.95.133/26] handle="k8s-pod-network.4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:52.682777 containerd[1478]: 2025-03-21 14:10:52.643 [INFO][4270] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 21 14:10:52.682777 containerd[1478]: 2025-03-21 14:10:52.643 [INFO][4270] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.95.133/26] IPv6=[] ContainerID="4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" HandleID="k8s-pod-network.4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--r67xt-eth0" Mar 21 14:10:52.682981 containerd[1478]: 2025-03-21 14:10:52.646 [INFO][4257] cni-plugin/k8s.go 386: Populated endpoint ContainerID="4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" Namespace="kube-system" Pod="coredns-6f6b679f8f-r67xt" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--r67xt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--r67xt-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"435c7f1f-c4ad-4ef2-b6d4-f14a405e9acb", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 14, 10, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-a-8593155e6d.novalocal", ContainerID:"", Pod:"coredns-6f6b679f8f-r67xt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", 
"ksa.kube-system.coredns"}, InterfaceName:"calie8e2884c6dc", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 14:10:52.682981 containerd[1478]: 2025-03-21 14:10:52.648 [INFO][4257] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.95.133/32] ContainerID="4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" Namespace="kube-system" Pod="coredns-6f6b679f8f-r67xt" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--r67xt-eth0" Mar 21 14:10:52.682981 containerd[1478]: 2025-03-21 14:10:52.649 [INFO][4257] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie8e2884c6dc ContainerID="4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" Namespace="kube-system" Pod="coredns-6f6b679f8f-r67xt" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--r67xt-eth0" Mar 21 14:10:52.682981 containerd[1478]: 2025-03-21 14:10:52.656 [INFO][4257] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" Namespace="kube-system" Pod="coredns-6f6b679f8f-r67xt" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--r67xt-eth0" Mar 21 14:10:52.682981 containerd[1478]: 2025-03-21 14:10:52.656 [INFO][4257] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" Namespace="kube-system" Pod="coredns-6f6b679f8f-r67xt" 
WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--r67xt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--r67xt-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"435c7f1f-c4ad-4ef2-b6d4-f14a405e9acb", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 14, 10, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-a-8593155e6d.novalocal", ContainerID:"4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9", Pod:"coredns-6f6b679f8f-r67xt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie8e2884c6dc", MAC:"9e:19:6a:42:49:07", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 14:10:52.682981 containerd[1478]: 
2025-03-21 14:10:52.676 [INFO][4257] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" Namespace="kube-system" Pod="coredns-6f6b679f8f-r67xt" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--r67xt-eth0" Mar 21 14:10:52.762023 containerd[1478]: time="2025-03-21T14:10:52.761983473Z" level=info msg="connecting to shim 4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9" address="unix:///run/containerd/s/8dd265d50a4707fd820e9343d458c2d1f56e9ee5d43956a02634b23f91bec0ba" namespace=k8s.io protocol=ttrpc version=3 Mar 21 14:10:52.808320 systemd[1]: Started cri-containerd-4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9.scope - libcontainer container 4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9. Mar 21 14:10:52.882544 containerd[1478]: time="2025-03-21T14:10:52.882424249Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-r67xt,Uid:435c7f1f-c4ad-4ef2-b6d4-f14a405e9acb,Namespace:kube-system,Attempt:0,} returns sandbox id \"4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9\"" Mar 21 14:10:52.888235 containerd[1478]: time="2025-03-21T14:10:52.887392656Z" level=info msg="CreateContainer within sandbox \"4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 21 14:10:52.906324 containerd[1478]: time="2025-03-21T14:10:52.906220319Z" level=info msg="Container d51fe8a943e718f30e148de3ff6686cd1fcf1a57f79347dc6caec6347b5be6cd: CDI devices from CRI Config.CDIDevices: []" Mar 21 14:10:52.937184 containerd[1478]: time="2025-03-21T14:10:52.935788577Z" level=info msg="CreateContainer within sandbox \"4ac5ddae9a646597b42e81d936c35f514ae21d50bc345d5e63f4adcf802828f9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d51fe8a943e718f30e148de3ff6686cd1fcf1a57f79347dc6caec6347b5be6cd\"" 
Mar 21 14:10:52.937309 containerd[1478]: time="2025-03-21T14:10:52.937259563Z" level=info msg="StartContainer for \"d51fe8a943e718f30e148de3ff6686cd1fcf1a57f79347dc6caec6347b5be6cd\"" Mar 21 14:10:52.938614 containerd[1478]: time="2025-03-21T14:10:52.938581321Z" level=info msg="connecting to shim d51fe8a943e718f30e148de3ff6686cd1fcf1a57f79347dc6caec6347b5be6cd" address="unix:///run/containerd/s/8dd265d50a4707fd820e9343d458c2d1f56e9ee5d43956a02634b23f91bec0ba" protocol=ttrpc version=3 Mar 21 14:10:52.970347 systemd[1]: Started cri-containerd-d51fe8a943e718f30e148de3ff6686cd1fcf1a57f79347dc6caec6347b5be6cd.scope - libcontainer container d51fe8a943e718f30e148de3ff6686cd1fcf1a57f79347dc6caec6347b5be6cd. Mar 21 14:10:53.045142 containerd[1478]: time="2025-03-21T14:10:53.044402850Z" level=info msg="StartContainer for \"d51fe8a943e718f30e148de3ff6686cd1fcf1a57f79347dc6caec6347b5be6cd\" returns successfully" Mar 21 14:10:53.231321 systemd-networkd[1379]: cali6d569bf6ba4: Gained IPv6LL Mar 21 14:10:53.768460 kubelet[2670]: I0321 14:10:53.768335 2670 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-r67xt" podStartSLOduration=43.768316828 podStartE2EDuration="43.768316828s" podCreationTimestamp="2025-03-21 14:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 14:10:53.766764399 +0000 UTC m=+48.380939506" watchObservedRunningTime="2025-03-21 14:10:53.768316828 +0000 UTC m=+48.382491905" Mar 21 14:10:53.935403 systemd-networkd[1379]: calie8e2884c6dc: Gained IPv6LL Mar 21 14:10:54.479597 containerd[1478]: time="2025-03-21T14:10:54.479329937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-c855z,Uid:c70ee33c-9a91-4431-b069-c6bd0541ec37,Namespace:kube-system,Attempt:0,}" Mar 21 14:10:54.731832 systemd-networkd[1379]: cali66fff6cefb5: Link UP Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 
14:10:54.545 [INFO][4374] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--c855z-eth0 coredns-6f6b679f8f- kube-system c70ee33c-9a91-4431-b069-c6bd0541ec37 692 0 2025-03-21 14:10:10 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-9999-0-3-a-8593155e6d.novalocal coredns-6f6b679f8f-c855z eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali66fff6cefb5 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" Namespace="kube-system" Pod="coredns-6f6b679f8f-c855z" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--c855z-" Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.545 [INFO][4374] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" Namespace="kube-system" Pod="coredns-6f6b679f8f-c855z" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--c855z-eth0" Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.580 [INFO][4388] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" HandleID="k8s-pod-network.f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--c855z-eth0" Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.596 [INFO][4388] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" HandleID="k8s-pod-network.f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" 
Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--c855z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031d740), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-9999-0-3-a-8593155e6d.novalocal", "pod":"coredns-6f6b679f8f-c855z", "timestamp":"2025-03-21 14:10:54.580897788 +0000 UTC"}, Hostname:"ci-9999-0-3-a-8593155e6d.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.596 [INFO][4388] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.596 [INFO][4388] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.596 [INFO][4388] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-3-a-8593155e6d.novalocal' Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.599 [INFO][4388] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.604 [INFO][4388] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.609 [INFO][4388] ipam/ipam.go 489: Trying affinity for 192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.611 [INFO][4388] ipam/ipam.go 155: Attempting to load block cidr=192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.614 [INFO][4388] ipam/ipam.go 232: Affinity is confirmed and block has been 
loaded cidr=192.168.95.128/26 host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.614 [INFO][4388] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.95.128/26 handle="k8s-pod-network.f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.620 [INFO][4388] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775 Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.632 [INFO][4388] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.95.128/26 handle="k8s-pod-network.f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.709 [INFO][4388] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.95.134/26] block=192.168.95.128/26 handle="k8s-pod-network.f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.723 [INFO][4388] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.95.134/26] handle="k8s-pod-network.f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" host="ci-9999-0-3-a-8593155e6d.novalocal" Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.723 [INFO][4388] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 21 14:10:54.820342 containerd[1478]: 2025-03-21 14:10:54.724 [INFO][4388] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.95.134/26] IPv6=[] ContainerID="f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" HandleID="k8s-pod-network.f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" Workload="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--c855z-eth0" Mar 21 14:10:54.732034 systemd-networkd[1379]: cali66fff6cefb5: Gained carrier Mar 21 14:10:54.822067 containerd[1478]: 2025-03-21 14:10:54.727 [INFO][4374] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" Namespace="kube-system" Pod="coredns-6f6b679f8f-c855z" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--c855z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--c855z-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"c70ee33c-9a91-4431-b069-c6bd0541ec37", ResourceVersion:"692", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 14, 10, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-a-8593155e6d.novalocal", ContainerID:"", Pod:"coredns-6f6b679f8f-c855z", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.134/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali66fff6cefb5", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 14:10:54.822067 containerd[1478]: 2025-03-21 14:10:54.727 [INFO][4374] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.95.134/32] ContainerID="f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" Namespace="kube-system" Pod="coredns-6f6b679f8f-c855z" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--c855z-eth0" Mar 21 14:10:54.822067 containerd[1478]: 2025-03-21 14:10:54.727 [INFO][4374] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali66fff6cefb5 ContainerID="f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" Namespace="kube-system" Pod="coredns-6f6b679f8f-c855z" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--c855z-eth0" Mar 21 14:10:54.822067 containerd[1478]: 2025-03-21 14:10:54.732 [INFO][4374] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" Namespace="kube-system" Pod="coredns-6f6b679f8f-c855z" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--c855z-eth0" Mar 21 14:10:54.822067 containerd[1478]: 2025-03-21 14:10:54.732 [INFO][4374] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" Namespace="kube-system" Pod="coredns-6f6b679f8f-c855z" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--c855z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--c855z-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"c70ee33c-9a91-4431-b069-c6bd0541ec37", ResourceVersion:"692", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 14, 10, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-a-8593155e6d.novalocal", ContainerID:"f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775", Pod:"coredns-6f6b679f8f-c855z", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali66fff6cefb5", MAC:"3e:ee:87:0d:54:84", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 14:10:54.822067 containerd[1478]: 2025-03-21 14:10:54.792 [INFO][4374] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" Namespace="kube-system" Pod="coredns-6f6b679f8f-c855z" WorkloadEndpoint="ci--9999--0--3--a--8593155e6d.novalocal-k8s-coredns--6f6b679f8f--c855z-eth0" Mar 21 14:10:55.176345 containerd[1478]: time="2025-03-21T14:10:55.176096252Z" level=info msg="connecting to shim f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775" address="unix:///run/containerd/s/c8f1a7dd6248d2952ebfeff9eb71a70490e27d96df585cdf13dcbc93e5b22524" namespace=k8s.io protocol=ttrpc version=3 Mar 21 14:10:55.214319 systemd[1]: Started cri-containerd-f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775.scope - libcontainer container f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775. 
Mar 21 14:10:55.303282 containerd[1478]: time="2025-03-21T14:10:55.302906654Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-c855z,Uid:c70ee33c-9a91-4431-b069-c6bd0541ec37,Namespace:kube-system,Attempt:0,} returns sandbox id \"f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775\"" Mar 21 14:10:55.307452 containerd[1478]: time="2025-03-21T14:10:55.307416319Z" level=info msg="CreateContainer within sandbox \"f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 21 14:10:55.327354 containerd[1478]: time="2025-03-21T14:10:55.327319849Z" level=info msg="Container 311aec8f56790e4258a1e570d8d4ee9d1f92fc7dd9fe98712ba41c1d2a4af5d2: CDI devices from CRI Config.CDIDevices: []" Mar 21 14:10:55.349875 containerd[1478]: time="2025-03-21T14:10:55.349841920Z" level=info msg="CreateContainer within sandbox \"f54d4e002e700425a84d3acabdf973357b80ae63fd2b1b284cb2c8e413780775\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"311aec8f56790e4258a1e570d8d4ee9d1f92fc7dd9fe98712ba41c1d2a4af5d2\"" Mar 21 14:10:55.354094 containerd[1478]: time="2025-03-21T14:10:55.354069389Z" level=info msg="StartContainer for \"311aec8f56790e4258a1e570d8d4ee9d1f92fc7dd9fe98712ba41c1d2a4af5d2\"" Mar 21 14:10:55.355618 containerd[1478]: time="2025-03-21T14:10:55.355583544Z" level=info msg="connecting to shim 311aec8f56790e4258a1e570d8d4ee9d1f92fc7dd9fe98712ba41c1d2a4af5d2" address="unix:///run/containerd/s/c8f1a7dd6248d2952ebfeff9eb71a70490e27d96df585cdf13dcbc93e5b22524" protocol=ttrpc version=3 Mar 21 14:10:55.405741 systemd[1]: Started cri-containerd-311aec8f56790e4258a1e570d8d4ee9d1f92fc7dd9fe98712ba41c1d2a4af5d2.scope - libcontainer container 311aec8f56790e4258a1e570d8d4ee9d1f92fc7dd9fe98712ba41c1d2a4af5d2. 
Mar 21 14:10:55.509238 containerd[1478]: time="2025-03-21T14:10:55.508970340Z" level=info msg="StartContainer for \"311aec8f56790e4258a1e570d8d4ee9d1f92fc7dd9fe98712ba41c1d2a4af5d2\" returns successfully" Mar 21 14:10:55.802460 kubelet[2670]: I0321 14:10:55.802150 2670 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-c855z" podStartSLOduration=45.802134256 podStartE2EDuration="45.802134256s" podCreationTimestamp="2025-03-21 14:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 14:10:55.769544623 +0000 UTC m=+50.383719710" watchObservedRunningTime="2025-03-21 14:10:55.802134256 +0000 UTC m=+50.416309343" Mar 21 14:10:55.919990 systemd-networkd[1379]: cali66fff6cefb5: Gained IPv6LL Mar 21 14:10:56.453549 containerd[1478]: time="2025-03-21T14:10:56.453490948Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:10:56.454601 containerd[1478]: time="2025-03-21T14:10:56.454546786Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=42993204" Mar 21 14:10:56.456027 containerd[1478]: time="2025-03-21T14:10:56.455978894Z" level=info msg="ImageCreate event name:\"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:10:56.458911 containerd[1478]: time="2025-03-21T14:10:56.458871298Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:10:56.459885 containerd[1478]: time="2025-03-21T14:10:56.459750566Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id 
\"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 5.799369672s" Mar 21 14:10:56.459885 containerd[1478]: time="2025-03-21T14:10:56.459784307Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 21 14:10:56.479068 containerd[1478]: time="2025-03-21T14:10:56.478938215Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 21 14:10:56.496336 containerd[1478]: time="2025-03-21T14:10:56.496294628Z" level=info msg="CreateContainer within sandbox \"e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 21 14:10:56.511133 containerd[1478]: time="2025-03-21T14:10:56.509612332Z" level=info msg="Container 12598413f69f50713264eb94f32d398d36172b2341fa12c37cf2bd339e5901af: CDI devices from CRI Config.CDIDevices: []" Mar 21 14:10:56.523440 containerd[1478]: time="2025-03-21T14:10:56.523367245Z" level=info msg="CreateContainer within sandbox \"e34897d2ab8bbec7d9721bdffc8d88ebec3353692f033078ba6a93dccb9cf1fa\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"12598413f69f50713264eb94f32d398d36172b2341fa12c37cf2bd339e5901af\"" Mar 21 14:10:56.526326 containerd[1478]: time="2025-03-21T14:10:56.524905385Z" level=info msg="StartContainer for \"12598413f69f50713264eb94f32d398d36172b2341fa12c37cf2bd339e5901af\"" Mar 21 14:10:56.526471 containerd[1478]: time="2025-03-21T14:10:56.526106390Z" level=info msg="connecting to shim 12598413f69f50713264eb94f32d398d36172b2341fa12c37cf2bd339e5901af" address="unix:///run/containerd/s/b57ebba6997417564badcc759112ed765db88ff8da511c56e6ec6d4aa1f20da0" protocol=ttrpc 
version=3 Mar 21 14:10:56.555266 systemd[1]: Started cri-containerd-12598413f69f50713264eb94f32d398d36172b2341fa12c37cf2bd339e5901af.scope - libcontainer container 12598413f69f50713264eb94f32d398d36172b2341fa12c37cf2bd339e5901af. Mar 21 14:10:56.636484 containerd[1478]: time="2025-03-21T14:10:56.636446853Z" level=info msg="StartContainer for \"12598413f69f50713264eb94f32d398d36172b2341fa12c37cf2bd339e5901af\" returns successfully" Mar 21 14:10:56.786387 kubelet[2670]: I0321 14:10:56.786255 2670 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6575f8d7d9-qrm8s" podStartSLOduration=28.966040982 podStartE2EDuration="34.786237199s" podCreationTimestamp="2025-03-21 14:10:22 +0000 UTC" firstStartedPulling="2025-03-21 14:10:50.658588707 +0000 UTC m=+45.272763784" lastFinishedPulling="2025-03-21 14:10:56.478784914 +0000 UTC m=+51.092960001" observedRunningTime="2025-03-21 14:10:56.783685787 +0000 UTC m=+51.397860865" watchObservedRunningTime="2025-03-21 14:10:56.786237199 +0000 UTC m=+51.400412276" Mar 21 14:10:57.766009 kubelet[2670]: I0321 14:10:57.765768 2670 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 14:11:00.578217 containerd[1478]: time="2025-03-21T14:11:00.577456657Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:11:00.579477 containerd[1478]: time="2025-03-21T14:11:00.579393616Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=34792912" Mar 21 14:11:00.581409 containerd[1478]: time="2025-03-21T14:11:00.580563098Z" level=info msg="ImageCreate event name:\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:11:00.583282 containerd[1478]: time="2025-03-21T14:11:00.583259368Z" level=info msg="ImageCreate 
event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 14:11:00.585281 containerd[1478]: time="2025-03-21T14:11:00.585253232Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"36285984\" in 4.106225844s" Mar 21 14:11:00.585394 containerd[1478]: time="2025-03-21T14:11:00.585375827Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\"" Mar 21 14:11:00.587084 containerd[1478]: time="2025-03-21T14:11:00.587062877Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 21 14:11:00.602054 containerd[1478]: time="2025-03-21T14:11:00.602016818Z" level=info msg="CreateContainer within sandbox \"d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 21 14:11:00.619102 containerd[1478]: time="2025-03-21T14:11:00.619056608Z" level=info msg="Container ff16f42e5cb1358c63aba8be1f1e68589059ee5ba2656505dd5a2dea7476ae93: CDI devices from CRI Config.CDIDevices: []" Mar 21 14:11:00.634542 containerd[1478]: time="2025-03-21T14:11:00.633589156Z" level=info msg="CreateContainer within sandbox \"d81cfda661b3ab862e0cfe0a7af748ba766e702356312424b483e6bffc035302\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"ff16f42e5cb1358c63aba8be1f1e68589059ee5ba2656505dd5a2dea7476ae93\"" Mar 21 14:11:00.636662 containerd[1478]: time="2025-03-21T14:11:00.636630779Z" level=info 
msg="StartContainer for \"ff16f42e5cb1358c63aba8be1f1e68589059ee5ba2656505dd5a2dea7476ae93\""
Mar 21 14:11:00.638519 containerd[1478]: time="2025-03-21T14:11:00.638481740Z" level=info msg="connecting to shim ff16f42e5cb1358c63aba8be1f1e68589059ee5ba2656505dd5a2dea7476ae93" address="unix:///run/containerd/s/327cdaa72a6e73961965361b74c7e513702edb0fae9d6d0b4317b30a55957625" protocol=ttrpc version=3
Mar 21 14:11:00.664264 systemd[1]: Started cri-containerd-ff16f42e5cb1358c63aba8be1f1e68589059ee5ba2656505dd5a2dea7476ae93.scope - libcontainer container ff16f42e5cb1358c63aba8be1f1e68589059ee5ba2656505dd5a2dea7476ae93.
Mar 21 14:11:00.729519 containerd[1478]: time="2025-03-21T14:11:00.729477468Z" level=info msg="StartContainer for \"ff16f42e5cb1358c63aba8be1f1e68589059ee5ba2656505dd5a2dea7476ae93\" returns successfully"
Mar 21 14:11:00.911440 containerd[1478]: time="2025-03-21T14:11:00.911387685Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff16f42e5cb1358c63aba8be1f1e68589059ee5ba2656505dd5a2dea7476ae93\" id:\"6ee0cbc312666d1420ba0fbe7a4fb685e4cc50ae3eb546a10af8ba5219c433ae\" pid:4601 exited_at:{seconds:1742566260 nanos:910970310}"
Mar 21 14:11:00.938141 kubelet[2670]: I0321 14:11:00.936589 2670 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-86bc8bd8b8-x9w8m" podStartSLOduration=29.091078517 podStartE2EDuration="38.936574484s" podCreationTimestamp="2025-03-21 14:10:22 +0000 UTC" firstStartedPulling="2025-03-21 14:10:50.740781823 +0000 UTC m=+45.354956910" lastFinishedPulling="2025-03-21 14:11:00.58627779 +0000 UTC m=+55.200452877" observedRunningTime="2025-03-21 14:11:00.795159053 +0000 UTC m=+55.409334161" watchObservedRunningTime="2025-03-21 14:11:00.936574484 +0000 UTC m=+55.550749561"
Mar 21 14:11:02.756029 containerd[1478]: time="2025-03-21T14:11:02.755968519Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 14:11:02.758097 containerd[1478]: time="2025-03-21T14:11:02.757973366Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7909887"
Mar 21 14:11:02.760310 containerd[1478]: time="2025-03-21T14:11:02.759484208Z" level=info msg="ImageCreate event name:\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 14:11:02.762131 containerd[1478]: time="2025-03-21T14:11:02.762093685Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 14:11:02.763682 containerd[1478]: time="2025-03-21T14:11:02.763659839Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"9402991\" in 2.176227153s"
Mar 21 14:11:02.763789 containerd[1478]: time="2025-03-21T14:11:02.763772635Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\""
Mar 21 14:11:02.764715 containerd[1478]: time="2025-03-21T14:11:02.764696240Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\""
Mar 21 14:11:02.766983 containerd[1478]: time="2025-03-21T14:11:02.766958561Z" level=info msg="CreateContainer within sandbox \"14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Mar 21 14:11:02.783898 containerd[1478]: time="2025-03-21T14:11:02.783861400Z" level=info msg="Container 32b670b18960a6f84e0d5400b0abefd8a1f04ef42b02395ec88db12174b729f2: CDI devices from CRI Config.CDIDevices: []"
Mar 21 14:11:02.799516 containerd[1478]: time="2025-03-21T14:11:02.799368869Z" level=info msg="CreateContainer within sandbox \"14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"32b670b18960a6f84e0d5400b0abefd8a1f04ef42b02395ec88db12174b729f2\""
Mar 21 14:11:02.800654 containerd[1478]: time="2025-03-21T14:11:02.800611228Z" level=info msg="StartContainer for \"32b670b18960a6f84e0d5400b0abefd8a1f04ef42b02395ec88db12174b729f2\""
Mar 21 14:11:02.802355 containerd[1478]: time="2025-03-21T14:11:02.802302260Z" level=info msg="connecting to shim 32b670b18960a6f84e0d5400b0abefd8a1f04ef42b02395ec88db12174b729f2" address="unix:///run/containerd/s/20a656d27e3ab959dba4ee84fdf8fe07aa9f236578b67de912c4ebfaf89ab72b" protocol=ttrpc version=3
Mar 21 14:11:02.832288 systemd[1]: Started cri-containerd-32b670b18960a6f84e0d5400b0abefd8a1f04ef42b02395ec88db12174b729f2.scope - libcontainer container 32b670b18960a6f84e0d5400b0abefd8a1f04ef42b02395ec88db12174b729f2.
Mar 21 14:11:02.878690 containerd[1478]: time="2025-03-21T14:11:02.878176687Z" level=info msg="StartContainer for \"32b670b18960a6f84e0d5400b0abefd8a1f04ef42b02395ec88db12174b729f2\" returns successfully"
Mar 21 14:11:03.235578 containerd[1478]: time="2025-03-21T14:11:03.235500255Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 14:11:03.237875 containerd[1478]: time="2025-03-21T14:11:03.237788956Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77"
Mar 21 14:11:03.240171 containerd[1478]: time="2025-03-21T14:11:03.239957908Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 475.131108ms"
Mar 21 14:11:03.240171 containerd[1478]: time="2025-03-21T14:11:03.240012238Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\""
Mar 21 14:11:03.242439 containerd[1478]: time="2025-03-21T14:11:03.242389111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\""
Mar 21 14:11:03.243605 containerd[1478]: time="2025-03-21T14:11:03.243544602Z" level=info msg="CreateContainer within sandbox \"587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 21 14:11:03.301270 containerd[1478]: time="2025-03-21T14:11:03.301232747Z" level=info msg="Container c62177abf4b10305006dbe47dfde4702da365b9ef29e12be7bd6cd88e37618a7: CDI devices from CRI Config.CDIDevices: []"
Mar 21 14:11:03.316732 containerd[1478]: time="2025-03-21T14:11:03.316695972Z" level=info msg="CreateContainer within sandbox \"587c26ead7cb2c7d68310c517e86620c842b7c38e713bd91fdcd2330056d61b9\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c62177abf4b10305006dbe47dfde4702da365b9ef29e12be7bd6cd88e37618a7\""
Mar 21 14:11:03.317575 containerd[1478]: time="2025-03-21T14:11:03.317552945Z" level=info msg="StartContainer for \"c62177abf4b10305006dbe47dfde4702da365b9ef29e12be7bd6cd88e37618a7\""
Mar 21 14:11:03.318893 containerd[1478]: time="2025-03-21T14:11:03.318869993Z" level=info msg="connecting to shim c62177abf4b10305006dbe47dfde4702da365b9ef29e12be7bd6cd88e37618a7" address="unix:///run/containerd/s/30d3173048f3c15687e7642ac77b2bd11d8352dd008d498cb7ac9299de376e10" protocol=ttrpc version=3
Mar 21 14:11:03.343266 systemd[1]: Started cri-containerd-c62177abf4b10305006dbe47dfde4702da365b9ef29e12be7bd6cd88e37618a7.scope - libcontainer container c62177abf4b10305006dbe47dfde4702da365b9ef29e12be7bd6cd88e37618a7.
Mar 21 14:11:03.403435 containerd[1478]: time="2025-03-21T14:11:03.402926151Z" level=info msg="StartContainer for \"c62177abf4b10305006dbe47dfde4702da365b9ef29e12be7bd6cd88e37618a7\" returns successfully"
Mar 21 14:11:03.825698 kubelet[2670]: I0321 14:11:03.825618 2670 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6575f8d7d9-l7xjk" podStartSLOduration=30.325259573 podStartE2EDuration="41.825583394s" podCreationTimestamp="2025-03-21 14:10:22 +0000 UTC" firstStartedPulling="2025-03-21 14:10:51.740910069 +0000 UTC m=+46.355085156" lastFinishedPulling="2025-03-21 14:11:03.24123389 +0000 UTC m=+57.855408977" observedRunningTime="2025-03-21 14:11:03.823251414 +0000 UTC m=+58.437426541" watchObservedRunningTime="2025-03-21 14:11:03.825583394 +0000 UTC m=+58.439758511"
Mar 21 14:11:04.806608 kubelet[2670]: I0321 14:11:04.806558 2670 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 21 14:11:05.609144 containerd[1478]: time="2025-03-21T14:11:05.609072451Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 14:11:05.610932 containerd[1478]: time="2025-03-21T14:11:05.610884432Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13986843"
Mar 21 14:11:05.612550 containerd[1478]: time="2025-03-21T14:11:05.612516381Z" level=info msg="ImageCreate event name:\"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 14:11:05.615725 containerd[1478]: time="2025-03-21T14:11:05.615691756Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 14:11:05.617053 containerd[1478]: time="2025-03-21T14:11:05.617019396Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"15479899\" in 2.374593338s"
Mar 21 14:11:05.617104 containerd[1478]: time="2025-03-21T14:11:05.617052067Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\""
Mar 21 14:11:05.619572 containerd[1478]: time="2025-03-21T14:11:05.619536933Z" level=info msg="CreateContainer within sandbox \"14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Mar 21 14:11:05.639284 containerd[1478]: time="2025-03-21T14:11:05.639225115Z" level=info msg="Container b9d64da730b1673544d1383872b0db26465b24528071fe19de2904aae5a3b81a: CDI devices from CRI Config.CDIDevices: []"
Mar 21 14:11:05.658233 containerd[1478]: time="2025-03-21T14:11:05.658171243Z" level=info msg="CreateContainer within sandbox \"14dcb7ffa7474b4c28bd3587bb784a8405edb2914b99f30742c9abcaa0d26e13\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"b9d64da730b1673544d1383872b0db26465b24528071fe19de2904aae5a3b81a\""
Mar 21 14:11:05.660199 containerd[1478]: time="2025-03-21T14:11:05.660048783Z" level=info msg="StartContainer for \"b9d64da730b1673544d1383872b0db26465b24528071fe19de2904aae5a3b81a\""
Mar 21 14:11:05.661797 containerd[1478]: time="2025-03-21T14:11:05.661650798Z" level=info msg="connecting to shim b9d64da730b1673544d1383872b0db26465b24528071fe19de2904aae5a3b81a" address="unix:///run/containerd/s/20a656d27e3ab959dba4ee84fdf8fe07aa9f236578b67de912c4ebfaf89ab72b" protocol=ttrpc version=3
Mar 21 14:11:05.693301 systemd[1]: Started cri-containerd-b9d64da730b1673544d1383872b0db26465b24528071fe19de2904aae5a3b81a.scope - libcontainer container b9d64da730b1673544d1383872b0db26465b24528071fe19de2904aae5a3b81a.
Mar 21 14:11:05.751921 containerd[1478]: time="2025-03-21T14:11:05.751815398Z" level=info msg="StartContainer for \"b9d64da730b1673544d1383872b0db26465b24528071fe19de2904aae5a3b81a\" returns successfully"
Mar 21 14:11:05.829136 kubelet[2670]: I0321 14:11:05.828169 2670 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-hjpgs" podStartSLOduration=29.059097986 podStartE2EDuration="43.828153267s" podCreationTimestamp="2025-03-21 14:10:22 +0000 UTC" firstStartedPulling="2025-03-21 14:10:50.848627495 +0000 UTC m=+45.462802582" lastFinishedPulling="2025-03-21 14:11:05.617682786 +0000 UTC m=+60.231857863" observedRunningTime="2025-03-21 14:11:05.826284383 +0000 UTC m=+60.440459460" watchObservedRunningTime="2025-03-21 14:11:05.828153267 +0000 UTC m=+60.442328344"
Mar 21 14:11:06.597095 kubelet[2670]: I0321 14:11:06.596486 2670 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Mar 21 14:11:06.597095 kubelet[2670]: I0321 14:11:06.596546 2670 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Mar 21 14:11:08.398336 containerd[1478]: time="2025-03-21T14:11:08.398240854Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff16f42e5cb1358c63aba8be1f1e68589059ee5ba2656505dd5a2dea7476ae93\" id:\"c2b6fe74c7ebb3181fa7529e06f2d7357787bd3e8ffaf37ada06eefe9ce665c9\" pid:4731 exited_at:{seconds:1742566268 nanos:396990132}"
Mar 21 14:11:08.896903 kubelet[2670]: I0321 14:11:08.896713 2670 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 21 14:11:12.548505 kubelet[2670]: I0321 14:11:12.548435 2670 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 21 14:11:21.373076 containerd[1478]: time="2025-03-21T14:11:21.373027459Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4ab7305f631276ca9e5b88c5ff2cf46bb6f6b1e67999517333e120229865f169\" id:\"fdb9c7f68dbe61ec98c645d50b5cf3dcb06cbadcff9c1b001e5dfc4713b82b08\" pid:4771 exited_at:{seconds:1742566281 nanos:372680177}"
Mar 21 14:11:38.398031 containerd[1478]: time="2025-03-21T14:11:38.397946886Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff16f42e5cb1358c63aba8be1f1e68589059ee5ba2656505dd5a2dea7476ae93\" id:\"76d45436d19bd4a9453855a2c0850aa0651a0082dfe61012fb92166ffba5dfa8\" pid:4807 exited_at:{seconds:1742566298 nanos:397516316}"
Mar 21 14:11:51.366215 containerd[1478]: time="2025-03-21T14:11:51.366088412Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4ab7305f631276ca9e5b88c5ff2cf46bb6f6b1e67999517333e120229865f169\" id:\"03647dca29ec2698cfd78f2cf52a97d39e5a60338dc79e51970e037187104b7d\" pid:4833 exited_at:{seconds:1742566311 nanos:365482464}"
Mar 21 14:12:05.490292 containerd[1478]: time="2025-03-21T14:12:05.490176373Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff16f42e5cb1358c63aba8be1f1e68589059ee5ba2656505dd5a2dea7476ae93\" id:\"a729404fbd2a645f72ef88a2bbf10fdab1f3274a2945bf899612fe1453d12710\" pid:4859 exited_at:{seconds:1742566325 nanos:489665990}"
Mar 21 14:12:08.399886 containerd[1478]: time="2025-03-21T14:12:08.399847303Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff16f42e5cb1358c63aba8be1f1e68589059ee5ba2656505dd5a2dea7476ae93\" id:\"6da93effb17efb70cc08416ef65c17fe1d7e9ce364141fa29ac35c32e3caa465\" pid:4882 exited_at:{seconds:1742566328 nanos:399423873}"
Mar 21 14:12:16.961163 systemd[1]: Started sshd@7-172.24.4.61:22-172.24.4.1:38876.service - OpenSSH per-connection server daemon (172.24.4.1:38876).
Mar 21 14:12:18.274107 sshd[4905]: Accepted publickey for core from 172.24.4.1 port 38876 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:12:18.277712 sshd-session[4905]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:12:18.291907 systemd-logind[1456]: New session 10 of user core.
Mar 21 14:12:18.299426 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 21 14:12:19.049895 sshd[4907]: Connection closed by 172.24.4.1 port 38876
Mar 21 14:12:19.050977 sshd-session[4905]: pam_unix(sshd:session): session closed for user core
Mar 21 14:12:19.059957 systemd[1]: sshd@7-172.24.4.61:22-172.24.4.1:38876.service: Deactivated successfully.
Mar 21 14:12:19.064433 systemd[1]: session-10.scope: Deactivated successfully.
Mar 21 14:12:19.067354 systemd-logind[1456]: Session 10 logged out. Waiting for processes to exit.
Mar 21 14:12:19.069893 systemd-logind[1456]: Removed session 10.
Mar 21 14:12:21.366092 containerd[1478]: time="2025-03-21T14:12:21.366044632Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4ab7305f631276ca9e5b88c5ff2cf46bb6f6b1e67999517333e120229865f169\" id:\"8dad17a681d1441c47d282017ad32d5e8a842accf3520d9d1802414b67558f5d\" pid:4940 exited_at:{seconds:1742566341 nanos:365774407}"
Mar 21 14:12:24.081094 systemd[1]: Started sshd@8-172.24.4.61:22-172.24.4.1:60684.service - OpenSSH per-connection server daemon (172.24.4.1:60684).
Mar 21 14:12:25.259324 sshd[4963]: Accepted publickey for core from 172.24.4.1 port 60684 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:12:25.262043 sshd-session[4963]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:12:25.272641 systemd-logind[1456]: New session 11 of user core.
Mar 21 14:12:25.284493 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 21 14:12:25.989319 sshd[4965]: Connection closed by 172.24.4.1 port 60684
Mar 21 14:12:25.990386 sshd-session[4963]: pam_unix(sshd:session): session closed for user core
Mar 21 14:12:25.997948 systemd-logind[1456]: Session 11 logged out. Waiting for processes to exit.
Mar 21 14:12:25.999444 systemd[1]: sshd@8-172.24.4.61:22-172.24.4.1:60684.service: Deactivated successfully.
Mar 21 14:12:26.004003 systemd[1]: session-11.scope: Deactivated successfully.
Mar 21 14:12:26.007640 systemd-logind[1456]: Removed session 11.
Mar 21 14:12:31.015888 systemd[1]: Started sshd@9-172.24.4.61:22-172.24.4.1:60698.service - OpenSSH per-connection server daemon (172.24.4.1:60698).
Mar 21 14:12:32.485367 sshd[4978]: Accepted publickey for core from 172.24.4.1 port 60698 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:12:32.487940 sshd-session[4978]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:12:32.538066 systemd-logind[1456]: New session 12 of user core.
Mar 21 14:12:32.546448 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 21 14:12:33.279075 sshd[4980]: Connection closed by 172.24.4.1 port 60698
Mar 21 14:12:33.280295 sshd-session[4978]: pam_unix(sshd:session): session closed for user core
Mar 21 14:12:33.294668 systemd[1]: sshd@9-172.24.4.61:22-172.24.4.1:60698.service: Deactivated successfully.
Mar 21 14:12:33.300524 systemd[1]: session-12.scope: Deactivated successfully.
Mar 21 14:12:33.302622 systemd-logind[1456]: Session 12 logged out. Waiting for processes to exit.
Mar 21 14:12:33.308023 systemd[1]: Started sshd@10-172.24.4.61:22-172.24.4.1:60712.service - OpenSSH per-connection server daemon (172.24.4.1:60712).
Mar 21 14:12:33.311816 systemd-logind[1456]: Removed session 12.
Mar 21 14:12:34.560566 sshd[4991]: Accepted publickey for core from 172.24.4.1 port 60712 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:12:34.563273 sshd-session[4991]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:12:34.574798 systemd-logind[1456]: New session 13 of user core.
Mar 21 14:12:34.583430 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 21 14:12:35.417167 sshd[4994]: Connection closed by 172.24.4.1 port 60712
Mar 21 14:12:35.420081 sshd-session[4991]: pam_unix(sshd:session): session closed for user core
Mar 21 14:12:35.431195 systemd[1]: sshd@10-172.24.4.61:22-172.24.4.1:60712.service: Deactivated successfully.
Mar 21 14:12:35.434988 systemd[1]: session-13.scope: Deactivated successfully.
Mar 21 14:12:35.439712 systemd-logind[1456]: Session 13 logged out. Waiting for processes to exit.
Mar 21 14:12:35.443780 systemd[1]: Started sshd@11-172.24.4.61:22-172.24.4.1:52700.service - OpenSSH per-connection server daemon (172.24.4.1:52700).
Mar 21 14:12:35.448061 systemd-logind[1456]: Removed session 13.
Mar 21 14:12:36.867003 sshd[5004]: Accepted publickey for core from 172.24.4.1 port 52700 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:12:36.869981 sshd-session[5004]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:12:36.883266 systemd-logind[1456]: New session 14 of user core.
Mar 21 14:12:36.888474 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 21 14:12:37.659173 sshd[5007]: Connection closed by 172.24.4.1 port 52700
Mar 21 14:12:37.657994 sshd-session[5004]: pam_unix(sshd:session): session closed for user core
Mar 21 14:12:37.664371 systemd[1]: sshd@11-172.24.4.61:22-172.24.4.1:52700.service: Deactivated successfully.
Mar 21 14:12:37.670423 systemd[1]: session-14.scope: Deactivated successfully.
Mar 21 14:12:37.674899 systemd-logind[1456]: Session 14 logged out. Waiting for processes to exit.
Mar 21 14:12:37.677700 systemd-logind[1456]: Removed session 14.
Mar 21 14:12:38.395092 containerd[1478]: time="2025-03-21T14:12:38.394994689Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff16f42e5cb1358c63aba8be1f1e68589059ee5ba2656505dd5a2dea7476ae93\" id:\"5476b5fd9aa3abd41fac1f8a271655e19d59e9357eee52bd0e5fe5813d6c1255\" pid:5031 exited_at:{seconds:1742566358 nanos:394184937}"
Mar 21 14:12:42.678165 systemd[1]: Started sshd@12-172.24.4.61:22-172.24.4.1:52702.service - OpenSSH per-connection server daemon (172.24.4.1:52702).
Mar 21 14:12:44.245193 sshd[5047]: Accepted publickey for core from 172.24.4.1 port 52702 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:12:44.248882 sshd-session[5047]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:12:44.259887 systemd-logind[1456]: New session 15 of user core.
Mar 21 14:12:44.269219 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 21 14:12:44.968924 sshd[5049]: Connection closed by 172.24.4.1 port 52702
Mar 21 14:12:44.968819 sshd-session[5047]: pam_unix(sshd:session): session closed for user core
Mar 21 14:12:44.973399 systemd[1]: sshd@12-172.24.4.61:22-172.24.4.1:52702.service: Deactivated successfully.
Mar 21 14:12:44.977385 systemd[1]: session-15.scope: Deactivated successfully.
Mar 21 14:12:44.981106 systemd-logind[1456]: Session 15 logged out. Waiting for processes to exit.
Mar 21 14:12:44.982424 systemd-logind[1456]: Removed session 15.
Mar 21 14:12:49.992481 systemd[1]: Started sshd@13-172.24.4.61:22-172.24.4.1:37342.service - OpenSSH per-connection server daemon (172.24.4.1:37342).
Mar 21 14:12:51.117420 sshd[5061]: Accepted publickey for core from 172.24.4.1 port 37342 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:12:51.121487 sshd-session[5061]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:12:51.134047 systemd-logind[1456]: New session 16 of user core.
Mar 21 14:12:51.142458 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 21 14:12:51.364867 containerd[1478]: time="2025-03-21T14:12:51.364797092Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4ab7305f631276ca9e5b88c5ff2cf46bb6f6b1e67999517333e120229865f169\" id:\"e65ac957199c1fafb9664e3be18afd839fe4c5d07015d4c0e2ca1f6ce3aff537\" pid:5078 exited_at:{seconds:1742566371 nanos:364271300}"
Mar 21 14:12:51.935008 sshd[5064]: Connection closed by 172.24.4.1 port 37342
Mar 21 14:12:51.936015 sshd-session[5061]: pam_unix(sshd:session): session closed for user core
Mar 21 14:12:51.944224 systemd[1]: sshd@13-172.24.4.61:22-172.24.4.1:37342.service: Deactivated successfully.
Mar 21 14:12:51.946226 systemd[1]: session-16.scope: Deactivated successfully.
Mar 21 14:12:51.947871 systemd-logind[1456]: Session 16 logged out. Waiting for processes to exit.
Mar 21 14:12:51.949861 systemd-logind[1456]: Removed session 16.
Mar 21 14:12:56.959511 systemd[1]: Started sshd@14-172.24.4.61:22-172.24.4.1:59730.service - OpenSSH per-connection server daemon (172.24.4.1:59730).
Mar 21 14:12:57.976163 sshd[5101]: Accepted publickey for core from 172.24.4.1 port 59730 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:12:57.978725 sshd-session[5101]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:12:57.992051 systemd-logind[1456]: New session 17 of user core.
Mar 21 14:12:57.996446 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 21 14:12:58.605423 sshd[5103]: Connection closed by 172.24.4.1 port 59730
Mar 21 14:12:58.608316 sshd-session[5101]: pam_unix(sshd:session): session closed for user core
Mar 21 14:12:58.621798 systemd[1]: sshd@14-172.24.4.61:22-172.24.4.1:59730.service: Deactivated successfully.
Mar 21 14:12:58.625905 systemd[1]: session-17.scope: Deactivated successfully.
Mar 21 14:12:58.629473 systemd-logind[1456]: Session 17 logged out. Waiting for processes to exit.
Mar 21 14:12:58.633724 systemd[1]: Started sshd@15-172.24.4.61:22-172.24.4.1:59742.service - OpenSSH per-connection server daemon (172.24.4.1:59742).
Mar 21 14:12:58.637404 systemd-logind[1456]: Removed session 17.
Mar 21 14:13:00.001724 sshd[5114]: Accepted publickey for core from 172.24.4.1 port 59742 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:13:00.004599 sshd-session[5114]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:13:00.017231 systemd-logind[1456]: New session 18 of user core.
Mar 21 14:13:00.026646 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 21 14:13:01.300793 sshd[5117]: Connection closed by 172.24.4.1 port 59742
Mar 21 14:13:01.301285 sshd-session[5114]: pam_unix(sshd:session): session closed for user core
Mar 21 14:13:01.320772 systemd[1]: sshd@15-172.24.4.61:22-172.24.4.1:59742.service: Deactivated successfully.
Mar 21 14:13:01.326803 systemd[1]: session-18.scope: Deactivated successfully.
Mar 21 14:13:01.330069 systemd-logind[1456]: Session 18 logged out. Waiting for processes to exit.
Mar 21 14:13:01.338042 systemd[1]: Started sshd@16-172.24.4.61:22-172.24.4.1:59748.service - OpenSSH per-connection server daemon (172.24.4.1:59748).
Mar 21 14:13:01.343606 systemd-logind[1456]: Removed session 18.
Mar 21 14:13:02.475100 sshd[5126]: Accepted publickey for core from 172.24.4.1 port 59748 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:13:02.478032 sshd-session[5126]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:13:02.491831 systemd-logind[1456]: New session 19 of user core.
Mar 21 14:13:02.494819 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 21 14:13:05.425529 containerd[1478]: time="2025-03-21T14:13:05.425490411Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff16f42e5cb1358c63aba8be1f1e68589059ee5ba2656505dd5a2dea7476ae93\" id:\"d902cd0cdee2e12e3d2a4ea05c0a77789f5ad3f5b8e5f7460c28d0d99b180f26\" pid:5154 exited_at:{seconds:1742566385 nanos:425257651}"
Mar 21 14:13:05.465468 sshd[5129]: Connection closed by 172.24.4.1 port 59748
Mar 21 14:13:05.466573 sshd-session[5126]: pam_unix(sshd:session): session closed for user core
Mar 21 14:13:05.485382 systemd[1]: sshd@16-172.24.4.61:22-172.24.4.1:59748.service: Deactivated successfully.
Mar 21 14:13:05.489652 systemd[1]: session-19.scope: Deactivated successfully.
Mar 21 14:13:05.490337 systemd[1]: session-19.scope: Consumed 882ms CPU time, 71M memory peak.
Mar 21 14:13:05.491550 systemd-logind[1456]: Session 19 logged out. Waiting for processes to exit.
Mar 21 14:13:05.494824 systemd[1]: Started sshd@17-172.24.4.61:22-172.24.4.1:50814.service - OpenSSH per-connection server daemon (172.24.4.1:50814).
Mar 21 14:13:05.497422 systemd-logind[1456]: Removed session 19.
Mar 21 14:13:07.016752 sshd[5168]: Accepted publickey for core from 172.24.4.1 port 50814 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:13:07.020305 sshd-session[5168]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:13:07.033989 systemd-logind[1456]: New session 20 of user core.
Mar 21 14:13:07.042527 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 21 14:13:08.298526 sshd[5171]: Connection closed by 172.24.4.1 port 50814
Mar 21 14:13:08.299694 sshd-session[5168]: pam_unix(sshd:session): session closed for user core
Mar 21 14:13:08.319951 systemd[1]: sshd@17-172.24.4.61:22-172.24.4.1:50814.service: Deactivated successfully.
Mar 21 14:13:08.324980 systemd[1]: session-20.scope: Deactivated successfully.
Mar 21 14:13:08.328879 systemd-logind[1456]: Session 20 logged out. Waiting for processes to exit.
Mar 21 14:13:08.334728 systemd[1]: Started sshd@18-172.24.4.61:22-172.24.4.1:50828.service - OpenSSH per-connection server daemon (172.24.4.1:50828).
Mar 21 14:13:08.340857 systemd-logind[1456]: Removed session 20.
Mar 21 14:13:08.398597 containerd[1478]: time="2025-03-21T14:13:08.398557720Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff16f42e5cb1358c63aba8be1f1e68589059ee5ba2656505dd5a2dea7476ae93\" id:\"5bfcc0ea54e8e4f5f6dbbd8fd461354651143c3eb3320f3f3ce4b594f343774f\" pid:5192 exited_at:{seconds:1742566388 nanos:398300856}"
Mar 21 14:13:09.474472 sshd[5184]: Accepted publickey for core from 172.24.4.1 port 50828 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:13:09.477865 sshd-session[5184]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:13:09.496297 systemd-logind[1456]: New session 21 of user core.
Mar 21 14:13:09.507477 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 21 14:13:10.136391 sshd[5202]: Connection closed by 172.24.4.1 port 50828
Mar 21 14:13:10.137644 sshd-session[5184]: pam_unix(sshd:session): session closed for user core
Mar 21 14:13:10.146945 systemd[1]: sshd@18-172.24.4.61:22-172.24.4.1:50828.service: Deactivated successfully.
Mar 21 14:13:10.156624 systemd[1]: session-21.scope: Deactivated successfully.
Mar 21 14:13:10.160099 systemd-logind[1456]: Session 21 logged out. Waiting for processes to exit.
Mar 21 14:13:10.163677 systemd-logind[1456]: Removed session 21.
Mar 21 14:13:15.158757 systemd[1]: Started sshd@19-172.24.4.61:22-172.24.4.1:39520.service - OpenSSH per-connection server daemon (172.24.4.1:39520).
Mar 21 14:13:16.472842 sshd[5219]: Accepted publickey for core from 172.24.4.1 port 39520 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:13:16.474457 sshd-session[5219]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:13:16.481365 systemd-logind[1456]: New session 22 of user core.
Mar 21 14:13:16.488005 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 21 14:13:17.083309 sshd[5221]: Connection closed by 172.24.4.1 port 39520
Mar 21 14:13:17.085319 sshd-session[5219]: pam_unix(sshd:session): session closed for user core
Mar 21 14:13:17.092454 systemd[1]: sshd@19-172.24.4.61:22-172.24.4.1:39520.service: Deactivated successfully.
Mar 21 14:13:17.097692 systemd[1]: session-22.scope: Deactivated successfully.
Mar 21 14:13:17.099943 systemd-logind[1456]: Session 22 logged out. Waiting for processes to exit.
Mar 21 14:13:17.102368 systemd-logind[1456]: Removed session 22.
Mar 21 14:13:21.363679 containerd[1478]: time="2025-03-21T14:13:21.363641317Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4ab7305f631276ca9e5b88c5ff2cf46bb6f6b1e67999517333e120229865f169\" id:\"ecfb8ef04b5cc0473071da190843c7ca80069efa80b0bbfba652f355089cadfd\" pid:5244 exited_at:{seconds:1742566401 nanos:363343546}"
Mar 21 14:13:22.104389 systemd[1]: Started sshd@20-172.24.4.61:22-172.24.4.1:39530.service - OpenSSH per-connection server daemon (172.24.4.1:39530).
Mar 21 14:13:23.277182 sshd[5257]: Accepted publickey for core from 172.24.4.1 port 39530 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:13:23.279971 sshd-session[5257]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:13:23.293228 systemd-logind[1456]: New session 23 of user core.
Mar 21 14:13:23.300455 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 21 14:13:24.012269 sshd[5259]: Connection closed by 172.24.4.1 port 39530
Mar 21 14:13:24.013315 sshd-session[5257]: pam_unix(sshd:session): session closed for user core
Mar 21 14:13:24.020935 systemd[1]: sshd@20-172.24.4.61:22-172.24.4.1:39530.service: Deactivated successfully.
Mar 21 14:13:24.025183 systemd[1]: session-23.scope: Deactivated successfully.
Mar 21 14:13:24.027300 systemd-logind[1456]: Session 23 logged out. Waiting for processes to exit.
Mar 21 14:13:24.029964 systemd-logind[1456]: Removed session 23.
Mar 21 14:13:29.038826 systemd[1]: Started sshd@21-172.24.4.61:22-172.24.4.1:48034.service - OpenSSH per-connection server daemon (172.24.4.1:48034).
Mar 21 14:13:30.127728 sshd[5272]: Accepted publickey for core from 172.24.4.1 port 48034 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:13:30.130521 sshd-session[5272]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:13:30.141640 systemd-logind[1456]: New session 24 of user core.
Mar 21 14:13:30.155557 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 21 14:13:30.829952 sshd[5280]: Connection closed by 172.24.4.1 port 48034
Mar 21 14:13:30.831251 sshd-session[5272]: pam_unix(sshd:session): session closed for user core
Mar 21 14:13:30.837299 systemd-logind[1456]: Session 24 logged out. Waiting for processes to exit.
Mar 21 14:13:30.838549 systemd[1]: sshd@21-172.24.4.61:22-172.24.4.1:48034.service: Deactivated successfully.
Mar 21 14:13:30.842674 systemd[1]: session-24.scope: Deactivated successfully.
Mar 21 14:13:30.847872 systemd-logind[1456]: Removed session 24.
Mar 21 14:13:35.855183 systemd[1]: Started sshd@22-172.24.4.61:22-172.24.4.1:33724.service - OpenSSH per-connection server daemon (172.24.4.1:33724).
Mar 21 14:13:37.029348 sshd[5294]: Accepted publickey for core from 172.24.4.1 port 33724 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:13:37.032103 sshd-session[5294]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:13:37.046236 systemd-logind[1456]: New session 25 of user core.
Mar 21 14:13:37.053612 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 21 14:13:37.873546 sshd[5296]: Connection closed by 172.24.4.1 port 33724
Mar 21 14:13:37.874772 sshd-session[5294]: pam_unix(sshd:session): session closed for user core
Mar 21 14:13:37.880652 systemd[1]: sshd@22-172.24.4.61:22-172.24.4.1:33724.service: Deactivated successfully.
Mar 21 14:13:37.888797 systemd[1]: session-25.scope: Deactivated successfully.
Mar 21 14:13:37.892594 systemd-logind[1456]: Session 25 logged out. Waiting for processes to exit.
Mar 21 14:13:37.896349 systemd-logind[1456]: Removed session 25.
Mar 21 14:13:38.354388 containerd[1478]: time="2025-03-21T14:13:38.354063541Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff16f42e5cb1358c63aba8be1f1e68589059ee5ba2656505dd5a2dea7476ae93\" id:\"1bcde97597b0681cbf6ea6e2766b52601af23fd2b3d3509e1d74686a80facd5f\" pid:5319 exited_at:{seconds:1742566418 nanos:353435663}"
Mar 21 14:13:42.894600 systemd[1]: Started sshd@23-172.24.4.61:22-172.24.4.1:33728.service - OpenSSH per-connection server daemon (172.24.4.1:33728).
Mar 21 14:13:44.104434 sshd[5330]: Accepted publickey for core from 172.24.4.1 port 33728 ssh2: RSA SHA256:nrSHR5aYDV0jboSiMRCWxtX+cv5hyK+7zdlbhvaiSXk
Mar 21 14:13:44.107354 sshd-session[5330]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 14:13:44.118537 systemd-logind[1456]: New session 26 of user core.
Mar 21 14:13:44.127451 systemd[1]: Started session-26.scope - Session 26 of User core.
Mar 21 14:13:44.939453 sshd[5332]: Connection closed by 172.24.4.1 port 33728
Mar 21 14:13:44.941503 sshd-session[5330]: pam_unix(sshd:session): session closed for user core
Mar 21 14:13:44.949526 systemd[1]: sshd@23-172.24.4.61:22-172.24.4.1:33728.service: Deactivated successfully.
Mar 21 14:13:44.954881 systemd[1]: session-26.scope: Deactivated successfully.
Mar 21 14:13:44.957331 systemd-logind[1456]: Session 26 logged out. Waiting for processes to exit.
Mar 21 14:13:44.960383 systemd-logind[1456]: Removed session 26.